The boring permit infrastructure your stack actually needs.
We do the unglamorous work — 3,000+ city portals, broken PDFs, and flaky municipal APIs — so you can build on one clean REST endpoint. Real-time webhooks. Daily refresh. Zero scraping.
No credit card · 1,000 free calls/mo · 5-minute setup
curl https://api.permitflow.dev/v1/permits \
  -H "Authorization: Bearer pf_live_..." \
  -G \
  -d city=phoenix \
  -d type=solar \
  -d since=2025-04-01

[
  {
    "permit_number": "BLD2025-04123",
    "city": "Phoenix",
    "zip": "85016",
    "address": "2847 E Camelback Rd",
    "type": "Solar",
    "status": "Issued",
    "valuation": 24500,
    "contractor": {
      "name": "SunPro Solar of Arizona",
      "license": "ROC-289341"
    },
    "issued_date": "2025-04-22"
  }
]

Built for the teams hunting construction signal
Every permit, normalized
We crawl 3,000+ city and county permit portals (Accela, Tyler, Socrata, custom PDFs) and ship one consistent schema. Stop maintaining 47 scrapers.
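The same request works against any covered jurisdiction: only the city parameter changes, and the response fields match the Phoenix example above (austin here is purely an illustration of a covered city):

curl https://api.permitflow.dev/v1/permits \
  -H "Authorization: Bearer pf_live_..." \
  -G \
  -d city=austin \
  -d type=solar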
Daily refresh + webhooks
New solar permit issued in 85016? Your webhook fires within minutes. Built for lead generation, not next-morning batch ETL.
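A sketch of what a delivery body could look like, assuming the payload mirrors the permit record above (the event field and its permit.issued value are illustrative, not documented API surface):

{
  "event": "permit.issued",
  "data": {
    "permit_number": "BLD2025-04123",
    "city": "Phoenix",
    "zip": "85016",
    "type": "Solar",
    "status": "Issued",
    "issued_date": "2025-04-22"
  }
}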
Contractor licenses joined
Every contractor license cross-referenced against 50 state license boards. Verify ROC numbers, see active vs. expired, track new entrants.
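For example, the contractor object on each permit could carry the board lookup inline; license_status and expires below are illustrative field names for that join, not confirmed schema:

{
  "contractor": {
    "name": "SunPro Solar of Arizona",
    "license": "ROC-289341",
    "license_status": "active",
    "expires": "2026-09-30"
  }
}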
ROI math
One closed solar deal pays for a year.
A permit filed today is a homeowner who has already decided to spend $30k. Solar installers, HVAC techs, and roofers pay $500–$5,000/mo for permit feeds because the lead-to-close math is instant and obvious: at our $99/mo entry tier, a year of PermitFlow runs $1,188, typically less than the margin on a single closed install.
vs Building it yourself
You could scrape 3,000 city portals. You shouldn't.
Every PropTech team starts with "we'll just build a scraper." Six months later they're maintaining 47 of them, debugging Accela timeouts at 2am, and parsing Tyler Munis PDFs by hand. Skip that arc.
DIY scraping
- 1–2 engineers full-time on portal maintenance
- Breaks every time a city updates their website
- No standard schema across jurisdictions
- PDF parsing for the 30% of cities without an API
- Legal exposure when a portal's ToS changes mid-quarter
- Engineering budget burned on plumbing, not product
Real cost: $200k+/yr in engineering, slow delivery, fragile data
One API, done
- 5-minute integration, normalized JSON across all cities
- We monitor every portal — you never see a broken scraper
- Real-time webhooks (<5 min lag) instead of nightly batches
- Contractor licenses cross-referenced automatically
- ToS-clean: we operate the data infrastructure, you consume it
- Engineers ship product, not plumbing
Real cost: from $99/mo. Free tier to start.
Stop scraping. Start shipping.
Try the live Phoenix dataset right now — no signup.