Examples
A few cases where city-level routing matters:
- Local search results: food delivery, classifieds, events, local services
- Inventory checks: store pickup availability often changes by city
- Price verification: travel, rentals, and local offers can shift based on location
- Ad and SERP monitoring: geo-targeted campaigns are usually city-sensitive, not just country-sensitive
Example request with a city target:
```shell
curl -X POST https://www.scraperouter.com/api/v1/scrape/ \
  -H "Authorization: Api-Key $api_key" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://example.com/search?q=coffee",
    "country": "US",
    "city": "Chicago"
  }'
```
A typical response from the pipeline includes the final routing metadata:
```json
{
  "url": "https://example.com/search?q=coffee",
  "country": "US",
  "city": "Chicago",
  "status": 200,
  "proxy_type": "residential"
}
```
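The same request can be made from Python. A minimal sketch using only the standard library, assuming the endpoint and payload fields shown above (the `build_payload` and `scrape` helper names are illustrative, and the API key is read from a hypothetical `SCRAPEROUTER_API_KEY` environment variable):

```python
import json
import os
import urllib.request

API_URL = "https://www.scraperouter.com/api/v1/scrape/"

def build_payload(url, country, city=None):
    """Assemble the request body; omit city for country-only routing."""
    payload = {"url": url, "country": country}
    if city:
        payload["city"] = city
    return payload

def scrape(url, country, city=None, api_key=None):
    """POST a scrape job and return the parsed JSON response."""
    key = api_key or os.environ["SCRAPEROUTER_API_KEY"]
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(url, country, city)).encode(),
        headers={
            "Authorization": f"Api-Key {key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Keeping payload construction separate from the HTTP call makes it easy to drop the `city` field when a target only needs country-level routing.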
Practical tips
- Use city-level routing only when the target actually behaves differently by city. If the site only cares about country, adding city targeting just burns more money and makes routing harder.
- Pair it with session stickiness when a workflow spans multiple requests: login, search, pagination, checkout simulation.
- Expect lower supply and higher cost than country-level routing, especially for smaller cities.
- Validate the result rather than trusting the proxy label: check page content, store IDs, currency, pickup options, or localized modules.
- Have a fallback path: exact city first, then nearby metro, then country-level. Production systems need graceful degradation.
- Watch block rates by city. Some cities are noisier because everybody keeps hammering the same proxy pools.
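The validation tip above can be sketched as a small post-fetch check. The marker strings here are illustrative; real checks depend on what the target page localizes (store IDs, ZIP codes, currency symbols):

```python
def looks_local(html, expected_markers, forbidden_markers=()):
    """Heuristic check that a fetched page actually reflects the target city.

    expected_markers: strings that should appear (city name, store IDs, ZIP prefixes).
    forbidden_markers: strings that signal wrong-location content (foreign currency, etc.).
    """
    text = html.lower()
    if any(m.lower() in text for m in forbidden_markers):
        return False
    return all(m.lower() in text for m in expected_markers)
```

For a Chicago target, something like `looks_local(page, ["Chicago", "60601"], forbidden_markers=["€"])` catches the common failure mode where the proxy label says one city but the page was served for another.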
A simple pattern is:
```python
payload = {
    "url": "https://example.com/store-locator",
    "country": "US",
    "city": "Austin",
}
# If Austin inventory data keeps failing, fall back to country-level or a nearby city.
```
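That fallback chain can be made explicit. A sketch, assuming a `fetch(payload)` callable that raises on a failed scrape (the function name and error handling are illustrative, not a ScrapeRouter API):

```python
def scrape_with_fallback(fetch, url, country, candidates):
    """Try each geo target in order: exact city, nearby metro, then country-level.

    candidates: city names to try in order, ending with None for country-only routing.
    fetch: callable taking a payload dict; assumed to raise on failure.
    """
    last_error = None
    for city in candidates:
        payload = {"url": url, "country": country}
        if city:
            payload["city"] = city
        try:
            return fetch(payload)
        except Exception as err:  # degrade gracefully to the next target
            last_error = err
    raise RuntimeError(f"all geo targets failed for {url}") from last_error

# Austin first, then a nearby metro, then plain US-level routing:
# result = scrape_with_fallback(fetch, "https://example.com/store-locator", "US",
#                               ["Austin", "San Antonio", None])
```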
What usually goes wrong in production:
- assuming city routing is precise when the provider really means metro area
- mixing city targeting with datacenter IPs and expecting local-user behavior
- not checking whether the site geolocates by IP, GPS prompts, account profile, or cookie history
Use cases
- Retail scraping: compare in-stock items and pickup availability across cities
- Travel monitoring: check hotel, rental, or flight pages from the user location the site actually cares about
- Local SEO tracking: see what a search or directory page looks like in one city versus another
- Marketplace intelligence: inspect listings, delivery options, fees, or ranking differences by city
- Ad verification: confirm that regionally targeted creatives and landing pages are actually showing where they should
For ScrapeRouter, this is the kind of thing a router layer is supposed to handle. You don’t want application code full of proxy-provider quirks just because one target needs Chicago and another one only needs US-wide coverage.
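As a sketch of what that separation might look like, per-target geo requirements can live in configuration rather than in application code. The hosts and rules below are made up for illustration:

```python
# Hypothetical per-target routing rules, kept out of application code.
GEO_RULES = {
    "example.com": {"country": "US", "city": "Chicago"},  # city-sensitive target
    "other-site.com": {"country": "US"},                  # country-wide is enough
}

def routed_payload(url, host, geo_rules):
    """Merge the target URL with whatever geo precision this host requires."""
    rules = geo_rules.get(host, {})
    return {"url": url, **rules}
```

Application code then just asks for a URL, and the router layer decides whether that means Chicago residential IPs or plain US-wide coverage.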