Examples
Some sites behave differently a few miles apart. That matters if you're checking local inventory, delivery windows, search rankings, or location-specific pricing.
A city-targeted request with curl:

```shell
curl -X POST "https://api.scraperouter.com/request" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://example.com/store-locator",
    "geo": {
      "country": "US",
      "city": "Chicago"
    }
  }'
```
The same request from Python:

```python
import requests

resp = requests.post(
    "https://api.scraperouter.com/request",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "url": "https://example.com/product/123",
        "geo": {
            "country": "DE",
            "city": "Berlin",
        },
    },
    timeout=60,
)
print(resp.status_code)
print(resp.text[:500])
```
Typical output differences by city:

- Retail: same product page, different stock status and store pickup options
- Food delivery: same homepage, different restaurants and fees
- Travel: same route search, different offers or currency handling
- Search / classifieds: same query, different ranking and local listings
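A sketch of what acting on those differences looks like: pull one comparable field out of each city's response and diff it. The `data-stock` attribute below is a hypothetical marker; real sites embed this in store IDs, JSON blobs, or microdata instead.

```python
import re
from typing import Optional

def extract_stock_status(html: str) -> Optional[str]:
    """Pull a stock-status marker out of a product page.

    The data-stock attribute is an invented example; adapt the
    pattern to whatever marker your target actually renders.
    """
    match = re.search(r'data-stock="([^"]+)"', html)
    return match.group(1) if match else None

# Same product page fetched via two city routes, different answers:
chicago_html = '<div data-stock="in_store_pickup">...</div>'
denver_html = '<div data-stock="ship_only">...</div>'

print(extract_stock_status(chicago_html))  # in_store_pickup
print(extract_stock_status(denver_html))   # ship_only
```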
Practical tips
- Use city-level routing only when the target actually varies by city. If country-level gets you the same page, don't pay extra complexity for nothing.
- Verify the output, not just the IP metadata: check store IDs, delivery estimates, ZIP prompts, local banners, and structured data in the HTML.
- Expect lower reliability than broad geo targeting: the more specific the location, the smaller the proxy pool usually is.
- Keep a fallback path: city -> region -> country, so jobs don't fail just because one location is temporarily thin.
- Pin test URLs for each market and compare key fields over time: price, availability, ranking position, delivery ETA.
- Don't confuse city-level routing with exact GPS spoofing. Most scraping setups can influence apparent network location, not a user's precise device coordinates.
- If you're running this at scale, spread requests across time and sessions. Hammering one city too hard is an easy way to burn good routes.
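The second tip above (verify the output, not the IP metadata) can be mechanized: scan the returned HTML for markers that only appear when the page is genuinely city-local. The markers in this sketch are made up; pick ones that actually exist on your target.

```python
def looks_city_local(html: str, markers: list, min_hits: int = 2) -> bool:
    """Heuristic check that a response reflects the requested city.

    markers: strings expected in a genuinely local page, e.g. a store ID
    prefix, a ZIP prompt, a local banner, or a currency symbol.
    Requiring several hits avoids false positives from a single match.
    """
    hits = sum(1 for m in markers if m in html)
    return hits >= min_hits

# Hypothetical Chicago page: store ID, city name, and a ZIP prompt all present.
html = '<span data-store-id="CHI-042">Chicago Loop</span> ZIP: 60601'
print(looks_city_local(html, ["CHI-", "Chicago", "ZIP:"]))  # True
print(looks_city_local(html, ["BER-", "Berlin", "EUR"]))    # False
```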
A simple fallback pattern:
```python
def build_geo_options(country, city=None, region=None):
    """Return geo targets from most to least specific: city -> region -> country."""
    options = []
    if city:
        options.append({"country": country, "city": city})
    if region:
        options.append({"country": country, "region": region})
    # Country-level is always appended last as the catch-all.
    options.append({"country": country})
    return options
```
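One way to consume that list (a sketch: `fetch` is any callable that takes a geo dict and either returns a response or raises; the retry semantics here are an assumption, not documented ScrapeRouter behavior):

```python
def fetch_with_fallback(fetch, geo_options):
    """Try each geo option in order (city -> region -> country).

    fetch: callable taking a geo dict, returning a response or raising.
    Returns (response, geo_used) from the first option that works;
    re-raises only if every option fails.
    """
    last_error = None
    for geo in geo_options:
        try:
            return fetch(geo), geo
        except Exception as exc:  # in practice, catch your client's error type
            last_error = exc
    raise RuntimeError("all geo options failed") from last_error
```

Paired with `build_geo_options`, a job degrades to broader targeting instead of failing outright when one city's pool is temporarily thin.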
With ScrapeRouter, this is the kind of thing you want the router layer handling: not because city routing is magic, but because maintaining provider-by-provider geo quirks yourself gets old fast.
Use cases
- Local inventory checks: See whether a product is available for pickup in one city but not another.
- Price monitoring: Compare city-specific pricing, taxes, fees, or promotions on marketplaces and delivery apps.
- SERP tracking: Measure how search results or map-like listings change between cities.
- Compliance and ad verification: Confirm that users in a specific city see the right disclaimer, offer, or campaign.
- Marketplace intelligence: Track how availability, sellers, or featured listings vary across local markets.
- Delivery coverage testing: Check whether an address in one city gets same-day delivery while another does not.
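For the price-monitoring case, raw responses usually need normalizing before cities are comparable, since locales format numbers differently. A minimal sketch (the price strings are illustrative):

```python
import re

def parse_price(text: str) -> float:
    """Normalize a localized price string to a float.

    Handles both '1.299,00' (German-style) and '1,299.00' (US-style)
    separators by checking which separator appears last.
    """
    digits = re.sub(r"[^\d.,]", "", text)
    if "," in digits and "." in digits:
        if digits.rfind(",") > digits.rfind("."):
            digits = digits.replace(".", "").replace(",", ".")  # 1.299,00
        else:
            digits = digits.replace(",", "")                    # 1,299.00
    elif "," in digits:
        # Lone comma: decimal separator only if exactly two trailing digits.
        tail = digits.split(",")[-1]
        digits = digits.replace(",", "." if len(tail) == 2 else "")
    return float(digits)

print(parse_price("€1.299,00"))  # 1299.0
print(parse_price("$1,299.00"))  # 1299.0
```

With prices on a common scale, comparing the same SKU across city routes becomes a plain numeric diff.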
This comes up a lot in production because teams start with country-level routing, assume that's enough, and then wonder why the data still looks wrong. Sometimes the site is local in ways that only show up when the request really looks city-local.