# Quickstart

This guide walks you through making your first scraping request with ScrapeRouter.
## 1. Get your API key

Sign up for a ScrapeRouter account and create an API key from the Dashboard. You'll use this key to authenticate all API requests.
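Keep the key out of source control. One common pattern is to read it from an environment variable; a minimal sketch (the variable name `SCRAPEROUTER_API_KEY` is an assumption of this example, not a requirement of the API):

```python
import os

# SCRAPEROUTER_API_KEY is an assumed variable name; export it in your
# shell first (e.g. `export SCRAPEROUTER_API_KEY=...`). Any name works
# as long as your code and deployment agree on it.
api_key = os.environ.get("SCRAPEROUTER_API_KEY", "")

# The header value the API expects, as shown in the request examples.
auth_header = {"Authorization": f"Api-Key {api_key}"}
```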
## 2. Make your first request

Send a POST request to the `/api/v1/scrape/` endpoint with a target URL and a scraper:
```bash
curl -X POST https://www.scraperouter.com/api/v1/scrape/ \
  -H "Authorization: Api-Key {your_api_key}" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://example.com",
    "scraper": "auto"
  }'
```
```python
import requests

response = requests.post(
    "https://www.scraperouter.com/api/v1/scrape/",
    headers={"Authorization": "Api-Key {your_api_key}"},
    json={
        "url": "https://example.com",
        "scraper": "auto",
    },
)
print(response.json())
```
```javascript
const response = await fetch("https://www.scraperouter.com/api/v1/scrape/", {
  method: "POST",
  headers: {
    "Authorization": "Api-Key {your_api_key}",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    url: "https://example.com",
    scraper: "auto",
  }),
});

const data = await response.json();
console.log(data);
```
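In a real script you will also want a timeout and explicit error handling around the request. A hedged sketch building on the Python example above (the helper name `scrape`, the timeout value, and the error handling are this example's own choices, not documented API behavior):

```python
import requests


def scrape(url, api_key, scraper="auto", timeout=30):
    """POST a scrape job and return the decoded JSON payload.

    Endpoint and header format follow the quickstart examples; the
    timeout and exception handling here are suggestions only.
    """
    response = requests.post(
        "https://www.scraperouter.com/api/v1/scrape/",
        headers={"Authorization": f"Api-Key {api_key}"},
        json={"url": url, "scraper": scraper},
        timeout=timeout,  # avoid hanging on slow targets
    )
    response.raise_for_status()  # surface 4xx/5xx responses as exceptions
    return response.json()
```

Callers can then catch `requests.RequestException` in one place instead of inspecting each response by hand.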
## 3. Read the response

The API returns a JSON response with the scraped content:
```json
{
  "id": "550e8400-e29b-41d4-a716-446655440000",
  "status_code": 200,
  "url": "https://example.com",
  "content": "<!doctype html>...",
  "headers": {
    "content-type": "text/html; charset=UTF-8"
  },
  "scraper": "requests/requests_2_32"
}
```
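The fields can be pulled apart like any JSON object. A minimal sketch in Python, using the sample payload above (the `data` dict stands in for `response.json()`; reading `status_code` as the scraped page's HTTP status is an inference from the field name, not something this guide states):

```python
# Sample payload mirroring the response shown above.
data = {
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "status_code": 200,
    "url": "https://example.com",
    "content": "<!doctype html>...",
    "headers": {"content-type": "text/html; charset=UTF-8"},
    "scraper": "requests/requests_2_32",
}

# Check the status before trusting the content.
if data["status_code"] == 200:
    html = data["content"]
    content_type = data["headers"].get("content-type", "")
    print(f"Got {len(html)} chars of {content_type} via {data['scraper']}")
```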
## Next steps
- Learn about Authentication options
- Explore the Requests API for advanced options
- See available Scrapers