# Scrape
The scrape endpoint submits scraping jobs to ScrapeRouter. Each request targets a URL using a specific scraper and an optional proxy configuration. This page covers the request and response models and how to create a synchronous scrape request.
## Request Schema
Attributes of the scrape request object (ScrapeRequestSchema).
| Attribute | Type | Description | Default |
|---|---|---|---|
| `url` *(required)* | string (URL) | — | — |
| `method` | string | — | `GET` |
| `headers` | object | — | — |
| `query` | array \| object | — | — |
| `data` | any | Request body; body type is auto-detected (JSON, form, or raw). | — |
| `cookies` | array \| object | — | — |
| `timeout_ms` | integer \| number | — | — |
| `browser_type` | string | — | — |
| `headless` | boolean | — | — |
| `wait_for_selector` | string | — | — |
| `wait_for_timeout_ms` | integer \| number | — | — |
| `wait_for_load_state` | string | — | — |
| `page_actions` | array | — | — |
| `screenshot` | boolean | — | — |
| `scraper` | string | — | `auto` |
| `scraper_options` | object | — | `{}` |
| `proxy` | any | Proxy config object or proxy type string: `datacenter`, `residential`, `mobile` | `datacenter` |
| `scraperouter` | any | — | — |
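To make the schema concrete, here is a sketch of a request body that uses several of the optional attributes (custom headers, the browser wait options, `screenshot`, and a proxy type string). The field names come from the table above; the example path, the values, and the assumption that these options can be combined freely with `scraper: "auto"` are illustrative only.

```python
import requests

# Illustrative payload: attribute names are from ScrapeRequestSchema,
# the values and the target path are hypothetical.
payload = {
    "url": "https://example.com/products",
    "method": "GET",
    "headers": {"User-Agent": "my-crawler/1.0"},
    "headless": True,
    "wait_for_selector": "#product-list",   # wait for this element before returning
    "wait_for_timeout_ms": 10000,
    "screenshot": True,
    "timeout_ms": 30000,
    "scraper": "auto",
    "scraper_options": {},
    "proxy": "residential",                  # proxy type string instead of a config object
}

response = requests.post(
    "https://www.scraperouter.com/api/v1/scrape/",
    headers={"Authorization": "Api-Key {your_api_key}"},
    json=payload,
    timeout=60,
)
```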
## Response Schema
Attributes of the scrape response object (ScrapeResponseSchema).
| Attribute | Type | Description | Default |
|---|---|---|---|
| `id` | uuid | `Scrape.id` | — |
| `status_code` | integer | — | — |
| `final_url` | string | — | — |
| `headers` | object | — | `{}` |
| `content` | string | — | — |
| `cookies` | array \| object | — | — |
| `errors` | array | — | — |
| `screenshot_url` | string | — | — |
| `scraper_data` | object | — | — |
| `scraperouter` | any | — | — |
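As a sketch of how these fields might be consumed, the snippet below checks `errors` and `status_code`, saves `content`, and prints `screenshot_url` when present. The attribute names are from ScrapeResponseSchema; the handling logic and the output filename are illustrative assumptions, not prescribed behavior.

```python
import requests

response = requests.post(
    "https://www.scraperouter.com/api/v1/scrape/",
    headers={"Authorization": "Api-Key {your_api_key}"},
    json={"url": "https://example.com", "scraper": "auto"},
)
result = response.json()

# Field names below come from ScrapeResponseSchema; what you do with them is up to you.
if result.get("errors"):
    print("Scrape reported errors:", result["errors"])
elif result.get("status_code") == 200:
    with open("page.html", "w", encoding="utf-8") as f:
        f.write(result.get("content", ""))
    if result.get("screenshot_url"):
        print("Screenshot available at:", result["screenshot_url"])
```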
## Create a scrape request
`POST /api/v1/scrape/`
Creates a new scraping request and returns the result synchronously.
### Required attributes
| Parameter | Description |
|---|---|
| `url` | The URL to scrape |
| `scraper` | Scraper to use (`"auto"` for automatic selection) |
### Optional attributes
| Parameter | Description |
|---|---|
| `method` | HTTP method. Default: `"GET"` |
| `headers` | Custom request headers |
| `proxy` | Proxy type or config. Default: `"datacenter"` |
### Request
```bash
#!/usr/bin/env bash
curl -X POST https://www.scraperouter.com/api/v1/scrape/ \
  -H "Authorization: Api-Key {your_api_key}" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://example.com",
    "scraper": "auto",
    "proxy": "datacenter"
  }'
```
```python
import requests

response = requests.post(
    "https://www.scraperouter.com/api/v1/scrape/",
    headers={"Authorization": "Api-Key {your_api_key}"},
    json={
        "url": "https://example.com",
        "scraper": "auto",
        "proxy": "datacenter",
    },
)
data = response.json()
```
```javascript
const response = await fetch("https://www.scraperouter.com/api/v1/scrape/", {
  method: "POST",
  headers: {
    "Authorization": "Api-Key {your_api_key}",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    url: "https://example.com",
    scraper: "auto",
    proxy: "datacenter",
  }),
});
const data = await response.json();
```
### Response
```json
{
  "id": "550e8400-e29b-41d4-a716-446655440000",
  "status_code": 200,
  "url": "https://example.com",
  "content": "<!doctype html>...",
  "headers": {
    "content-type": "text/html; charset=UTF-8"
  },
  "scraper": "requests/requests_2_32"
}
```