POST /1/crawlers/{id}/urls/crawl
Crawls the specified URLs, extracts records from them, and adds them to the index. If a crawl is currently running (the crawler's `reindexing` property is `true`), the records are added to a temporary index.
Servers
- https://crawler.algolia.com/api
Path parameters
Name | Type | Required | Description |
---|---|---|---|
id | String | Yes | Crawler ID. |
Request headers
Name | Type | Required | Description |
---|---|---|---|
Content-Type | String | Yes | The media type of the request body. Default value: `"application/json"` |
Request body fields
Name | Type | Required | Description |
---|---|---|---|
save | Boolean | No | Whether the specified URLs should be added to the `extraUrls` field of the crawler configuration. |
urls[] | Array | Yes | URLs to crawl. |
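Putting the parameters above together, here is a minimal sketch of assembling this request in Python. The crawler ID and URLs are placeholder values, and authentication headers, which depend on your account setup, are omitted:

```python
import json

# Placeholder values: substitute your own crawler ID and URLs.
BASE_URL = "https://crawler.algolia.com/api"
CRAWLER_ID = "my-crawler-id"  # hypothetical crawler ID

# Path parameter {id} is interpolated into the endpoint path.
endpoint = f"{BASE_URL}/1/crawlers/{CRAWLER_ID}/urls/crawl"

# Request body: urls[] is required, save is optional.
payload = {
    "urls": ["https://www.example.com/page"],
    "save": False,
}

# Content-Type is a required request header.
headers = {"Content-Type": "application/json"}
body = json.dumps(payload)

print(endpoint)
print(body)
```

The assembled `endpoint`, `headers`, and `body` can then be sent with any HTTP client, for example `requests.post(endpoint, headers=headers, data=body)` plus whatever authentication your setup requires.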
How to start integrating
- Add an HTTP Task to your workflow definition.
- Search for the API you want to integrate with and click its name.
- This loads the API reference documentation and prepares the HTTP request settings.
- Click Test request to run your request against the API and view its response.