Question: Can I use the API to export Local Extraction data?

Answer: No. But you can back up your local data to the Cloud first, then use the API to export it.


Question: I got a "500 Internal Server Error" when requesting a token, what should I do?

Answer: Please check whether you have put your email address as the username in the request body. If so, change it to your username.
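As a minimal sketch, this is what the token request body should look like, assuming Octoparse's password-grant style token endpoint (the field names follow that convention; check the API documentation for the exact base URL for your account):

```python
def build_token_payload(username: str, password: str) -> dict:
    """Builds the form body for a token request.

    The "username" field must be your Octoparse username,
    NOT the email address you registered with -- sending the
    email here is a common cause of the 500 error.
    """
    return {
        "username": username,
        "password": password,
        "grant_type": "password",
    }

payload = build_token_payload("my_octoparse_username", "my_password")
```

You would then POST this payload to the token endpoint with your preferred HTTP client.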


Question: Why do I get the "403 Not Authorized" error?

Answer: This error appears when you request an API endpoint that your account does not have permission to use. For example, if your account is on the Standard Plan and you send a request to Start a task, which requires a Professional account, you will receive this error. Please check the API permissions before sending requests.


Question: Why do I get the error "429 Limit Times"?

Answer: Octoparse limits API usage to 20 requests per second. Please reduce the request frequency if you receive the status code '429'.
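One way to stay under the limit is to pace requests on the client side. Below is a minimal sketch of a sleep-based limiter (the 20-per-second cap comes from the answer above; the class itself is illustrative, not part of the Octoparse API):

```python
import time

class RateLimiter:
    """Caps calls at `max_per_second` by sleeping between them."""

    def __init__(self, max_per_second: int = 20):
        self.min_interval = 1.0 / max_per_second
        self.last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep the gap between
        # consecutive calls at or above min_interval.
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

limiter = RateLimiter(max_per_second=20)
# Before each API request, call limiter.wait(), then send the request.
```

Calling `limiter.wait()` before every request keeps you at or below 20 requests per second even in a tight loop.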


Question: Why do I always get the same data when using Export non-exported data in the API?

Answer: After exporting a batch with Export non-exported data, you need to use Mark as exported to flag that batch; otherwise, the next request will return the same batch of data.


Question: Why do I get the same batch of data when using export data by offset?

Answer: You need to use the offset returned in the API response to get the next batch of data rows.

For example, if the returned offset is 100 but you keep requesting with offset=0 or offset=10, you will get the same batch of data rows each time.


Question: How can I get the next 1000 rows using the Export non-exported data API? My data amount is more than 1000 rows.

Answer: After you have exported the first batch of non-exported data, use Mark as exported to mark that batch. Then you can get the next 1000 rows with Export non-exported data, and so forth.
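The export/mark loop can be sketched as follows. `FakeClient` stands in for real HTTP calls so the flow is easy to see; the method names mirror the API actions described above, but the real endpoints and signatures may differ:

```python
class FakeClient:
    """Simulates the Export non-exported data / Mark as exported pair."""

    def __init__(self, rows, batch_size=1000):
        self._rows = rows
        self._batch = batch_size
        self._cursor = 0       # index of the first not-yet-exported row
        self._pending = None   # batch exported but not yet marked

    def export_not_exported(self):
        # Keeps returning the SAME batch until mark_exported() is called.
        self._pending = self._rows[self._cursor:self._cursor + self._batch]
        return self._pending

    def mark_exported(self):
        # Advances the cursor so the next export returns new rows.
        self._cursor += len(self._pending or [])
        self._pending = None

client = FakeClient(rows=list(range(2500)))
all_rows = []
while True:
    batch = client.export_not_exported()
    if not batch:
        break
    all_rows.extend(batch)
    client.mark_exported()   # without this, the next call repeats the batch
```

Skipping `mark_exported()` is exactly what causes the repeated-batch behavior described in the answers above.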


Question: Why does it only let me export 1000 rows when using Get data by offset? How can I get all the rows?

Answer: One API request can export at most 1000 rows, so you need several requests to get all the data. For example, in the first request, use offset=0 to get the first 1000 rows. In the second request, use the offset returned in the response of the first request (usually 1000, but it could be larger) to get the next 1000 rows, and so on.
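The paging loop above can be sketched like this. The response is simulated rather than fetched over HTTP, and the field names (`data`, `offset`) are illustrative; note that in the real API the returned offset can jump past `offset + len(batch)` when earlier rows were deleted, which is exactly why you must always use the returned value instead of computing your own:

```python
def fake_get_data_by_offset(all_rows, offset, size=1000):
    """Simulates one Get data by offset request: returns up to `size`
    rows starting at `offset`, plus the offset for the NEXT request."""
    batch = all_rows[offset:offset + size]
    return {"data": batch, "offset": offset + len(batch)}

rows = list(range(3500))
collected, offset = [], 0
while True:
    resp = fake_get_data_by_offset(rows, offset)
    if not resp["data"]:
        break
    collected.extend(resp["data"])
    offset = resp["offset"]   # always use the returned offset, never a guess
```

Four requests (1000 + 1000 + 1000 + 500 rows) collect the full data set here.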


Question: Why is the returned offset so large? I did not request that many data rows.

Answer: This happens because data rows scraped in previous runs have been deleted, so the starting offset is no longer 0. For example, suppose you got 100 data rows the first time you ran the task; those rows start at offset=0. If you delete that batch of 100 rows, run the same task again, and get 100 new data rows, the new rows now start at offset=100, because offsets 0-99 were used by the deleted rows.


Question: Can I create additional crawlers programmatically with the API and run extractions with the API?

Answer: For the time being, Octoparse API only allows you to retrieve extracted data, get task information and control tasks/crawlers (start/stop extraction). Whenever you need to create/configure a crawler with Octoparse, you have to do it on the task configuration interface within the app.
