This guide walks you through exporting scraped data from Octoparse to your database (MySQL, SQL Server, or PostgreSQL), both manually and automatically. It also introduces Octoparse's Auto-Export Tool (database exports only).
Prerequisites
Before getting started, ensure you have:
A task with extracted data results (Auto-export only supports Cloud data)
A configured database table with fields to receive the data (a minimal setup sketch follows the note below)
Note:
Auto-export currently supports MySQL, SQL Server, PostgreSQL, and Google Sheets.
For other databases or platforms (e.g., custom websites), consider using Octoparse Open API or Zapier integration.
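The receiving table has to exist before the first export, with one column per field you plan to map. Below is a minimal sketch of how such a table could be created, assuming a MySQL target and the third-party pymysql driver; the host, credentials, table name, and column names are hypothetical placeholders for your own schema.

```python
# Minimal sketch: create a receiving table in MySQL with pymysql
# (pip install pymysql). Host, credentials, database, table, and
# column names are all hypothetical -- replace them with the fields
# your task actually extracts.
import pymysql

conn = pymysql.connect(
    host="db.example.com",      # assumed MySQL host
    port=3306,
    user="octoparse_user",      # assumed account with CREATE/INSERT rights
    password="change-me",
    database="scraping",
)
try:
    with conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS scraped_products (
                id           INT AUTO_INCREMENT PRIMARY KEY,
                product_name VARCHAR(255),
                price        VARCHAR(64),
                url          TEXT,
                created_at   TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            )
        """)
    conn.commit()
finally:
    conn.close()
```

Keeping the column names close to your Octoparse field names can make the mapping step later on easier to follow.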
1. Manual Export to Database
This example uses MySQL, but the process is similar for other supported databases.
IP Whitelisting Note:
If your database requires IP whitelisting, ensure your device IP is allowed, as exports run locally from your machine.
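A quick way to find the public IP address to whitelist, and to confirm the database port is reachable from your machine, is a short check like the one below. The ipify lookup service, host name, and port are assumptions; use whatever your database administrator recommends.

```python
# Sketch: show the public IP this machine connects from and test whether
# the database port is reachable. The lookup service, host, and port are
# assumptions -- 3306 is MySQL's default; use 1433 for SQL Server or 5432
# for PostgreSQL.
import socket
import urllib.request

public_ip = urllib.request.urlopen("https://api.ipify.org", timeout=10).read().decode()
print("Public IP to whitelist:", public_ip)

try:
    socket.create_connection(("db.example.com", 3306), timeout=5).close()
    print("Database port is reachable.")
except OSError as err:
    print("Cannot reach the database port:", err)
```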
Steps:
1. View Cloud Data for your target task.
2. Click Export Data > Select your database type > OK.
3. Enter your database connection details, then click Test Connection to verify access.
4. Map the data fields to the matching columns in your database table.
5. Click Export to complete the process.
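Once the export finishes, you can spot-check that the rows actually landed in the target table. The sketch below reuses the hypothetical connection details and scraped_products table from the prerequisite example.

```python
# Sketch: confirm exported rows arrived. Connection details and the table
# name are the same hypothetical values used in the prerequisite example.
import pymysql

conn = pymysql.connect(
    host="db.example.com",
    user="octoparse_user",
    password="change-me",
    database="scraping",
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM scraped_products")
        print("Rows in table:", cur.fetchone()[0])

        cur.execute("SELECT * FROM scraped_products ORDER BY id DESC LIMIT 3")
        for row in cur.fetchall():
            print(row)  # the most recently inserted rows
finally:
    conn.close()
```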
2. Automatic Export to Database
To enable scheduled exports, configure the auto-export settings:
1. Under Automatic Export, click Set.
2. Configure the export interval (e.g., every 5 minutes).
3. Click Confirm to save.
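To confirm the schedule is being honored, you can check how long ago the newest row arrived. The sketch below assumes the hypothetical created_at column from the earlier table definition and lets MySQL compute the age so time zones don't get in the way.

```python
# Sketch: report how many minutes ago the newest row was inserted, assuming
# the hypothetical created_at column from the earlier table definition.
import pymysql

conn = pymysql.connect(
    host="db.example.com",
    user="octoparse_user",
    password="change-me",
    database="scraping",
)
try:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT TIMESTAMPDIFF(MINUTE, MAX(created_at), NOW()) FROM scraped_products"
        )
        minutes = cur.fetchone()[0]
    if minutes is None:
        print("No rows have been exported yet.")
    else:
        print(f"Newest row arrived {minutes} minutes ago.")
finally:
    conn.close()
```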
Saving Database Settings for Reuse
After the first setup, you can save your database configuration for future use.
3. Using the Database Auto-Export Tool
Once auto-export is enabled, the Auto-Export Tool will launch, displaying all scheduled export details.
Important:
The tool must remain open for exports to continue.
Closing the tool or shutting down your computer stops the export process.
Tips:
✔ Access the tool anytime via Tools in the side navigation.
✔ Exports begin immediately after clicking Start.
✔ Only new/unexported data is transferred in each scheduled run.
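The tool keeps track of what it has already sent, so rows are not duplicated between runs. Purely as an illustration of that incremental idea (this is not Octoparse's actual implementation), a high-water-mark approach looks like this:

```python
# Illustration of the incremental-export idea only -- not Octoparse's
# actual implementation. Each run remembers how many records have already
# been sent and transfers only the ones after that point.

def unexported(records: list[dict], already_sent: int) -> list[dict]:
    """Return the records that have not been exported yet."""
    return records[already_sent:]

# Example: 120 records scraped so far, 100 sent on earlier runs ->
# only the newest 20 go out in this run.
batch = unexported([{"row": i} for i in range(120)], already_sent=100)
print(len(batch))  # 20
already_sent = 100 + len(batch)  # new high-water mark for the next run
```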