1. Add your Postgres access

  1. In the Destinations tab, click the “Add destination” button at the top right of your screen, then select the Postgres option from the list of connectors.

  2. Click Next and you’ll be prompted to enter your connection details: host, port, user, password, database name, and target schema (see the connectivity sketch after this list).

  3. Define your load method: append, upsert, or overwrite.

  4. Decide whether to enable the following options:
  • Enabling SSH Tunnel: When this is turned on, the connection is routed through the configured SSH tunnel to reach the PostgreSQL database. The SSH tunnel is an intermediary machine used when the PostgreSQL database is not directly accessible over the internet.

  • Enabling Interactive Tunnel Authentication: In some environments, logging into the SSH tunnel may require interactive authentication steps. Enabling this option allows the connection process to simulate such interactive behavior, which is sometimes necessary for the tunnel to be established successfully.

    We do not recommend enabling it initially. If your load fails, edit the destination and activate this option to try again.

  • Enabling SSL: Activates SSL encryption for the database connection. This is required when the PostgreSQL server enforces secure connections using SSL certificates. It ensures encrypted communication between your system and the database, improving security.

  • Adding record metadata: If active, it will add metadata columns with timestamps related to the loading process. Read more about each column below.

  5. Click Next.
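
If you want to confirm the access details outside the UI before saving, the sketch below opens a test connection using the same fields. It is a minimal, hypothetical example assuming Python with the psycopg2 and sshtunnel packages; every host, credential, and name is a placeholder, and the SSH tunnel section only applies if you enabled that option.

```python
# Hypothetical connectivity check -- a minimal sketch, not part of the product.
# Assumes `pip install psycopg2-binary sshtunnel`; all values are placeholders.
import psycopg2
from sshtunnel import SSHTunnelForwarder

DB = dict(host="db.example.com", port=5432, user="loader",
          password="secret", dbname="analytics")

# Only needed if the database is not reachable directly
# (mirrors the "Enabling SSH Tunnel" option above).
with SSHTunnelForwarder(
    ("bastion.example.com", 22),
    ssh_username="tunnel_user",
    ssh_password="tunnel_password",
    remote_bind_address=(DB["host"], DB["port"]),
) as tunnel:
    conn = psycopg2.connect(
        host="127.0.0.1",
        port=tunnel.local_bind_port,
        user=DB["user"],
        password=DB["password"],
        dbname=DB["dbname"],
        sslmode="require",  # mirrors the "Enabling SSL" option
    )
    with conn, conn.cursor() as cur:
        # Confirm the target schema is visible to the configured user.
        cur.execute("SELECT schema_name FROM information_schema.schemata;")
        print([row[0] for row in cur.fetchall()])
    conn.close()
```

If the SSH tunnel option is off, you would connect straight to the database host and port instead of the tunnel’s local bind address.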

2. Select your catalog data to send

  1. The next step is letting us know which data you want to send to the database. Select the layer and then the desired table.

Tip: You can find a table more quickly by typing its name.

  2. Define which column from the selected table should be used as the primary key in the destination database (the upsert sketch after this list shows why this matters).

  3. Click Next.
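
For the upsert load method (step 3 of the previous section), the primary key you pick here determines which existing rows are updated instead of duplicated. The hypothetical sketch below shows the equivalent Postgres behavior, assuming psycopg2 and placeholder schema, table, and column names; it is an illustration of the idea, not the platform’s actual load logic.

```python
# Minimal sketch of how a primary key drives upsert semantics in Postgres.
# Table, schema, and column names are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(host="db.example.com", port=5432, user="loader",
                        password="secret", dbname="analytics")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        INSERT INTO target_schema.orders (order_id, status, amount)
        VALUES (%s, %s, %s)
        ON CONFLICT (order_id)          -- the column chosen as primary key
        DO UPDATE SET status = EXCLUDED.status,
                      amount = EXCLUDED.amount;
        """,
        (1001, "shipped", 49.90),
    )
conn.close()
```

With append, rows are simply inserted; with overwrite, the existing data is replaced; with upsert, the `ON CONFLICT` clause above is why a reliable primary key matters.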

3. Configure your Postgres data destination

  1. Describe your destination for easy identification within your organization. You can include details such as what data it sends, which team it belongs to, etc.

  2. To define your Trigger, consider how often you want data to be sent to this destination. This decision usually depends on how frequently you need the new table data updated (every day, once a week, only at specific times, etc.); the examples after this list show how such frequencies are often written as schedules.

  3. Click Done.
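
If it helps to think about trigger frequency concretely, the snippet below maps the frequencies mentioned above to standard cron expressions. This is only a generic illustration; the Trigger field in the UI may not use cron syntax at all.

```python
# Generic illustration only -- the product's Trigger field may expose simple
# frequency choices rather than cron syntax. These are standard cron strings.
SCHEDULES = {
    "every day at midnight": "0 0 * * *",
    "once a week (Mondays at 00:00)": "0 0 * * 1",
    "only at specific times (06:00 and 18:00 daily)": "0 6,18 * * *",
}

for description, cron in SCHEDULES.items():
    print(f"{cron:<14} -> {description}")
```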

Check your new destination!

  1. Once completed, you’ll receive confirmation that your new destination is set up!

  2. You can view your new destination on the Destinations page. To see the data in your Postgres database, wait for the pipeline to run; you can monitor its execution and completion on the Destinations page. If needed, trigger the pipeline manually by clicking the refresh icon. Once it has run, your data should appear in Postgres (see the verification sketch below).
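
To double-check the load from the database side, a quick query like the hypothetical sketch below can confirm that rows arrived and, if you enabled record metadata, that the extra timestamp columns are present. Connection details, schema, and table names are placeholders.

```python
# Minimal verification sketch, assuming psycopg2 and the same (placeholder)
# connection details used when configuring the destination.
import psycopg2

conn = psycopg2.connect(host="db.example.com", port=5432, user="loader",
                        password="secret", dbname="analytics")
with conn, conn.cursor() as cur:
    # A row count confirms the pipeline actually loaded data.
    cur.execute("SELECT count(*) FROM target_schema.orders;")
    print("rows loaded:", cur.fetchone()[0])

    # Listing the columns is an easy way to spot the metadata timestamp
    # columns added when "Adding record metadata" is enabled.
    cur.execute(
        """
        SELECT column_name
        FROM information_schema.columns
        WHERE table_schema = %s AND table_name = %s;
        """,
        ("target_schema", "orders"),
    )
    print("columns:", [row[0] for row in cur.fetchall()])
conn.close()
```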

If you encounter any issues, reach out to us via Slack, and we’ll gladly assist you!