Z-API as a data source
Bring your data from Z-API to your catalog.
1. Initial setup
The Z-API delivers messages through a webhook. This webhook must be captured by your own systems, and each message forwarded to an Amazon SQS queue. The source will then read the messages from that queue.
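The forwarding step can be sketched as a small HTTP handler that relays each webhook POST to SQS. This is only an illustrative outline, not part of the product: the queue URL is a placeholder, the handler names are our own, and it assumes boto3 with AWS credentials that allow `sqs:SendMessage` on the queue.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder -- substitute the URL of the queue you create below.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/zapi-messages"

def build_sqs_body(payload: dict) -> str:
    """Serialize a Z-API webhook payload into an SQS message body."""
    return json.dumps(payload, separators=(",", ":"))

def forward_to_sqs(payload: dict, queue_url: str) -> None:
    """Send one webhook payload to the queue the source will read from."""
    import boto3  # needs AWS credentials with sqs:SendMessage on the queue
    boto3.client("sqs").send_message(
        QueueUrl=queue_url, MessageBody=build_sqs_body(payload)
    )

class ZapiWebhookHandler(BaseHTTPRequestHandler):
    """Accepts the Z-API webhook POST and relays its JSON body to SQS."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        forward_to_sqs(payload, QUEUE_URL)
        self.send_response(200)
        self.end_headers()

# To run the receiver (blocks until stopped):
# HTTPServer(("0.0.0.0", 8080), ZapiWebhookHandler).serve_forever()
```

In practice you would add authentication and error handling, and you may prefer your existing web framework over the standard-library server; only the forward-to-SQS step matters here.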
Some important considerations when creating the SQS queue:
- Add the following tag to the queue: nekt:Service=true
- In the Access policy section, under “Define who can receive messages from the queue”, select “Only the specified AWS accounts, IAM users and roles” and add your AWS account ID (numbers only) in the text box.
- Make a note of the queue name, as you will need it when configuring the connector.
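The queue setup above can also be scripted. The sketch below uses boto3; the account ID and queue name are placeholders, and the access policy shown is an assumed minimal form of the restriction configured in the console (the exact policy your setup requires may differ).

```python
import json

ACCOUNT_ID = "123456789012"   # your AWS account ID (numbers only) -- placeholder
QUEUE_NAME = "zapi-messages"  # note this name for the connector setup -- placeholder

def build_access_policy(account_id: str, queue_arn: str) -> dict:
    """Allow only the given AWS account to receive messages from the queue."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{account_id}:root"},
            "Action": "sqs:ReceiveMessage",
            "Resource": queue_arn,
        }],
    }

def create_tagged_queue() -> str:
    """Create the queue with the required nekt:Service tag and attach the policy."""
    import boto3  # needs AWS credentials that can create and configure SQS queues
    sqs = boto3.client("sqs")
    queue_url = sqs.create_queue(
        QueueName=QUEUE_NAME, tags={"nekt:Service": "true"}
    )["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]
    sqs.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={"Policy": json.dumps(build_access_policy(ACCOUNT_ID, queue_arn))},
    )
    return queue_url
```

Creating the queue in the console as described above works just as well; the script is only a convenience if you manage infrastructure as code.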
Once this is done, you can move on to the next step.
2. Configure your Z-API connector
- In the Sources tab, click the “Add source” button at the top right of your screen. Then, select the Z-API option from the list of connectors.
- Click Next and you’ll be prompted to add the queue name.
- Enter the name of the Amazon SQS queue you created in the previous step.
- Click Next.
3. Select your Z-API streams
- The next step is letting us know which streams you want to bring. You can select entire groups of streams or only a subset of them.
Tip: You can find a stream more easily by typing its name.
- Click Next.
4. Configure your Z-API data streams
- Customize how you want your data to appear in your catalog. Select a name for each table (which will contain the fetched data) and the type of sync.
- Layer: companies on the Growth plan can choose the layer in which the tables with the extracted data will be placed.
- Table name: there’s only one stream available, which corresponds to the messages extracted from SQS.
- Click Next.
5. Configure your Z-API data source
- Describe your data source for easy identification within your organization. You can note things such as what data it brings and which team it belongs to.
- To define your Trigger, consider how often you want data to be extracted from this source. This decision usually depends on how frequently you need the new table data updated (every day, once a week, or only at specific times).
- Optionally, you can define some additional settings (if available).
- Configure Delta Log Retention to determine how long we should store old states of this table as it gets updated. Read more about this resource here.
Check your new source!
- Click Next to finalize the setup. Once completed, you’ll receive confirmation that your new source is set up!
- You can view your new source on the Sources page. Before it appears in your Catalog, you have to wait for the pipeline to run; you can monitor its execution and completion on the Sources page. If needed, trigger the pipeline manually by clicking the refresh icon. Once it has run, your new table will appear in the Catalog section.
If you encounter any issues, reach out to us via Slack, and we’ll gladly assist you!