Data Sync Issue with Airbyte from Redshift to Kafka


The user is facing an issue where Airbyte is not transferring updated data from Redshift to Kafka as expected. Despite changes in the data count at the source, the sync process does not capture and transfer the updates.


Hi, I have set up Airbyte with Redshift as the source and Kafka as the destination, and have scheduled the sync to run every hour.
On the first run it transferred all the data as expected; the data count was 17000. After that the sync runs every hour, but when the data count at the source changes due to an insert or update, we expect Airbyte to detect the changes and transfer the updates to the destination. In my case the count increased to 17200, so I expected the 200 new rows to be transferred, but that doesn't happen. It still says
Sync Succeeded 0 Bytes | no records extracted | no records loaded
What should be done to make the sync work properly?
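For context, incremental replication can be thought of roughly like this: each run reads only rows whose cursor value is greater than the saved state, then advances the state to the highest cursor seen. This is a simplified sketch of that model, not Airbyte's actual code:

```python
# Simplified model of incremental replication (illustrative, not Airbyte internals):
# each run extracts only rows past the saved cursor, then advances the state.
def incremental_sync(rows, state):
    """rows: list of (cursor_value, payload); state: last saved cursor or None."""
    new_rows = [r for r in rows if state is None or r[0] > state]
    if new_rows:
        state = max(r[0] for r in new_rows)
    return new_rows, state

# First run: no saved state, so everything is extracted.
rows = [(1, "a"), (2, "b")]
extracted, state = incremental_sync(rows, None)
# Next run: only rows whose cursor exceeds the saved state are extracted.
rows += [(3, "c")]
extracted, state = incremental_sync(rows, state)
print(len(extracted), state)  # -> 1 3
```

If no incoming row's cursor value compares greater than the saved state, the run extracts nothing, which matches the "no records extracted" result above.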

This topic has been created from a Slack thread to give it more visibility.

["data-sync", "airbyte", "redshift-connector", "kafka-connector", "data-transfer"]

Was this set up with incremental replication?

<@U021P0L72CV> PFA Connection details

Can you check the state in settings and ensure it was captured?

<@U021P0L72CV> PFA State details

<@U021P0L72CV> Do you see anything abnormal or any misconfiguration here?

I only use Redshift as a destination, but the state looks odd… That's an alphanumeric value for the transaction_id cursor?

<@U021P0L72CV> If you're talking about the data value of txn_id, it looks something like this: 51499781612823815913
It's a varchar(255) primary key.
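As an aside (my own observation, not from the thread): a 20-digit value like the one quoted is too large for a signed 64-bit integer, which may be why the column is stored as a varchar rather than a numeric type in the first place:

```python
# The quoted txn_id has 20 digits; a signed 64-bit integer tops out at 19 digits.
txn_id = 51499781612823815913
int64_max = 2**63 - 1  # 9223372036854775807
print(txn_id > int64_max)  # -> True: it cannot be stored as BIGINT
```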

<@U021P0L72CV> Ok, I got your point… I see some values in txn_id that are alphanumeric, as well as some junk values, which might be causing the issue here… fgf being one of them

Yep, if the cursor isn't numeric or a date, I'm not sure how Airbyte knows where to increment from.
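To illustrate the failure mode the thread converges on: with a varchar cursor, comparisons are lexicographic. Once a junk value such as fgf is saved as the cursor state, purely numeric txn_id strings can never compare greater than it, because in ASCII every digit sorts before every letter. A minimal sketch, using the fgf value mentioned above and hypothetical new txn_ids:

```python
# Saved cursor state is the junk value mentioned in the thread.
cursor = "fgf"
# Hypothetical new txn_id values (digit-only strings, like the quoted one).
new_ids = ["51499781612823815915", "51499781612823815916"]

# Incremental replication keeps rows where cursor_field > saved state,
# but string comparison is lexicographic: '5' (0x35) < 'f' (0x66).
picked = [i for i in new_ids if i > cursor]
print(picked)  # -> []: every digit-only string sorts before "fgf", so nothing syncs
```

This would explain a permanently "succeeding" sync that extracts zero records: the saved cursor has effectively jumped past all legitimate values.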