Zendesk Chat and Zendesk Support freeze after some hours

  • Is this your first time deploying Airbyte: No
  • OS Version / Instance: Ubuntu 20.04
  • Memory / Disk: 8 GB / 100 GB SSD
  • Deployment: Docker
  • Airbyte Version: 0.35.61-alpha
  • Source name/version: airbyte/source-zendesk-chat:0.1.6 (latest)
  • Destination name/version: airbyte/destination-bigquery:1.0.5 (latest)
  • Step: normal sync.
  • Description: every time a sync starts, it freezes after a few hours; the same happens with source-zendesk-support:0.2.3 (latest).

logs-252 zendesk support.txt (64.1 KB)

logs-253 zendesk chat.txt (428.5 KB)

Are you using staging for the BigQuery destination or normal inserts?

GCS staging, with a 5 MB block size.
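Roughly, the destination settings look like the sketch below. The field names here are only illustrative of the shape of the BigQuery destination's GCS staging options, not copied from the actual connector spec, and the values are placeholders.

```python
# Illustrative shape of the BigQuery destination settings in use;
# field names and values are placeholders, not the real connector spec.
bigquery_destination_config = {
    "project_id": "<gcp-project>",
    "dataset_id": "<dataset>",
    "loading_method": {
        "method": "GCS Staging",      # files staged in a GCS bucket, then loaded
        "gcs_bucket_name": "<bucket>",
        "block_size_mb": 5,           # the 5 MB block mentioned above
    },
}
```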

I tried streaming inserts and it freezes again. @marcosmarxm

Now I’m getting this error in Zendesk Chat 0.1.6.

That is more a warning than an error per se, sorry about that. Did the sync finish successfully or fail? If it failed, please upload the logs.

It failed with no errors; it just got stuck.

Can you show the output from docker stats? Is it possible to scale up the memory of your server?
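If it helps, here is a minimal sketch for capturing periodic docker stats snapshots into a file you can attach to the thread. The output filename, sampling interval, and number of samples are arbitrary choices for this sketch.

```python
import subprocess
import time

# Append a one-shot (non-streaming) snapshot of container resource usage
# to a file every 60 seconds, for roughly one hour of samples.
with open("docker_stats.log", "a") as out:
    for _ in range(60):
        snapshot = subprocess.run(
            ["docker", "stats", "--no-stream"],
            capture_output=True,
            text=True,
        )
        out.write(time.strftime("%Y-%m-%d %H:%M:%S") + "\n")
        out.write(snapshot.stdout + "\n")
        out.flush()
        time.sleep(60)
```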

I updated everything to the latest version and am now running only Zendesk Support:

  • source: source-zendesk-support:0.2.5
  • destination: destination-bigquery:1.1.1 (NO GCS STAGING)
  • Memory / CPU / Disk: 16 GB / 4 cores / 100 GB SSD
  • Airbyte Version: 0.36.0-alpha

It freezes again. Do you think a docker system prune could solve this?

I don’t see from your docker stats output that you have a memory/disk/IO problem. Does the sync always get stuck at the same step/records?

It’s random, but it only happens with Zendesk. I have a Google Ads connection with a lot of data running normally. I tried Zendesk again without GCS staging, but it still freezes.

Having a similar issue here with Zendesk getting stuck while syncing ticket_metrics


It seems that it managed to produce a hopefully useful log now.
logs-253.txt (79.2 KB)

@jablonskijakub your problem looks to be on the Redshift side. Please upgrade Airbyte to the latest version OR downgrade the Redshift destination connector to 0.3.28.

Hey @marcosmarxm. Syncing the results without normalisation worked. An interesting fact is that it stopped at 100K rows though… any idea why? In previous tries that failed it exceeded this number, so I think it was being closed forcibly?
logs-276.txt (59.4 KB)

It’s not clear from the logs, @jablonskijakub. If you use a start date close to today, or a smaller stream to ingest, does the sync complete?
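As a sketch of that test, the source settings could be narrowed like this. The subdomain and date below are placeholders, and the auth fields are omitted; `start_date` follows the ISO format the Zendesk Support source expects.

```python
# Narrowed source configuration for a quick test sync: a recent start_date
# limits how much history is pulled, which makes it easier to tell whether
# the hang is volume-related. Values are placeholders.
zendesk_support_test_config = {
    "subdomain": "<your-subdomain>",
    "start_date": "2022-04-01T00:00:00Z",  # a date close to "today"
    # authentication settings omitted for brevity
}
```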

Do you think we can add some logging in this file?

Yes, it’s possible, and you can build the dev version to test it locally. Do you need assistance with that?
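As a rough illustration only, assuming the connector is a Python CDK source (where streams expose a standard logging logger), extra progress logging in a stream could look like the sketch below. The class, stream path, and logged fields are examples, not the connector’s actual code.

```python
from typing import Any, Iterable, Mapping, Optional

from airbyte_cdk.sources.streams.http import HttpStream


class TicketMetrics(HttpStream):
    """Example stream with extra progress logging; names are illustrative."""

    url_base = "https://<subdomain>.zendesk.com/api/v2/"  # placeholder subdomain
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "ticket_metrics"

    def next_page_token(self, response) -> Optional[Mapping[str, Any]]:
        # Log every pagination step so a hang can be located in the sync logs.
        self.logger.info("finished page, status=%s url=%s", response.status_code, response.url)
        next_page = response.json().get("next_page")
        return {"url": next_page} if next_page else None

    def parse_response(self, response, **kwargs) -> Iterable[Mapping[str, Any]]:
        records = response.json().get("ticket_metrics", [])
        self.logger.info("parsed %d records from %s", len(records), response.url)
        yield from records
```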