Source Mixpanel - Syncs hanging / stuck for hours without any error messages

  • Is this your first time deploying Airbyte?: No
  • OS Version / Instance: Amazon Linux 2
  • Memory / Disk: t3.large EC2 instance – 8 GB memory / 30 GB EBS volume
  • Deployment: Docker on EC2
  • Airbyte Version: v0.39.9-alpha
  • Source name/version: Mixpanel / 0.1.20 and Postgres / 1.0.7
  • Destination name/version: S3 / 0.3.15
  • Step: The issue is happening during sync
  • Description:

A few of our syncs (Mixpanel → S3 and Postgres → S3) often get “stuck” for several hours and have to be canceled, yet nothing in the sync logs, server logs, or scheduler logs indicates what is happening. How can I go about troubleshooting why these syncs hang? When they run successfully they complete quickly: the Mixpanel sync generally takes ~5 minutes or less and the Postgres sync ~2 minutes or less. Lately, however, they sometimes sit in the “Running” state for 4+ hours with no error messages logged, and our only option is to cancel them and re-run them later.

Note: all of our syncs are triggered via Dagster.
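Since the syncs are triggered from Dagster, one stopgap (a hedged sketch, not a fix for the underlying hang) is to have the orchestrator poll Airbyte's job API and cancel jobs that have run far past their normal duration instead of canceling them by hand. The `AIRBYTE_URL` value and the `/api/v1/jobs/cancel` endpoint below are assumptions about a stock OSS deployment; adjust both for your setup. The timeout decision itself is a pure function:

```python
import json
import urllib.request
from datetime import datetime, timedelta

AIRBYTE_URL = "http://localhost:8000"  # assumption: default Airbyte OSS port


def is_stuck(started_at: datetime, now: datetime, max_minutes: int = 30) -> bool:
    """A sync that normally finishes in ~5 min is clearly stuck well before 30 min."""
    return now - started_at > timedelta(minutes=max_minutes)


def _post(path: str, payload: dict) -> dict:
    # Assumption: Airbyte OSS config API shape circa v0.40.x (JSON POST bodies).
    req = urllib.request.Request(
        f"{AIRBYTE_URL}{path}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


def cancel_if_stuck(job_id: int, started_at: datetime) -> None:
    # Cancel a hung sync so the orchestrator can retry it on a fresh attempt.
    if is_stuck(started_at, datetime.utcnow()):
        _post("/api/v1/jobs/cancel", {"id": job_id})
```

This doesn't explain the hang, but it bounds the blast radius: a stuck run costs minutes instead of hours before the retry kicks in.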

Edit: After canceling and retrying the syncs, the Postgres sync completed in ~2 minutes, but the Mixpanel sync is still hanging – specifically on the export stream:

2022-09-09 16:33:58 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.compression_codec: is missing but it is required, $.format_type: does not have a value in the enumeration [Avro]
2022-09-09 16:33:58 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.compression_type: does not have a value in the enumeration [No Compression]
2022-09-09 16:33:58 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/182/0/logs.log
2022-09-09 16:33:58 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.9-alpha
2022-09-09 16:33:58 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-mixpanel:0.1.20 exists...
2022-09-09 16:33:58 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-mixpanel:0.1.20 was found locally.
2022-09-09 16:33:58 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 182
2022-09-09 16:33:58 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/182/0 --log-driver none --name source-mixpanel-check-182-0-imqzi --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/source-mixpanel:0.1.20 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.9-alpha -e WORKER_JOB_ID=182 airbyte/source-mixpanel:0.1.20 check --config source_config.json
2022-09-09 16:33:59 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):97 - Check succeeded
2022-09-09 16:34:00 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-09-09 16:34:00 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/182/0/logs.log
2022-09-09 16:34:00 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.9-alpha
2022-09-09 16:34:00 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-s3:0.3.15 exists...
2022-09-09 16:34:00 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-s3:0.3.15 was found locally.
2022-09-09 16:34:00 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 182
2022-09-09 16:34:00 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/182/0 --log-driver none --name destination-s3-check-182-0-iskxi --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/destination-s3:0.3.15 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.9-alpha -e WORKER_JOB_ID=182 airbyte/destination-s3:0.3.15 check --config source_config.json
2022-09-09 16:34:03 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:03 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-09-09 16:34:03 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:03 INFO i.a.i.b.IntegrationRunner(runInternal):104 - Running integration: io.airbyte.integrations.destination.s3.S3Destination
2022-09-09 16:34:03 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:03 INFO i.a.i.b.IntegrationRunner(runInternal):105 - Command: CHECK
2022-09-09 16:34:03 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:03 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword requires - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"flattening":"No flattening","compression":{"compression_type":"GZIP"},"format_type":"CSV"}
2022-09-09 16:34:04 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:04 INFO i.a.i.d.s.S3DestinationConfig(createS3Client):190 - Creating S3 client...
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):102 - Storage Object bu-dagster-data-production-000065427292/airbyte_output/mixpanel does not exist in bucket; creating...
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):104 - Storage Object bu-dagster-data-production-000065427292/airbyte_output/mixpanel has been created in bucket.
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):156 - Started testing if IAM user can call listObjects on the destination bucket
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):159 - Finished checking for listObjects permission
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3Destination(testSingleUpload):81 - Started testing if all required credentials assigned to user for single file uploading
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3Destination(testSingleUpload):91 - Finished checking for normal upload mode
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO i.a.i.d.s.S3Destination(testMultipartUpload):95 - Started testing if all required credentials assigned to user for multipart upload
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1662741246756 with full ID m47mYRAPDc6.jN2eMzJnU0byzpI7SSPTZhV76T75IN8CN.GolqipKBsHgQF8tXhR5BIuKuz9BnPgU8BA.JbHGQXqcNUAy2QoFmNjX.y4I_GxcBACuOTftRZlGcg9Xw75
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-09-09 16:34:06 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:06 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1662741246756 with id m47mYRAPD...lGcg9Xw75]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2022-09-09 16:34:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:07 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1662741246756 with id m47mYRAPD...lGcg9Xw75]: Finished uploading [Part number 1 containing 3.34 MB]
2022-09-09 16:34:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:07 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1662741246756 with id m47mYRAPD...lGcg9Xw75]: Completed
2022-09-09 16:34:07 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2022-09-09 16:34:07 INFO i.a.i.d.s.S3Destination(testMultipartUpload):119 - Finished verification for multipart upload mode
2022-09-09 16:34:07 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
2022-09-09 16:34:07 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/182/0/logs.log
2022-09-09 16:34:07 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: 0.39.9-alpha
2022-09-09 16:34:08 INFO i.a.w.g.DefaultReplicationWorker(run):115 - start sync worker. job id: 182 attempt id: 0
2022-09-09 16:34:08 INFO i.a.w.g.DefaultReplicationWorker(run):127 - configured sync modes: {null.engage=full_refresh - append, null.export=incremental - append}
2022-09-09 16:34:08 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-09-09 16:34:08 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-s3:0.3.15 exists...
2022-09-09 16:34:08 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-s3:0.3.15 was found locally.
2022-09-09 16:34:08 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 182
2022-09-09 16:34:08 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/182/0 --log-driver none --name destination-s3-write-182-0-jvrud --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/destination-s3:0.3.15 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.9-alpha -e WORKER_JOB_ID=182 airbyte/destination-s3:0.3.15 write --config destination_config.json --catalog destination_catalog.json
2022-09-09 16:34:08 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/source-mixpanel:0.1.20 exists...
2022-09-09 16:34:08 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 182
2022-09-09 16:34:08 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/source-mixpanel:0.1.20 was found locally.
2022-09-09 16:34:08 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/182/0 --log-driver none --name source-mixpanel-read-182-0-ssdak --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e WORKER_CONNECTOR_IMAGE=airbyte/source-mixpanel:0.1.20 -e WORKER_JOB_ATTEMPT=0 -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e AIRBYTE_VERSION=0.39.9-alpha -e WORKER_JOB_ID=182 airbyte/source-mixpanel:0.1.20 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-09-09 16:34:08 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):395 - Destination output thread started.
2022-09-09 16:34:08 INFO i.a.w.g.DefaultReplicationWorker(run):169 - Waiting for source and destination threads to complete.
2022-09-09 16:34:08 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):298 - Replication thread started.
2022-09-09 16:34:09 source > Starting syncing SourceMixpanel
2022-09-09 16:34:09 source > Using start_date: 2021-10-01, end_date: 2022-09-09
2022-09-09 16:34:09 source > Syncing stream: engage
2022-09-09 16:34:12 destination > 2022-09-09 16:34:12 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-09-09 16:34:12 destination > 2022-09-09 16:34:12 INFO i.a.i.b.IntegrationRunner(runInternal):104 - Running integration: io.airbyte.integrations.destination.s3.S3Destination
2022-09-09 16:34:12 destination > 2022-09-09 16:34:12 INFO i.a.i.b.IntegrationRunner(runInternal):105 - Command: WRITE
2022-09-09 16:34:12 destination > 2022-09-09 16:34:12 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword requires - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"flattening":"No flattening","compression":{"compression_type":"GZIP"},"format_type":"CSV"}
2022-09-09 16:34:13 destination > 2022-09-09 16:34:13 INFO i.a.i.d.s.S3DestinationConfig(createS3Client):190 - Creating S3 client...
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.s.SerializedBufferFactory(getCreateFunction):49 - S3 format config: S3CsvFormatConfig{flattening=NO, compression=GZIP}
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.s.S3ConsumerFactory(lambda$toWriteConfig$0):83 - Write config: WriteConfig{streamName=engage, namespace=null, outputBucketPath=airbyte_output/mixpanel, pathFormat=airbyte_output/mixpanel/${STREAM_NAME}/airbyte_sync_dt=${YEAR}-${MONTH}-${DAY}/${EPOCH}_, fullOutputPath=airbyte_output/mixpanel/engage/airbyte_sync_dt=2022-09-09/1662741253378_, syncMode=append}
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.s.S3ConsumerFactory(lambda$toWriteConfig$0):83 - Write config: WriteConfig{streamName=export, namespace=null, outputBucketPath=airbyte_output/mixpanel, pathFormat=airbyte_output/mixpanel/${STREAM_NAME}/airbyte_sync_dt=${YEAR}-${MONTH}-${DAY}/${EPOCH}_, fullOutputPath=airbyte_output/mixpanel/export/airbyte_sync_dt=2022-09-09/1662741253378_, syncMode=append}
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.s.S3ConsumerFactory(lambda$onStartFunction$1):90 - Preparing bucket in destination started for 2 streams
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.s.S3ConsumerFactory(lambda$onStartFunction$1):104 - Preparing storage area in destination completed.
2022-09-09 16:34:14 destination > 2022-09-09 16:34:14 INFO i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):47 - Starting a new buffer for stream engage (current state: 0 bytes in 0 buffers)
2022-09-09 16:35:13 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):344 - Records read: 1000 (417 KB)
2022-09-09 16:36:15 source > Read 1870 records from engage stream
2022-09-09 16:36:15 source > Finished syncing engage
2022-09-09 16:36:15 source > SourceMixpanel runtimes:
Syncing stream engage 0:02:05.512180
2022-09-09 16:36:15 source > Syncing stream: export

Hello,

Could you try upgrading to a newer version of Airbyte to see if that resolves the issue? We are currently at v0.40.4.

Please let us know if you are still running into these hanging syncs after upgrading. Thank you!

The issue persists even after upgrading to the latest version (v0.40.6).

The issue primarily happens with the source-mixpanel connector – specifically for the export stream.

We also ingest the engage stream, and it reads successfully; the sync hangs once the connection starts reading the export stream.

In the logs, I see the following message and then no activity after that:

“Syncing stream engage 0:02:04.154987
2022-09-14 15:25:33 source > Syncing stream: export”
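To narrow down whether the hang is in the connector or in Mixpanel's raw export API itself, it can help to hit the export endpoint directly for one small date window. The probe below is a hypothetical sketch, not the connector's code; the endpoint (`https://data.mixpanel.com/api/2.0/export/`) and basic auth with the project API secret follow Mixpanel's raw export docs as I understand them, and `date_windows` mirrors the idea of slicing the `start_date`/`end_date` range into small requests:

```python
import base64
import urllib.request
from datetime import date, timedelta


def date_windows(start: date, end: date, days: int = 1):
    """Split the inclusive range [start, end] into windows of at most `days` days."""
    cur = start
    while cur <= end:
        win_end = min(cur + timedelta(days=days - 1), end)
        yield cur, win_end
        cur = win_end + timedelta(days=1)


def probe_export(api_secret: str, frm: date, to: date, timeout: int = 60) -> int:
    # Hypothetical probe: fetch one window and count newline-delimited events.
    # A hang or timeout here points at the Mixpanel API rather than Airbyte.
    url = f"https://data.mixpanel.com/api/2.0/export/?from_date={frm}&to_date={to}"
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{api_secret}:".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return sum(1 for line in resp if line.strip())
```

If a one-day probe also stalls, the bottleneck is likely the Mixpanel export API (or rate limiting) for that project; if it returns quickly, the connector or worker side is the more likely culprit.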

2022-09-14 15:23:17 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.compression_codec: is missing but it is required, $.format_type: does not have a value in the enumeration [Avro]
2022-09-14 15:23:17 INFO i.a.v.j.JsonSchemaValidator(test):71 - JSON schema validation failed.
errors: $.compression_type: does not have a value in the enumeration [No Compression]
2022-09-14 15:23:17 INFO i.a.w.t.TemporalAttemptExecution(get):106 - Docker volume job log path: /tmp/workspace/200/0/logs.log
2022-09-14 15:23:17 INFO i.a.w.t.TemporalAttemptExecution(get):111 - Executing worker wrapper. Airbyte version: 0.40.6
2022-09-14 15:23:17 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:17 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2022-09-14 15:23:17 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:17 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mixpanel:0.1.21 exists...
2022-09-14 15:23:17 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mixpanel:0.1.21 not found locally. Attempting to pull the image...
2022-09-14 15:23:21 INFO i.a.c.i.LineGobbler(voidCall):114 - Pulled airbyte/source-mixpanel:0.1.21 from remote.
2022-09-14 15:23:21 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mixpanel-check-200-0-umkec with resources io.airbyte.config.ResourceRequirements@485e9910[cpuRequest=<null>,cpuLimit=<null>,memoryRequest=<null>,memoryLimit=<null>]
2022-09-14 15:23:21 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/200/0 --log-driver none --name source-mixpanel-check-200-0-umkec --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-mixpanel:0.1.21 -e AIRBYTE_VERSION=0.40.6 -e WORKER_JOB_ID=200 airbyte/source-mixpanel:0.1.21 check --config source_config.json
2022-09-14 15:23:23 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):103 - Check succeeded
2022-09-14 15:23:23 INFO i.a.w.t.TemporalAttemptExecution(get):132 - Stopping cancellation check scheduling...
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:23 INFO i.a.w.t.TemporalAttemptExecution(get):106 - Docker volume job log path: /tmp/workspace/200/0/logs.log
2022-09-14 15:23:23 INFO i.a.w.t.TemporalAttemptExecution(get):111 - Executing worker wrapper. Airbyte version: 0.40.6
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-s3:0.3.15 exists...
2022-09-14 15:23:23 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-s3:0.3.15 was found locally.
2022-09-14 15:23:23 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-s3-check-200-0-cworf with resources io.airbyte.config.ResourceRequirements@485e9910[cpuRequest=<null>,cpuLimit=<null>,memoryRequest=<null>,memoryLimit=<null>]
2022-09-14 15:23:23 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/200/0 --log-driver none --name destination-s3-check-200-0-cworf --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-s3:0.3.15 -e AIRBYTE_VERSION=0.40.6 -e WORKER_JOB_ID=200 airbyte/destination-s3:0.3.15 check --config source_config.json
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.b.IntegrationRunner(runInternal):104 - Running integration: io.airbyte.integrations.destination.s3.S3Destination
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.b.IntegrationRunner(runInternal):105 - Command: CHECK
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword requires - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"flattening":"No flattening","compression":{"compression_type":"GZIP"},"format_type":"CSV"}
2022-09-14 15:23:25 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:25 INFO i.a.i.d.s.S3DestinationConfig(createS3Client):190 - Creating S3 client...
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):102 - Storage Object bu-dagster-data-production-000065427292/airbyte_output/mixpanel does not exist in bucket; creating...
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3StorageOperations(createBucketObjectIfNotExists):104 - Storage Object bu-dagster-data-production-000065427292/airbyte_output/mixpanel has been created in bucket.
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):156 - Started testing if IAM user can call listObjects on the destination bucket
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3Destination(testIAMUserHasListObjectPermission):159 - Finished checking for listObjects permission
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3Destination(testSingleUpload):81 - Started testing if all required credentials assigned to user for single file uploading
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3Destination(testSingleUpload):91 - Finished checking for normal upload mode
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO i.a.i.d.s.S3Destination(testMultipartUpload):95 - Started testing if all required credentials assigned to user for multipart upload
2022-09-14 15:23:26 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:26 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - Initiated multipart upload to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1663169006899 with full ID tQ9fTj922xLRT30ogD2_IJk4tLsewMLjlfZQ6.LuLpm54pstiYCBngMrv9P1UhtAJOtLKe6s23ZFxq1izumGyIQzAC8lm9Ynzz8pJ2K_48PjcStLrHPRLt2eg.65K1Mb
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO a.m.s.MultiPartOutputStream(close):158 - Called close() on [MultipartOutputStream for parts 1 - 10000]
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 WARN a.m.s.MultiPartOutputStream(close):160 - [MultipartOutputStream for parts 1 - 10000] is already closed
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO a.m.s.StreamTransferManager(complete):367 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1663169006899 with id tQ9fTj922...eg.65K1Mb]: Uploading leftover stream [Part number 1 containing 3.34 MB]
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO a.m.s.StreamTransferManager(uploadStreamPart):558 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1663169006899 with id tQ9fTj922...eg.65K1Mb]: Finished uploading [Part number 1 containing 3.34 MB]
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO a.m.s.StreamTransferManager(complete):395 - [Manager uploading to bu-dagster-data-production-000065427292/airbyte_output/mixpanel/test_1663169006899 with id tQ9fTj922...eg.65K1Mb]: Completed
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO i.a.i.d.s.S3Destination(testMultipartUpload):119 - Finished verification for multipart upload mode
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):62 - 2022-09-14 15:23:27 INFO i.a.i.b.IntegrationRunner(runInternal):152 - Completed integration: io.airbyte.integrations.destination.s3.S3Destination
2022-09-14 15:23:27 INFO i.a.w.t.TemporalAttemptExecution(get):132 - Stopping cancellation check scheduling...
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:27 INFO i.a.w.t.TemporalAttemptExecution(get):106 - Docker volume job log path: /tmp/workspace/200/0/logs.log
2022-09-14 15:23:27 INFO i.a.w.t.TemporalAttemptExecution(get):111 - Executing worker wrapper. Airbyte version: 0.40.6
2022-09-14 15:23:27 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):49 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2022-09-14 15:23:27 INFO i.a.c.EnvConfigs(getEnvOrDefault):1096 - Using default value for environment variable METRIC_CLIENT: ''
2022-09-14 15:23:27 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to 
2022-09-14 15:23:27 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):49 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2022-09-14 15:23:27 INFO i.a.w.g.DefaultReplicationWorker(run):122 - start sync worker. job id: 200 attempt id: 0
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - 
2022-09-14 15:23:27 INFO i.a.w.g.DefaultReplicationWorker(run):135 - configured sync modes: {null.engage=full_refresh - append, null.export=incremental - append}
2022-09-14 15:23:27 INFO i.a.w.i.DefaultAirbyteDestination(start):69 - Running destination...
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-s3:0.3.15 exists...
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-s3:0.3.15 was found locally.
2022-09-14 15:23:27 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-s3-write-200-0-ymtcx with resources io.airbyte.config.ResourceRequirements@79291ed2[cpuRequest=<null>,cpuLimit=<null>,memoryRequest=<null>,memoryLimit=<null>]
2022-09-14 15:23:27 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/200/0 --log-driver none --name destination-s3-write-200-0-ymtcx --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-s3:0.3.15 -e AIRBYTE_VERSION=0.40.6 -e WORKER_JOB_ID=200 airbyte/destination-s3:0.3.15 write --config destination_config.json --catalog destination_catalog.json
2022-09-14 15:23:27 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-mixpanel:0.1.21 exists...
2022-09-14 15:23:27 e[32mINFOe[m i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-mixpanel:0.1.21 was found locally.
2022-09-14 15:23:27 e[32mINFOe[m i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-mixpanel-read-200-0-fjplu with resources io.airbyte.config.ResourceRequirements@73d8b166[cpuRequest=<null>,cpuLimit=<null>,memoryRequest=<null>,memoryLimit=<null>]
2022-09-14 15:23:27 e[32mINFOe[m i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/200/0 --log-driver none --name source-mixpanel-read-200-0-fjplu --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-mixpanel:0.1.21 -e AIRBYTE_VERSION=0.40.6 -e WORKER_JOB_ID=200 airbyte/source-mixpanel:0.1.21 read --config source_config.json --catalog source_catalog.json --state input_state.json
2022-09-14 15:23:27 e[32mINFOe[m i.a.w.g.DefaultReplicationWorker(run):177 - Waiting for source and destination threads to complete.
2022-09-14 15:23:27 e[32mINFOe[m i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):435 - Destination output thread started.
2022-09-14 15:23:27 e[32mINFOe[m i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):322 - Replication thread started.
2022-09-14 15:23:28 e[44msourcee[0m > Starting syncing SourceMixpanel
2022-09-14 15:23:29 e[44msourcee[0m > Using start_date: 2021-10-01, end_date: 2022-09-14
2022-09-14 15:23:29 e[44msourcee[0m > Syncing stream: engage 
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.b.IntegrationRunner(runInternal):104 - Running integration: io.airbyte.integrations.destination.s3.S3Destination
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.b.IntegrationRunner(runInternal):105 - Command: WRITE
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.b.IntegrationRunner(runInternal):106 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[33mWARNe[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[33mWARNe[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[33mWARNe[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[33mWARNe[m c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword requires - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.d.s.S3FormatConfigs(getS3FormatConfig):22 - S3 format config: {"flattening":"No flattening","compression":{"compression_type":"GZIP"},"format_type":"CSV"}
2022-09-14 15:23:30 e[43mdestinatione[0m > 2022-09-14 15:23:30 e[32mINFOe[m i.a.i.d.s.S3DestinationConfig(createS3Client):190 - Creating S3 client...
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.s.SerializedBufferFactory(getCreateFunction):49 - S3 format config: S3CsvFormatConfig{flattening=NO, compression=GZIP}
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.s.S3ConsumerFactory(lambda$toWriteConfig$0):83 - Write config: WriteConfig{streamName=engage, namespace=null, outputBucketPath=airbyte_output/mixpanel, pathFormat=airbyte_output/mixpanel/${STREAM_NAME}/airbyte_sync_dt=${YEAR}-${MONTH}-${DAY}/${EPOCH}_, fullOutputPath=airbyte_output/mixpanel/engage/airbyte_sync_dt=2022-09-14/1663169010337_, syncMode=append}
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.s.S3ConsumerFactory(lambda$toWriteConfig$0):83 - Write config: WriteConfig{streamName=export, namespace=null, outputBucketPath=airbyte_output/mixpanel, pathFormat=airbyte_output/mixpanel/${STREAM_NAME}/airbyte_sync_dt=${YEAR}-${MONTH}-${DAY}/${EPOCH}_, fullOutputPath=airbyte_output/mixpanel/export/airbyte_sync_dt=2022-09-14/1663169010337_, syncMode=append}
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.b.BufferedStreamConsumer(startTracked):116 - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.s.S3ConsumerFactory(lambda$onStartFunction$1):90 - Preparing bucket in destination started for 2 streams
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.s.S3ConsumerFactory(lambda$onStartFunction$1):104 - Preparing storage area in destination completed.
2022-09-14 15:23:31 e[43mdestinatione[0m > 2022-09-14 15:23:31 e[32mINFOe[m i.a.i.d.r.SerializedBufferingStrategy(lambda$addRecord$0):47 - Starting a new buffer for stream engage (current state: 0 bytes in 0 buffers)
2022-09-14 15:24:32 e[32mINFOe[m i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):352 - Records read: 1000 (420 KB)
2022-09-14 15:25:33 e[44msourcee[0m > Read 1897 records from engage stream
2022-09-14 15:25:33 e[44msourcee[0m > Finished syncing engage
2022-09-14 15:25:33 e[44msourcee[0m > SourceMixpanel runtimes:
Syncing stream engage 0:02:04.154987
2022-09-14 15:25:33 e[44msourcee[0m > Syncing stream: export 

Hi @pmehta,

Thanks for your post. Can you upgrade to the latest version of the connector and report back on whether this is still an issue?
The `export` stream has dynamically generated schemas, and the API calls that generate those schemas can fail for unpaid accounts.

This may or may not solve your issue, but it's worth a try. Please report back with an update!
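Since your syncs are triggered from Dagster, one stopgap while you chase the root cause is a client-side watchdog: poll the job's status and cancel it once it runs well past its normal duration (your Mixpanel sync normally finishes in ~5 minutes, so a 30-minute timeout leaves plenty of headroom). Below is a minimal, hedged sketch; `get_status` and `cancel_job` are hypothetical callables you would implement yourself (e.g. thin wrappers around Airbyte's jobs API), not part of any Airbyte or Dagster library.

```python
import time
from typing import Callable

def watch_sync(
    get_status: Callable[[], str],      # hypothetical: returns "running", "succeeded", or "failed"
    cancel_job: Callable[[], None],     # hypothetical: cancels the hung Airbyte job
    timeout_s: float = 1800,            # cancel after 30 min; tune to your sync's normal runtime
    poll_s: float = 30,                 # how often to re-check the job status
    clock: Callable[[], float] = time.monotonic,   # injectable for testing
    sleep: Callable[[float], None] = time.sleep,   # injectable for testing
) -> str:
    """Poll a sync job until it finishes; cancel it if it exceeds timeout_s."""
    start = clock()
    while True:
        status = get_status()
        if status != "running":
            # Job reached a terminal state on its own.
            return status
        if clock() - start > timeout_s:
            # Job is presumed hung: cancel it and report that.
            cancel_job()
            return "cancelled"
        sleep(poll_s)
```

From a Dagster op you would call `watch_sync` right after triggering the connection, so a hung `export` stream gets cancelled automatically instead of sitting in "Running" for hours.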