Sync Succeeded: 0 Bytes | no records | no records

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: Ubuntu AWS EC2 Instance
  • Memory / Disk: 20 GB
  • Deployment: Docker
  • Airbyte Version: 0.40.17
  • Source name/version: Custom Python Source - From AWS Data Exchange
  • Destination name/version: Local CSV
  • Step: The issue is happening during sync.
  • Description: I have created a custom Python source for the AWS Data Exchange API; check, discover, and read are implemented according to the given guidelines. But when I sync the connection, the job succeeds, yet no data is written to the destination. Only the column headers (_airbyte_id, _airbyte_data, _airbyte_emitted_at) are written to the destination file. Below is the log:
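For comparison, here is a minimal sketch of the record envelope a Python source's read step is expected to emit, one JSON message per line on stdout (field names are from the Airbyte protocol; the stream name below is copied from the "configured sync modes" line in the log). Note that the "stream" field must exactly match the stream name declared in the catalog, and the row payload must sit under "data". "Total records read: 0" despite the source printing RECORD messages is commonly caused by a stream-name mismatch or a malformed envelope:

```python
import json
import time

def record_message(stream_name: str, data: dict) -> dict:
    """Build an Airbyte protocol RECORD message.

    The worker only counts records whose "stream" matches a stream
    configured in the catalog; the actual row goes under "data".
    """
    return {
        "type": "RECORD",
        "record": {
            "stream": stream_name,
            "data": data,
            "emitted_at": int(time.time() * 1000),  # epoch milliseconds
        },
    }

# read() writes one JSON message per line to stdout; the stream name
# here is taken from the log, the row content is an illustrative stub.
msg = record_message(
    "EVA and Equity EVA by Industry",
    {"industry": "Technology", "eva": 1.23},
)
print(json.dumps(msg))
```

Comparing this shape against the raw RECORD line your source actually prints (and against the stream name in destination_catalog.json) is a quick way to narrow down why the worker counted zero records.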

Log Text:

2023-05-19 13:08:36 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/5/0/logs.log
2023-05-19 13:08:36 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: 0.40.17
2023-05-19 13:08:36 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2023-05-19 13:08:36 INFO i.a.c.EnvConfigs(getEnvOrDefault):1079 - Using default value for environment variable METRIC_CLIENT: ''
2023-05-19 13:08:36 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
2023-05-19 13:08:36 INFO i.a.c.f.EnvVariableFeatureFlags(getEnvOrDefault):50 - Using default value for environment variable LOG_CONNECTOR_MESSAGES: 'false'
2023-05-19 13:08:36 INFO i.a.w.g.DefaultReplicationWorker(run):125 - start sync worker. job id: 5 attempt id: 0
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-05-19 13:08:36 INFO i.a.w.g.DefaultReplicationWorker(run):141 - configured sync modes: {null.EVA and Equity EVA by Industry=full_refresh - overwrite}
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-05-19 13:08:36 INFO i.a.w.i.DefaultAirbyteDestination(start):65 - Running destination...
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-csv:0.2.10 exists...
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-csv:0.2.10 was found locally.
2023-05-19 13:08:36 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = destination-csv-write-5-0-dmmsd with resources io.airbyte.config.ResourceRequirements@988a58a[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-05-19 13:08:36 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/5/0 --log-driver none --name destination-csv-write-5-0-dmmsd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/destination-csv:0.2.10 -e AIRBYTE_VERSION=0.40.17 -e WORKER_JOB_ID=5 airbyte/destination-csv:0.2.10 write --config destination_config.json --catalog destination_catalog.json
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-aws-data-exchange-api:dev exists...
2023-05-19 13:08:36 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-aws-data-exchange-api:dev was found locally.
2023-05-19 13:08:36 INFO i.a.w.p.DockerProcessFactory(create):119 - Creating docker container = source-aws-data-exchange-api-read-5-0-satyx with resources io.airbyte.config.ResourceRequirements@4347654e[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-05-19 13:08:36 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/5/0 --log-driver none --name source-aws-data-exchange-api-read-5-0-satyx --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e WORKER_JOB_ATTEMPT=0 -e WORKER_CONNECTOR_IMAGE=airbyte/source-aws-data-exchange-api:dev -e AIRBYTE_VERSION=0.40.17 -e WORKER_JOB_ID=5 airbyte/source-aws-data-exchange-api:dev read --config source_config.json --catalog source_catalog.json
2023-05-19 13:08:36 INFO i.a.w.g.DefaultReplicationWorker(run):185 - Waiting for source and destination threads to complete.
2023-05-19 13:08:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$getDestinationOutputRunnable$7):454 - Destination output thread started.
2023-05-19 13:08:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):339 - Replication thread started.
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.destination.csv.CsvDestination
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: WRITE
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-05-19 13:08:38 destination > 2023-05-19 13:08:38 INFO i.a.i.d.c.CsvDestination$CsvConsumer(<init>):140 - initializing consumer.
2023-05-19 13:08:39 source > {"type": "RECORD", "record": {"stream": **TOTAL DATA HERE**}}
2023-05-19 13:08:39 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):372 - Source has no more messages, closing connection.
2023-05-19 13:08:39 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):381 - Total records read: 0 (0 bytes)
2023-05-19 13:08:39 INFO i.a.w.g.DefaultReplicationWorker(run):190 - One of source or destination thread complete. Waiting on the other.
2023-05-19 13:08:39 destination > 2023-05-19 13:08:39 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):65 - Airbyte message consumer: succeeded.
2023-05-19 13:08:39 destination > 2023-05-19 13:08:39 INFO i.a.i.d.c.CsvDestination$CsvConsumer(close):179 - finalizing consumer