How to debug connections

Hi there :slightly_smiling_face:
I'm having a problem and I would like to debug it to understand what's going on.
I built a custom connector (source).
Then I created a connection from that source to the Local CSV destination, and it works!
For my second custom connector (source),
I followed the same steps, but now I get an error when writing to the destination:


at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:164) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
	Suppressed: io.airbyte.workers.WorkerException: Destination process exit with code 1. This warning is normal if the job was cancelled.
		at io.airbyte.workers.protocols.airbyte.DefaultAirbyteDestination.close(DefaultAirbyteDestination.java:119) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
		at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: io.airbyte.workers.DefaultReplicationWorker$DestinationException: Destination process exited with non-zero exit code 1
	at io.airbyte.workers.DefaultReplicationWorker.lambda$getDestinationOutputRunnable$6(DefaultReplicationWorker.java:354) ~[io.airbyte-airbyte-workers-0.36.3-alpha.jar:?]
	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
	... 1 more

So, is there a way to debug a connection locally? Thanks!

Hey @theArg,
To make sure your source connector's read works, you can run it locally in your connector container:
docker run --rm -i airbyte/source-<your-source>:dev read --config airbyte-integrations/connectors/source-<your-source>/secrets/config.json --catalog airbyte-integrations/connectors/source-<your-source>/integration_tests/catalog.json

Let me know what this command outputs :pray:
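If it helps, the output of that read command can be piped through a small script to see which streams actually emit records. This is a rough sketch, assuming the standard Airbyte protocol format (one JSON message per line on stdout); the helper name and the sample lines are made up for illustration:

```python
import json

def summarize_airbyte_output(lines):
    """Group Airbyte protocol messages by type and count RECORD messages
    per stream, skipping any non-JSON log lines the connector prints."""
    records_per_stream = {}
    other_types = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            message = json.loads(line)
        except json.JSONDecodeError:
            continue  # connectors may also print plain-text log lines
        msg_type = message.get("type")
        if msg_type == "RECORD":
            stream = message.get("record", {}).get("stream")
            records_per_stream[stream] = records_per_stream.get(stream, 0) + 1
        else:
            other_types[msg_type] = other_types.get(msg_type, 0) + 1
    return records_per_stream, other_types

# Hypothetical sample of two output lines from a `read` run:
sample = [
    '{"type": "RECORD", "record": {"stream": "raw", "data": {"raw": "..."}, "emitted_at": 0}}',
    '{"type": "LOG", "log": {"level": "INFO", "message": "Reading..."}}',
]
print(summarize_airbyte_output(sample))  # ({'raw': 1}, {'LOG': 1})
```

You could feed it the docker command's stdout (e.g. `docker run ... read ... | python summarize.py`) to see at a glance which stream names your source is emitting.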

Hi @alafanechere,
The source works as expected, but when I create a connection from my second custom source to Local CSV, that is when it fails.

By running the read command I already see the records being emitted. I'm interested in debugging a connection (source → destination) to see why it is not able to write.

Could you please share your full sync logs and your catalog + configured catalog?

catalog.json

{
  "streams": [
    {
      "stream": {
        "name": "jrct",
        "json_schema": {
          "$schema": "http://json-schema.org/draft-07/schema#",
          "title": "JRCT clinical trials",
          "type": "object",
          "supported_sync_modes": [
            "full_refresh"
          ],
          "properties": {
            "raw": {
              "type": "string"
            }
          }
        }
      },
      "sync_mode": "full_refresh",
      "destination_sync_mode": "overwrite"
    }
  ]
}
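As a quick sanity check, you can list which stream names a catalog actually declares. A minimal sketch, with the catalog above trimmed to the fields that matter here:

```python
# The catalog posted above, trimmed to the relevant fields.
catalog = {
    "streams": [
        {
            "stream": {"name": "jrct", "json_schema": {"type": "object"}},
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

declared_streams = [entry["stream"]["name"] for entry in catalog["streams"]]
print(declared_streams)  # ['jrct']
```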

Sync logs

logs-50.txt (65.2 KB)

Your connector outputs records on a stream called raw, which is not declared in the catalog.
It corresponds to the following error in your logs:
Exception in thread "main" java.lang.IllegalArgumentException: Message contained record from a stream that was not in the catalog.

You need to declare every stream's schema in your catalog.json
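This mismatch is easy to reproduce with a small check. A sketch (the helper name and sample data are made up, using the stream names from this thread): it compares the stream names in emitted RECORD messages against the names declared in the catalog, which is the condition behind the "record from a stream that was not in the catalog" error.

```python
def find_undeclared_streams(record_messages, catalog):
    """Return stream names that appear in RECORD messages but are not
    declared in the catalog's `streams` list."""
    declared = {entry["stream"]["name"] for entry in catalog["streams"]}
    emitted = {
        msg["record"]["stream"]
        for msg in record_messages
        if msg.get("type") == "RECORD"
    }
    return sorted(emitted - declared)

# The situation in this thread: catalog declares "jrct",
# but the source emits records on "raw".
catalog = {"streams": [{"stream": {"name": "jrct"}}]}
records = [{"type": "RECORD", "record": {"stream": "raw", "data": {}}}]
print(find_undeclared_streams(records, catalog))  # ['raw']
```

An empty result means every emitted stream is declared and the sync should get past this particular error.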

I had the wrong stream name in my discover output. Now it's working, many thanks!