Using source and destination connectors without the web app

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: MacOS
  • Memory / Disk: 16 GB / 500 GB
  • Deployment: Docker
  • Airbyte Version: What version are you using now?
  • Source name/version: Google Ads
  • Destination name/version: Postgres
  • Step: sync
  • Description:


I’m trying to use the source and destination connectors without the web app. In my example, I’m pulling data from Google Ads, using the Google Ads source, and loading it into a local Postgres DB.

On the source side, everything works fine:
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests sigma-google-ads read --config /secrets/config.json --catalog /integration_tests/test_catalog.json > data.json

The data.json file has all of the data I requested. From reading the docs, my assumption was that I could then pipe data.json to the destination connector to send the data to Postgres:
cat data.json | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files postgres-airbyte write --config /secrets/config.json --catalog /sample_files/configured_catalog.json

The connector creates the table in Postgres, but it stays empty.
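Before blaming the destination, it's worth checking that data.json actually contains RECORD messages, since those are the only ones a destination writes. A minimal sketch, assuming the standard Airbyte protocol output (newline-delimited JSON, each message carrying a "type" field; count_message_types is just a name I made up):

```python
import json
from collections import Counter

def count_message_types(lines):
    """Tally Airbyte protocol message types (RECORD, STATE, LOG, ...)."""
    counts = Counter()
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            msg = json.loads(line)
        except json.JSONDecodeError:
            continue  # connectors can emit stray non-JSON log lines
        counts[msg.get("type", "UNKNOWN")] += 1
    return counts
```

Running it over the source output, e.g. `count_message_types(open("data.json"))`, tells you quickly whether the destination had any records to write in the first place.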
I found a GitHub issue similar to mine: No data is persisted when running source and destination when running connector image directly · Issue #7196 · airbytehq/airbyte · GitHub
But it seems to still be open.
I’m sure I’m doing something wrong / stupid, but I’m a bit lost: what is the right way to run the connector images directly, instead of through the app?


Hey, could you comment on the issue so that the team can respond there?

Sure. Added a comment on that issue.

For future reference, if anyone else is lost: I was being dumb and forgot the -i flag, which tells Docker to attach stdin so the container can read the piped JSON stream:
cat data.json | docker run --rm -i -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files postgres-airbyte write --config /secrets/config.json --catalog /sample_files/configured_catalog.json

Now it works just fine, and the data shows up in Postgres.
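For anyone wondering why the -i flag makes the difference: without it, docker run starts the container with stdin closed, so the piped data is never delivered and the destination sees an empty stream. The same effect is easy to reproduce with any subprocess; here is a sketch using plain cat as a stand-in for the destination connector:

```python
import subprocess

data = b'{"type": "RECORD"}\n'

# stdin connected (the `docker run -i` case): cat receives the piped data.
with_stdin = subprocess.run(["cat"], input=data, capture_output=True)

# stdin detached (no -i): cat has nothing to read and produces no output.
without_stdin = subprocess.run(["cat"], stdin=subprocess.DEVNULL,
                               capture_output=True)

print(with_stdin.stdout)     # the record comes through
print(without_stdin.stdout)  # empty: the pipe was never attached
```

That is exactly what the destination connector experienced: it started, created the table, found its stdin empty, and exited without writing a row.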