Running Airbyte in E-commerce Analytics Stack with dbt, Airflow, and BigQuery

Summary

Clarification on running Airbyte and obtaining the workspace ID in the E-commerce Analytics Stack setup with dbt, Airflow, and BigQuery


Question

Hey y’all, I’m looking through the docs here: [E-commerce Analytics Stack with Airbyte, dbt, Airflow (ADA) and BigQuery](https://github.com/airbytehq/quickstarts/tree/main/airbyte_dbt_airflow_bigquery), and it’s not entirely clear to me when/where you are supposed to run Airbyte and get the workspace ID.
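
For context, the local Airbyte deployment linked from the quickstart runs as its own Docker stack, separately from the Airflow containers, and the workspace ID can be read from the Airbyte UI URL (http://localhost:8000/workspaces/<workspace-id>/...). Below is a minimal sketch, not taken from the quickstart itself, of listing workspaces through the Airbyte API once it is up; the base URL and the airbyte / password basic-auth defaults are assumptions about a stock local deployment:

```python
# Sketch: list workspaces from a locally running Airbyte instance.
# Assumes the stock local deployment at http://localhost:8000 with the
# default basic-auth credentials (airbyte / password); adjust for your setup.
import requests

resp = requests.post(
    "http://localhost:8000/api/v1/workspaces/list",
    json={},
    auth=("airbyte", "password"),
    timeout=30,
)
resp.raise_for_status()

for ws in resp.json()["workspaces"]:
    # workspaceId is the value the quickstart configuration asks for
    print(ws["workspaceId"], ws.get("name"))
```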



This topic has been created from a Slack thread to give it more visibility. It will be in Read-Only mode here.

["airbyte", "e-commerce-analytics-stack", "dbt", "airflow", "bigquery", "workspace-id"]

Also this looks incomplete: https://github.com/airbytehq/quickstarts/blob/main/airbyte_dbt_airflow_bigquery/orchestration/README.md

> We’ve downloaded the official docker-compose.yaml file provided by Airflow and adapted it to:
> • Use some configurations from an .env file
> • Add the Airbyte operator, dbt and astronomer-cosmos packages
> • Mount our dbt project folder into the container image
> • For running locally, we’ve set up the network to use the one deployed by the Airbyte container setup (from [Airbyte Local Deployment](https://docs.airbyte.com/deploying-airbyte/local-deployment))
> • Admitting you’re
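
Regarding the bullet above about adding the Airbyte operator package: that refers to the apache-airflow-providers-airbyte provider, whose AirbyteTriggerSyncOperator is the usual way a DAG triggers a sync. A minimal sketch, not the quickstart’s actual DAG, where the Airflow connection name and the Airbyte connection ID are placeholders:

```python
# Sketch: trigger an Airbyte sync from Airflow using the Airbyte provider
# (apache-airflow-providers-airbyte). Placeholder values are marked below.
from datetime import datetime

from airflow import DAG
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="trigger_airbyte_sync_example",
    start_date=datetime(2023, 1, 1),
    schedule=None,   # Airflow 2.4+ syntax; trigger manually while testing
    catchup=False,
) as dag:
    AirbyteTriggerSyncOperator(
        task_id="trigger_airbyte_sync",
        # An Airflow connection pointing at the Airbyte API, reachable because
        # both stacks share the same Docker network (see the bullet above).
        airbyte_conn_id="airbyte_default",
        # The Airbyte connection (source -> destination) to sync; copy its ID
        # from the Airbyte UI or the workspaces/connections API.
        connection_id="REPLACE_WITH_AIRBYTE_CONNECTION_ID",
        asynchronous=False,  # block until the sync finishes
        timeout=3600,
    )
```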