Querying DWH (Snowflake) as part of source connector for awkward API with expiring IDs


The user is looking for a way to query a Data Warehouse (Snowflake) as part of the source connector to apply logic for an awkward API with expiring IDs. They are considering using SQLAlchemy but are unsure how to access the credentials for the new source.


Hello, I am working on creating a connector for an awkward API which requires a per-customer ID that expires and is aggressively rate limited. As such, I don't believe I can use the Parent Stream option, since the expiration date would not be available. Is there a way to query a DWH (Snowflake) as part of the source connector to apply logic to retrieve the list? I'm guessing I could just use SQLAlchemy, but how can I access the credentials, as it's not the existing source?

This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here; the original thread is available on Slack.


["querying-dwh", "snowflake", "source-connector", "awkward-api", "expiring-ids", "sqlalchemy", "credentials"]

Hmm, I don't know enough to give you a confident answer yet. However, based on the information available, it seems you are trying to create a custom connector that interacts with both an API and a data warehouse (Snowflake).

Airbyte's Python CDK documentation explains how to create your own data connector, but it doesn't provide specific details about querying a data warehouse as part of a source connector. For accessing credentials, the recommendation is not to hardcode things like API keys or passwords; these should be passed in as user input when configuring a Source based on your connector, by declaring them in the connector's spec.

You might find it helpful to look at the example of a database connector built in Python shared by a support agent on the Airbyte forum. For more specific guidance, I would recommend checking the Airbyte documentation or asking for help on the Airbyte forum.
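One way this could look in practice: since the connector's `config` dict contains whatever fields you declare in its spec, you can simply add Snowflake credentials to the spec and use them to build a SQLAlchemy connection inside the source. The sketch below is a hypothetical illustration, not an official Airbyte pattern; the config keys (`user`, `account`, etc.) and the `api_tokens` table and column names are assumptions you would replace with your own, and `fetch_customer_ids` assumes the `snowflake-sqlalchemy` dialect is installed.

```python
# Hypothetical sketch: config keys and the api_tokens table/columns are
# assumptions for illustration, not part of Airbyte or any real schema.
from urllib.parse import quote_plus


def snowflake_url(config: dict) -> str:
    """Build a snowflake-sqlalchemy URL from the connector's user-supplied
    config, so credentials are configured per-source rather than hardcoded."""
    return (
        "snowflake://{user}:{password}@{account}/{database}/{schema}"
        "?warehouse={warehouse}"
    ).format(
        user=quote_plus(config["user"]),
        password=quote_plus(config["password"]),
        account=config["account"],
        database=config["database"],
        schema=config["schema"],
        warehouse=config["warehouse"],
    )


def fetch_customer_ids(config: dict) -> list:
    """Query the DWH for IDs that have not yet expired (hypothetical table),
    e.g. to drive stream slices instead of a Parent Stream."""
    from sqlalchemy import create_engine, text  # requires snowflake-sqlalchemy

    engine = create_engine(snowflake_url(config))
    with engine.connect() as conn:
        rows = conn.execute(
            text(
                "SELECT customer_id FROM api_tokens "
                "WHERE expires_at > CURRENT_TIMESTAMP()"
            )
        )
        return [row[0] for row in rows]
```

In a CDK stream you could then call something like `fetch_customer_ids(config)` from `stream_slices()` to enumerate the customers to pull, applying your own rate-limiting logic around the API calls.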