Snowflake source connector cannot connect with a lowercase role parameter?

  • Is this your first time deploying Airbyte?: Yes (prototyping)
  • OS Version / Instance: Windows 21H2
  • Memory / Disk: n/a
  • Deployment: Docker
  • Airbyte Version: 0.40.2
  • Source name/version: Snowflake 0.1.21
  • Destination name/version: Snowflake 0.4.36
  • Step: Refreshing source schema, running the sync

Hello! I’m prototyping Airbyte for our organization. We have access to a vendor’s Snowflake reader account that delivers data through views, and I’m looking to move that data into our organization’s Snowflake instance. I’ve discovered that the vendor created the role, data_syncer, in lowercase, and it appears to me that the source connector forces all connection parameters to uppercase. The following error is returned when I test the connection:

The connection tests failed.
State code: 08001; Error code: 390189; Message: Role 'DATA_SYNCER' specified in the connect string does not exist or not authorized. Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC.
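
For reference, I can reproduce the same behavior outside Airbyte with a bare JDBC connection. The sketch below is my own minimal test (class name and placeholder credentials are mine), assuming the Snowflake JDBC driver is on the classpath; since the role is passed as an unquoted identifier, Snowflake resolves it case-insensitively as DATA_SYNCER, which matches the error above.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.util.Properties;

public class RoleCaseRepro {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "<USER>");
        props.put("password", "<PASSWORD>");
        props.put("warehouse", "<WAREHOUSE>");
        props.put("db", "<DATABASE>");
        props.put("schema", "<SCHEMA>");
        // Unquoted identifier: Snowflake looks this up case-insensitively,
        // i.e. as DATA_SYNCER -- the same role name the Airbyte error reports
        // as missing.
        props.put("role", "data_syncer");

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:snowflake://<ACCOUNT_NAME>.snowflakecomputing.com/", props);
             ResultSet rs = conn.createStatement().executeQuery("SELECT CURRENT_ROLE()")) {
            rs.next();
            System.out.println("Connected with role: " + rs.getString(1));
        }
        // Fails with: Role 'DATA_SYNCER' specified in the connect string
        // does not exist or not authorized.
    }
}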

When I enclose the role value in double quotes to try to force the case, the quotes are passed literally into the JDBC connection string (other parameters redacted):

The connection tests failed.
Could not connect with provided configuration. Error: Failed to get driver instance for jdbcUrl=jdbc:snowflake://<ACCOUNT_NAME>.snowflakecomputing.com/?role="data_syncer"&warehouse=<WAREHOUSE>&database=<DATABASE>&schema=<SCHEMA>&JDBC_QUERY_RESULT_FORMAT=JSON&CLIENT_SESSION_KEEP_ALIVE=true
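
In case it helps anyone testing this outside Airbyte, here is a rough sketch of the two things I would try next. Both are my own guesses rather than anything the connector supports today (class name and placeholders are mine), and I haven’t confirmed that Snowflake actually matches the lowercase role either way.

import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class QuotedRoleSketch {
    private static Properties baseProps() {
        Properties props = new Properties();
        props.put("user", "<USER>");
        props.put("password", "<PASSWORD>");
        props.put("warehouse", "<WAREHOUSE>");
        props.put("db", "<DATABASE>");
        props.put("schema", "<SCHEMA>");
        return props;
    }

    public static void main(String[] args) throws Exception {
        String baseUrl = "jdbc:snowflake://<ACCOUNT_NAME>.snowflakecomputing.com/";

        // Option 1: keep the quotes out of the URL entirely and hand the role
        // to the driver as a connection property, double quotes included.
        Properties props = baseProps();
        props.put("role", "\"data_syncer\"");
        try (Connection conn = DriverManager.getConnection(baseUrl, props)) {
            System.out.println("Connected with role passed as a property");
        }

        // Option 2: if the role has to live in the URL query string, URL-encode
        // the quoted value (%22data_syncer%22) so the raw double quotes don't
        // break URL parsing the way they do in the error above.
        String encodedRole = URLEncoder.encode("\"data_syncer\"", StandardCharsets.UTF_8);
        try (Connection conn = DriverManager.getConnection(
                baseUrl + "?role=" + encodedRole, baseProps())) {
            System.out.println("Connected with role URL-encoded in the query string");
        }
    }
}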

Has anyone encountered this and found a workaround, or should I be submitting an issue on GitHub? I assume this could also be an issue with the destination connector, but all of the connection parameters I have there are uppercase values.

Hello there! You are receiving this message because none of your fellow community members has stepped in to respond to your topic post. (If you are a community member and you are reading this response, feel free to jump in if you have the answer!) As a result, the Community Assistance Team has been made aware of this topic and will be investigating and responding as quickly as possible.
Some important considerations that will help you get your issue solved faster:

  • It is best to use our topic creation template; if you haven’t yet, we recommend posting a follow-up with the requested information. With that information the team can more quickly search for similar connector and platform issues and troubleshoot your specific question or problem.
  • Make sure to upload the complete log file; a common investigation roadblock is that the underlying error often occurs well before the problem is surfaced to the user, so the tail of the log is much less useful than the whole log to scan through.
  • Be as descriptive and specific as possible; when investigating, it is extremely valuable to know what steps were taken to encounter the issue, which versions of the connector, platform, Java, Python, Docker, or Kubernetes were used, etc. The more context supplied, the quicker the investigation can start on your topic and the faster we can drive towards an answer.
  • We in the Community Assistance Team are glad you’ve made yourself part of our community, and we’ll do our best to answer your questions and resolve your problem as quickly as possible. Expect to hear from a specific team member soon.

Thank you for your time and attention.
Best,
The Community Assistance Team

Hey, thanks for reporting this. I’ve created a GitHub issue for the investigation: https://github.com/airbytehq/airbyte/issues/18146

Can you try using &quot; instead of the double quote in your config to see if that translates properly? i.e. &quot;data_syncer&quot;