Updating the source schema spec via the API
NOTE: changing the Replication frequency in the UI also seems to overwrite a schema that was set manually via the API.
What’s worse is that a reset will actually create the final table with the original schema. So even if you create a new schema via the API, the final tables in Snowflake are incorrect, and normalization fails going from the SCD table to the final table.
Daniel, sorry to hear that. As far as I remember it is not possible to change the schema using the API (sorry, I don’t remember the reason). I asked the team whether that has changed in the latest version.
Is that documented somewhere? Why does that endpoint exist then?
You can update the connection via the API, but the schema is the one part that won’t persist.
That seems silly/misleading. Any plans to change that? I could’ve sworn users have been doing this for a while
How does one update the schema, then? And if that’s not possible, this issue becomes more pressing.
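For context, this is roughly what updating a connection (including its `syncCatalog`) through the Config API looks like. This is a minimal sketch: the host, connection ID, and stream contents are placeholders, and per the thread above, any schema edited this way does not persist.

```python
import json
import urllib.request

# Assumed local Airbyte instance and a placeholder connection ID.
AIRBYTE_URL = "http://localhost:8000/api/v1/connections/update"

payload = {
    "connectionId": "00000000-0000-0000-0000-000000000000",  # placeholder
    "syncCatalog": {
        "streams": [
            {
                "stream": {
                    "name": "users",
                    # Manually edited JSON schema -- per this thread, the
                    # edit is accepted but does not survive a refresh/reset.
                    "jsonSchema": {
                        "type": "object",
                        "properties": {"is_active": {"type": "boolean"}},
                    },
                },
                "config": {
                    "syncMode": "full_refresh",
                    "destinationSyncMode": "overwrite",
                    "selected": True,
                },
            }
        ]
    },
}

req = urllib.request.Request(
    AIRBYTE_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send against a live instance
print(json.dumps(payload, indent=2))
```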
Is this your first time deploying Airbyte?: No
OS Version / Instance: Ubuntu
Memory / Disk: 48 GB
Deployment: Docker
Airbyte Version: 0.40.1
Source name/version: MySQL 1.0.1
Destination name/version: Snowflake 0.38
Step: Normalization Failure
100038 (22018): Numeric value 'true' is not recognized,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@33b335ee[additionalProperties={attemptNumber=2, jobId…
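That error is the type mismatch described above: the final table keeps the original numeric column, while the data now contains booleans, so the cast in normalization fails. A rough Python analogy of the failing cast (the real cast happens in Snowflake via dbt-generated SQL):

```python
# The final table was created with the original (numeric) schema, but the
# incoming value is a boolean-looking string, so the numeric cast fails --
# analogous to Snowflake's "Numeric value 'true' is not recognized".
value = "true"
try:
    int(value)
except ValueError as exc:
    print(f"cast failed: {exc}")
```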
Daniel, I’m waiting for the team’s answer to this issue.
Confirming: today it is only possible to update/use the schema through the discover schema operation.
Hi @marcosmarxm, could you explain the steps needed to update the schema using the discover schema API? Many thanks.
Running the discover schema endpoint will run the discover step in the connector. You can’t overwrite the values generated by this process.
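A sketch of calling that endpoint follows. The host and source ID are placeholders, and the `disable_cache` flag is my assumption from the Config API spec of that era (it forces a fresh discover run instead of returning a cached catalog):

```python
import json
import urllib.request

# Assumed local Airbyte instance and a placeholder source ID.
AIRBYTE_URL = "http://localhost:8000/api/v1/sources/discover_schema"

payload = {
    "sourceId": "00000000-0000-0000-0000-000000000000",  # placeholder
    "disable_cache": True,  # assumed flag: skip the cached catalog
}

req = urllib.request.Request(
    AIRBYTE_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# resp = urllib.request.urlopen(req)            # uncomment against a live instance
# catalog = json.loads(resp.read())["catalog"]  # catalog as generated by the connector
print(json.dumps(payload))
```

Note that, as stated above, the response reflects whatever the connector discovers; you can’t use this endpoint to overwrite the generated schema.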