- Is this your first time deploying Airbyte?: No
- OS Version / Instance: macOS
- Memory / Disk: not applicable (Airbyte Cloud)
- Deployment: Airbyte Cloud
- Airbyte Version: not specified (Airbyte Cloud)
- Source name/version: Google Ads
- Destination name/version: S3
- Step: sync
- Description:
Hi support, I'm trying to set up an S3 source and a connection through the Airbyte API. The source and destination are created fine, but I ran into an issue when the connection syncs data for the first time. Below is the error from the log:
File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 61, in check_connection
stream = self.stream_class(**config)
TypeError: __init__() got an unexpected keyword argument 'updated_at'
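From the traceback, it looks like check_connection unpacks the whole connectionConfiguration into the stream class constructor, so any key the constructor does not declare fails. A minimal, purely illustrative sketch of that behaviour (StreamStandIn and its parameters are made-up stand-ins, not the actual connector code):

# Illustrative only: why an extra key in **config fails.
class StreamStandIn:
    def __init__(self, dataset, provider, format, path_pattern):
        self.dataset = dataset
        self.provider = provider
        self.format = format
        self.path_pattern = path_pattern

config = {
    "dataset": "dataset_name",
    "provider": {"bucket": "bucket_name"},
    "format": {"filetype": "csv"},
    "path_pattern": "**.csv",
    "updated_at": "2023-01-28 00:30:53.544",  # extra key, not a constructor parameter
}

# Same pattern as source.py line 61: stream = self.stream_class(**config)
stream = StreamStandIn(**config)
# -> TypeError: __init__() got an unexpected keyword argument 'updated_at'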
This is what the source looks like:
{
  "sourceDefinitionId": "source_Definition_id",
  "sourceId": "source_id",
  "workspaceId": "workspace_id",
  "connectionConfiguration": {
    "format": {
      "encoding": "utf8",
      "filetype": "csv",
      "delimiter": ",",
      "block_size": 10000,
      "quote_char": "\"",
      "double_quote": true,
      "infer_datatypes": true,
      "newlines_in_values": false
    },
    "dataset": "dataset_name",
    "provider": {
      "bucket": "bucket_name",
      "aws_access_key_id": "**********",
      "aws_secret_access_key": "**********"
    },
    "path_pattern": "**.csv",
    "updated_at": "2023-01-28 00:30:53.544"
  },
  "name": "123456:s3_",
  "sourceName": "S3"
}
I understand this updated_at parameter is not part of the standard S3 source connection configuration. However, my system has been creating Google Ads and Facebook sources with the exact same updated_at parameter, which is also not part of those connectors' standard configurations, and those syncs have never had an issue with it. What is special about the S3 source that prevents me from adding additional parameters? Can I add extra parameters to S3 sources beyond the standard ones, and if so, can you show me where I should put them? TIA!
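P.S. For context, this is the kind of workaround I'm considering for now (my own sketch, not something from the Airbyte docs): keep bookkeeping keys like updated_at in my own system and strip them from connectionConfiguration before calling the create-source endpoint.

EXTRA_KEYS = {"updated_at"}  # keys my system adds that are not in the S3 source spec

def split_config(connection_configuration):
    """Split a config into (keys to send to Airbyte, keys to keep locally)."""
    clean = {k: v for k, v in connection_configuration.items() if k not in EXTRA_KEYS}
    extra = {k: v for k, v in connection_configuration.items() if k in EXTRA_KEYS}
    return clean, extra

config_with_extras = {
    "dataset": "dataset_name",
    "provider": {"bucket": "bucket_name"},
    "format": {"filetype": "csv", "delimiter": ","},
    "path_pattern": "**.csv",
    "updated_at": "2023-01-28 00:30:53.544",
}

clean_config, local_metadata = split_config(config_with_extras)
# clean_config goes into connectionConfiguration in the create-source request;
# local_metadata stays in my own database.

That works, but I'd prefer a supported way to attach custom parameters if one exists.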