wait, let me make sure this has the ability to write to s3 for the external logs, and not just read hah
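Side note: external log storage needs both read and write on the bucket. A minimal sketch of what the IAM policy document might look like (bucket name and the exact action list are illustrative, not taken from this thread):

```python
import json

def log_bucket_policy(bucket: str) -> dict:
    """Build a minimal read+write IAM policy document for an S3 log bucket.

    Hypothetical helper: ListBucket on the bucket itself, object-level
    read/write on its contents. Your deployment may need more or fewer actions.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                "Effect": "Allow",
                # PutObject is the write half that read-only roles are missing.
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }

print(json.dumps(log_bucket_policy("my-airbyte-logs"), indent=2))
```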
you want to check the values that are set in the airbyte-worker
Was that error you pasted from the airbyte-server or worker?
I think it’s because i’m putting prefixes in the values for things like S3_LOG_BUCKET
yea, that sounds like it might be it
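To illustrate the suspected misconfiguration: a value like `my-bucket/logs/airbyte` in `S3_LOG_BUCKET` mixes the bucket name with a key prefix, and the prefix part belongs in a separate setting. A quick sketch (function and example values are made up for illustration):

```python
def split_bucket_value(value: str) -> tuple:
    """Split a combined 'bucket/prefix' string into (bucket, prefix).

    S3_LOG_BUCKET should end up holding only the first element; the
    rest, if any, is a key prefix and goes in its own setting.
    """
    bucket, _, prefix = value.partition("/")
    return bucket, prefix

print(split_bucket_value("my-bucket/logs/airbyte"))  # ('my-bucket', 'logs/airbyte')
print(split_bucket_value("my-bucket"))               # ('my-bucket', '')
```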
ok I think we’re super close. We’re now at a runtime failure for setting up a source. Here’s the error:
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 81, in _check_list_files
    file = next(iter(stream.get_files()))
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 62, in check_availability_and_parsability
    file = self._check_list_files(stream)
  File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/file_based/availability_strategy/default_file_based_availability_strategy.py", line 83, in _check_list_files
    raise CheckAvailabilityError(FileBasedSourceError.EMPTY_STREAM, stream=stream.name)
airbyte_cdk.sources.file_based.exceptions.CheckAvailabilityError: No files were identified in the stream. This may be because there are no files in the specified container, or because your glob patterns did not match any files. Please verify that your source contains files last modified after the start_date and that your glob patterns are not overly strict. Contact Support if you need assistance.
stream=cx partnerhero agent resources
```
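The `CheckAvailabilityError` fires when the source lists files and the glob filter matches none of them. A rough illustration of the failure mode using stdlib `fnmatch` as a stand-in for the connector's glob matching (file names and patterns below are invented, not from this deployment):

```python
from fnmatch import fnmatch

# Hypothetical object keys as the source might list them.
files = ["exports/2024/agents.csv", "exports/2024/resources.csv"]

strict = "agents/*.csv"    # overly strict: nothing lives at the top-level agents/ path
loose = "exports/*/*.csv"  # matches both files

print([f for f in files if fnmatch(f, strict)])  # [] -> triggers the empty-stream error
print([f for f in files if fnmatch(f, loose)])   # both files -> stream check passes
```

If the list comes back empty, loosening the glob (or checking the `start_date` filter mentioned in the error) is the usual fix.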
Hey Hassan, could you share which fields under global.database (in your helm values) you are specifying (keys only)?
and I’m receiving that when I try to set up a connection to Google Drive
nope, that’s not it, it has full s3 access
does it contain `/`?
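If that question is about the glob pattern: a leading `/` will never match S3-style object keys, since keys carry no leading slash. An illustrative check (key and patterns are made up):

```python
from fnmatch import fnmatch

key = "exports/2024/agents.csv"  # S3 keys do not start with '/'

print(fnmatch(key, "/exports/**"))  # no match: pattern expects a leading slash
print(fnmatch(key, "exports/**"))   # matches
```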