Deploying on Kubernetes: Configuring custom S3 logs

  • Is this your first time deploying Airbyte?: Yes
  • OS Version / Instance: MacOS
  • Deployment: Kubernetes
  • Step: Just setting up Airbyte
  • Description: I have been following these directions for configuring logs using an S3 bucket: On Kubernetes (Beta) | Airbyte Documentation
    However, my S3 bucket doesn’t seem to be populating with logs.

Hey @soccerbro421, thanks for your post and welcome to the community! Could you describe what your cluster setup looks like? Is it local, or did you use something like EKS?

Secondly, are the S3 credentials configured for both read and write access, as mentioned in the guide? Finally, were you able to successfully launch a local Airbyte instance?

Hey @sajarin , Thanks for the response!

I’m trying to use EKS to deploy, and more specifically I’m following the EKS Blueprints:

https://aws-quickstart.github.io/cdk-eks-blueprints/

Yes, I created and tested an IAM user and made sure I was able to update the S3 bucket as that user.
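For anyone else verifying this step, here’s a rough way to confirm the IAM user really has both read and write access to the log bucket with the AWS CLI. The bucket name `my-airbyte-logs` is a placeholder; substitute your own:

```shell
# Write a test object as the IAM user (checks write access)
echo "airbyte log test" > /tmp/airbyte-log-test.txt
aws s3 cp /tmp/airbyte-log-test.txt s3://my-airbyte-logs/connectivity-test/

# List it back (checks read/list access)
aws s3 ls s3://my-airbyte-logs/connectivity-test/

# Clean up the test object
aws s3 rm s3://my-airbyte-logs/connectivity-test/airbyte-log-test.txt
```

If any of these commands fails with AccessDenied, the bucket policy or IAM policy is missing a permission Airbyte needs.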

Yes, I am able to successfully launch a local Airbyte instance.

Another interesting note: when I customize my /kube/overlays/stable/.env file to set my S3 bucket for logs, I can no longer access my data source (which is also an S3 bucket).

But when I leave it at the default settings, I am able to access that S3 bucket (the data source) just fine.

Should S3_LOG_BUCKET be just the bucket name, or something like s3://<bucket_name>, or the ARN?

So if I understand you correctly, you have an S3 bucket set up for logs, and you also have another S3 bucket with some data that you want to move somewhere else? Did I get that right?
My guess is that something is not configured properly with the S3 bucket you set up for your logs. As a result, Airbyte ends up writing the logs to the local minio instance instead of your S3 bucket. Can you check whether this is what is actually happening by accessing the logs in minio?
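One way to check this is to port-forward the bundled minio service and browse it with the AWS CLI. This is a sketch assuming the default kustomize deploy; the service name (`airbyte-minio-svc`), default minio credentials (`minio` / `minio123`), and default log bucket (`airbyte-dev-logs`) are taken from the stock overlay and may differ in your cluster:

```shell
# Forward minio's API port from the cluster to localhost
kubectl port-forward svc/airbyte-minio-svc 9000:9000 &

# List the contents of the default minio log bucket using the default creds
AWS_ACCESS_KEY_ID=minio AWS_SECRET_ACCESS_KEY=minio123 \
  aws --endpoint-url http://localhost:9000 s3 ls s3://airbyte-dev-logs --recursive
```

If recent job logs show up here, the workers are still writing to minio rather than your external S3 bucket.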

And yes, I believe S3_LOG_BUCKET is just the name of the bucket, not the S3 URL or the ARN.
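For reference, here is a rough sketch of the relevant log settings in /kube/overlays/stable/.env. The variable names follow the documented S3 log configuration at the time; the bucket name, region, and credentials are placeholders:

```shell
# External S3 logging configuration (placeholders - use your own values)
S3_LOG_BUCKET=my-airbyte-logs          # bucket name only, no s3:// prefix or ARN
S3_LOG_BUCKET_REGION=us-east-1
AWS_ACCESS_KEY_ID=AKIA...              # IAM user with read+write on the bucket
AWS_SECRET_ACCESS_KEY=...

# Leave the minio settings empty when using external S3,
# otherwise logs keep going to the bundled minio instance
S3_MINIO_ENDPOINT=
S3_PATH_STYLE_ACCESS=
```

Double-check that the minio-related variables are cleared; if an endpoint is still set, that would explain logs landing in minio instead of S3.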

Yup that’s right.

Did you use kustomize or helm chart deployment?

I used Kustomize deployment

I tried to set up a kustomize deployment using external S3 and faced the same issue. I opened a ticket on GitHub (https://github.com/airbytehq/airbyte/issues/14576) for further investigation. I’ll report back with any updates.

Hi folks, any update on this issue? We’re running into the same problem, where our destination is S3, which is causing this error.

Did you check the latest update in the Kubernetes deployment documentation?

Thanks, Marco. Is there a specific place you’re pointing to in the Kubernetes deployment documentation? I’ve been reviewing that page once a week and haven’t seen any changes regarding “configuring custom S3 log location”.

Unless you’re referring to the known issue that “File sources reading from and file destinations writing to local mounts are not supported on Kubernetes.” I wasn’t sure if that was the same issue as this one.