Summary
The user is facing an issue with Airbyte on Kubernetes where the disk storing logs is full. They are looking for tips on how to locate the log storage in Kubernetes and manually delete logs to free up space.
Question
Hello, I’m running Airbyte on Kubernetes via Google Cloud Platform. My DevOps guy is on vacation and it stopped working. I’m pretty sure it’s because the disk storing logs is full, as that was the cause last time. Any tips or resources on how I can locate this in Kubernetes and do a manual delete of the disk? I’m comfortable using the CLI.
This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here. Click here if you want to access the original thread.
["troubleshooting", "full-disk-issue", "airbyte", "kubernetes", "logs", "disk", "manual-delete", "cli"]
I also found this error in the logs: “Cannot publish to S3: Storage backend has reached its minimum free drive threshold. Please delete a few objects to proceed.”
You can run kubectl exec -it airbyte-minio-0 -n airbyte-abctl --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig -- /bin/bash
That should land you in the MinIO container. The logs live under the /storage directory, and you can remove log files from the subdirectories there. Be careful not to delete the state-storage directory.
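Before deleting anything, it's worth confirming where the space is actually going. A sketch, using the pod and namespace names from this thread (adjust them to your install):

```shell
# Check how full the volume is, without opening an interactive shell:
kubectl exec airbyte-minio-0 -n airbyte-abctl \
  --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig -- df -h /storage

# List the top-level directories under /storage, biggest first,
# so you can see which one is eating the disk:
kubectl exec airbyte-minio-0 -n airbyte-abctl \
  --kubeconfig ~/.airbyte/abctl/abctl.kubeconfig -- \
  sh -c 'du -sh /storage/* | sort -rh'
```

The `du -sh … | sort -rh` pipeline sorts by human-readable size, so the directory to clean up should be the first line of output.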
<@U07C8CCC68Y> thanks! I was just looking through the /storage/airbyte-dev-logs/job-logging/workspace directory. It seems like the logs are stored in hundreds of subdirectories. Is it safe to delete all of them?
I assume you were referring to not deleting /storage/airbyte-dev-logs/state
and /storage/state-storage
Great, thanks! job-logging/workspace has 20 GB in it, which is our entire disk size… so yeah, that seems to be the issue.
haha yeah that explains it! If you find yourself filling the MinIO volume regularly, you might want to consider externalizing your storage. The docs here: https://docs.airbyte.com/deploying-airbyte/integrations/storage show how to use either S3 or GCS.
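For reference, externalizing storage is configured through the Helm chart's values file. The sketch below shows the rough shape for GCS; the exact keys vary between chart versions and the bucket name, secret name, and project ID are placeholders, so check the linked docs for your version before applying it:

```yaml
# values.yaml fragment (illustrative; names are placeholders)
global:
  storage:
    type: gcs
    storageSecretName: airbyte-config-secrets   # secret holding the GCS credentials
    bucket:
      log: my-airbyte-bucket
      state: my-airbyte-bucket
      workloadOutput: my-airbyte-bucket
    gcs:
      projectId: my-gcp-project
```

Once logs and state land in a bucket instead of the MinIO volume, the local disk can no longer fill up this way.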
Thanks! Good to know. You saved me today haha!