Summary
After deploying Airbyte on Kubernetes via the Helm chart and configuring output to an Azure storage account container, the user is missing logs and state. They are looking for help or troubleshooting documentation.
Question
Hello, I have recently deployed Airbyte Community version 1.0.0 on Kubernetes via the Helm chart. After tinkering with some template values, I was able to get the deployment working and storing some output to an Azure storage account container, but for some reason I am still missing logs and state(?). There is workload/output, so I can confirm that Airbyte is in fact able to save things to the Azure storage container.
I also checked the logs in all the pods and was not able to find any errors that would indicate something is wrong with the config or access. Would anyone be able to help me out or point me to some troubleshooting documentation?
This topic has been created from a Slack thread to give it more visibility. It will be in read-only mode here.
["airbyte", "kubernetes", "helm-chart", "azure-storage", "logs", "state", "troubleshooting"]
What do the global.storage.* properties look like in your values.yaml?
```yaml
type: azure
storageSecretName: "airbyte-storage-config"
# Azure
bucket: ## Azure Blob Storage container names that you've created. We recommend storing the following all in one bucket.
  log: airbyte-bucket
  state: airbyte-bucket
  workloadOutput: airbyte-bucket
azure:
  connectionStringSecretKey: connectionString
```
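(For context, the Secret referenced by storageSecretName above would typically look something like the sketch below. The secret name airbyte-storage-config and the connectionString key are taken from the values shown; the namespace and the connection string value are placeholders, not from the original thread.)

```yaml
# Hypothetical sketch of the Kubernetes Secret referenced by the values above.
# metadata.name must match storageSecretName, and the key under stringData
# must match azure.connectionStringSecretKey; the connection string is a placeholder.
apiVersion: v1
kind: Secret
metadata:
  name: airbyte-storage-config
  namespace: airbyte   # assumed namespace; adjust to your deployment
type: Opaque
stringData:
  connectionString: "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
```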
That did not compile correctly, though, so I had to manually edit the compiled templates.
I have updated the Helm repository; now when I run
helm template airbyte airbyte/airbyte -f ./values.yaml > template.yaml
I get
Error: YAML parse error on airbyte/charts/server/templates/deployment.yaml: error converting YAML to JSON: yaml: line 250: could not find expected ':'
Okay, looking into the templates again, it seems like storageSecretName breaks the template and needs to be replaced with secretName.
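In other words, the storage block from above would become something like this (a sketch of the same values; only the secret-name key changes):

```yaml
# Same storage settings as before, with the key renamed from
# storageSecretName to secretName as discussed above.
type: azure
secretName: "airbyte-storage-config"
bucket:
  log: airbyte-bucket
  state: airbyte-bucket
  workloadOutput: airbyte-bucket
azure:
  connectionStringSecretKey: connectionString
```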
Yep, you've got it... that should fix the error you were seeing. It shouldn't change the other outcomes, but I would check whether there was a bug here that got fixed, and whether the logs/state start showing up as a sync runs.