Summary
To monitor worker activity in Airbyte, you can check the logs and status of the job. The user is experiencing a long-running job for a simple one-day Shopify orders sync.
Question
How can I see what is happening inside the worker? The job has been running for 5 hours now for a simple 1-day Shopify orders sync…
This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here.
["monitoring-worker-activity", "logs", "job-status", "long-running-job", "shopify"]
One thing that comes to mind is the special E2E test destination:
https://docs.airbyte.com/integrations/destinations/e2e-test
Another option is more related to connector development:
it is possible to run the Docker container locally and build your own Shopify connector version with extra logging.
https://docs.airbyte.com/connector-development/tutorials/custom-python-connector/environment-setup
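If you go down that path, here is a rough sketch of running the connector image directly so every log line (slices, requests, backoffs) prints straight to your terminal. The image tag, mount path, and the config.json / configured_catalog.json files are assumptions rather than anything from this thread; swap in your custom-built tag once you have added the extra logging:
```
# Pull the published Shopify source image (replace with your custom-built tag later)
docker pull airbyte/source-shopify:latest

# Run a single read locally; connector logs stream to stdout.
# Assumes config.json and configured_catalog.json exist in the current directory.
docker run --rm \
  -v "$(pwd)":/data \
  airbyte/source-shopify:latest read \
  --config /data/config.json \
  --catalog /data/configured_catalog.json
```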
Yeah, but before I could see in the logs which slice was being processed, etc. Now all I can see is that a workload is claimed and that's it…
Before what? Before an Airbyte upgrade? Before a connector upgrade?
Do you not see anything in the logs for the currently running job when viewed in the UI? (note, these can take a while to load).
Often these are noisy, but using the filters to look at, say, just the Source or Orchestrator might help you better see progress (or lack thereof). Sometimes you'll end up seeing that it's in a backoff cycle due to API rate limits, so it's "idle" but hasn't failed.
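If the UI really shows nothing, another thing to try is tailing the job pods directly. This is only a sketch for an abctl (kind-based) install and assumes you have kind and kubectl available locally; the cluster and namespace names below are assumptions, so check what kubectl get pods actually returns:
```
# abctl runs Airbyte inside a kind cluster; export its kubeconfig first
kind get kubeconfig --name airbyte-abctl > /tmp/abctl-kubeconfig
export KUBECONFIG=/tmp/abctl-kubeconfig

# Find the pods spawned for the running sync (orchestrator / source / destination)
kubectl get pods -n airbyte-abctl

# Follow a job pod and filter for progress hints like slices or rate-limit backoffs
kubectl logs -n airbyte-abctl -f <job-pod-name> --all-containers \
  | grep -iE 'slice|backoff|rate limit|records'
```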
<@U035912NS77> unfortunately no, just that the workload is claimed.
The same happens with a completely new deployment. I'm losing hope…
This is what I ran on the EC2 server, as per the instructions:
```
sudo usermod -a -G docker ec2-user
sudo systemctl start docker
sudo systemctl enable docker
exit
ssh -i mdata.pem ec2-user@my-ip
curl -LsfS https://get.airbyte.com | bash -
abctl local install
```
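After the install finishes, it can be worth confirming the deployment itself is healthy before digging into job logs. A minimal check (assuming the install created a kind container and that abctl is on your PATH):
```
# The kind container abctl created should be running
docker ps

# Print the login credentials for the local Airbyte UI
abctl local credentials
```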