New to Airbyte. I'm testing it out as a solution for gathering inventory data across online clothing retailers.
I am following the tutorial found here for automating data scraping: https://airbyte.com/tutorials/data-scraping-with-airflow-and-beautiful-soup
I had no issues until adding the custom DAG to Airflow, specifically Step 4. I successfully set up the airbyte_linkedin_connection connection, then added the Python code to the dags/airbyte_airflow_dag.py file under the airflow directory. One note: the provided code doesn't actually run as-is after replacing the variables, so it needed a little Python reformatting. Once I got the script working, I saved the file and restarted the webserver and scheduler as instructed. However, when I went to my DAGs in the Airflow UI, I couldn't find the one I set up. I can't for the life of me figure out why the webserver isn't picking up the new DAG. Any help would be greatly appreciated!
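In case it helps diagnose: since I had to hand-edit the tutorial code, I wanted to rule out a leftover syntax error keeping the file out of Airflow's DagBag (a file that fails to parse silently never shows up in the UI). Here's the quick stdlib check I used; the helper name is just mine, and the commented-out path is the one from the tutorial:

```python
import ast


def first_syntax_error(source: str):
    """Return (lineno, msg) for the first syntax error in a DAG file, or None."""
    try:
        ast.parse(source)
        return None
    except SyntaxError as e:
        return (e.lineno, e.msg)


# Example: the kind of formatting slip that keeps a DAG out of the UI
# (an unclosed parenthesis after pasting/reformatting the tutorial code).
broken = "dag = DAG(\n    'airbyte_example'\n"
print(first_syntax_error(broken))  # reports the offending line and message

# Against the actual file from the tutorial:
# with open("dags/airbyte_airflow_dag.py") as f:
#     print(first_syntax_error(f.read()))
```

My file passes this check, so I don't think it's a plain syntax error. (If you're on Airflow 2.x, I believe `airflow dags list-import-errors` reports the same thing from the scheduler's side, including import errors that `ast.parse` can't catch.)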