Connecting a custom Python destination to a QuickBooks source for data sync

Summary

The user wants to connect a custom Python destination to a QuickBooks source for data synchronization. When employee stream data arrives from the source, the destination should make a POST request to the respective endpoint to post that data to the QuickBooks destination account.


Question

I have a custom Python destination for QuickBooks built, and I want to connect it to my QuickBooks source to sync the data. For example, the main function of my destination should work like this: suppose a user syncs employee stream data from a different QuickBooks account; this employee stream data is then passed to our QuickBooks destination, and our destination has to make a POST request to the respective endpoint and post the employee stream data to our QuickBooks destination account.

Is this even possible?



This topic has been created from a Slack thread to give it more visibility.
It will be in read-only mode here.


["custom-python-destination", "quickbooks-source", "data-sync", "post-request", "employee-stream"]

Based on the information provided in the extracts, it seems possible to create a custom Python destination for QuickBooks and connect it to a QuickBooks source to sync data. The Creating a Python Destination (DuckDB) tutorial explains how to create a custom destination using Python, and you can follow similar steps to create your custom QuickBooks destination.

Once you’ve implemented your connector, you can test it using the methods described in the tutorial. For example, you can mock the data you’re receiving from the source connector and Airbyte with a simple JSON file, which serves as an example of data getting passed. Here’s how you might test your connector:

```bash
python main.py spec
python main.py check --config integration_tests/config.json
python main.py discover --config integration_tests/config.json
cat integration_tests/messages.jsonl | python main.py write --config integration_tests/config.json --catalog integration_tests/configured_catalog.json
```

Remember to replace the paths and filenames with those relevant to your QuickBooks destination.

Regarding your specific use case of syncing employee stream data from a different QuickBooks account to your QuickBooks destination account, it seems feasible. You would need to implement the logic for making a POST request to the respective endpoint in your custom QuickBooks destination. However, the specifics of how to implement this are not provided in the extracts; you might need to refer to the QuickBooks API documentation or other resources for details on how to make the POST request.

Finally, once your custom QuickBooks destination is ready, you can connect it to your QuickBooks source using Airbyte. The Add a Destination guide provides steps on how to set up a destination in Airbyte. Please note that this is a high-level overview, and the actual implementation might require additional steps or considerations based on your specific requirements and the QuickBooks API.

Sounds doable. Anything blocking you in particular?

I’m currently stuck on how to make the POST requests for each record of the employee information that is coming from the stream.

<@U04SE163WC8> When I get multiple JSON outputs from my source in my destination, I want to be able to go over each JSON response and make a POST request to send it somewhere else. How do I configure this behaviour in my Python destination? Would I need to modify the write function?

Based on the information provided in the extracts, you would indeed need to modify the write function in your Python destination to achieve this behavior. In the write function, you would iterate over the input_messages, which are the data records coming from your source. Each input_message is an AirbyteMessage and can be of different types; the actual data records are of the type Type.RECORD. Here is a simplified version of the code from the tutorial:

```python
for message in input_messages:
    if message.type == Type.RECORD:
        data = message.record.data
        # add your code here to make a POST request with the data
```

In the place of the comment, you would add your code to make a POST request with the data. The data variable contains the JSON data from your source. Please note that this is a simplified version of the code and you might need to handle other message types and other aspects depending on your specific use case.

Also, please be aware that making a POST request for each record might not be the most efficient solution if you have a large number of records. You might want to consider batching the records and making a single POST request with multiple records if your API supports it. Remember to handle any exceptions that might occur during the POST request to ensure that your destination doesn’t fail if a single request fails. Please refer to the tutorial for a more detailed explanation of how to implement the write function in a Python destination.
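To make the loop above concrete, here is a minimal sketch of the two pieces involved: extracting record payloads and POSTing them. This is illustrative only — in a real Airbyte destination the messages are AirbyteMessage objects from airbyte_cdk (checked via `message.type == Type.RECORD`), and the session and URL are placeholders you would supply; plain dicts stand in here so the sketch stays self-contained.

```python
from typing import Any, Dict, Iterable, List


def collect_record_data(input_messages: Iterable[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Keep only RECORD messages and extract their data payloads.

    Stand-in for iterating AirbyteMessage objects inside write().
    """
    return [
        m["record"]["data"]
        for m in input_messages
        if m.get("type") == "RECORD"
    ]


def post_records(records: List[Dict[str, Any]], session, url: str) -> None:
    """POST each record individually; batch instead if the API supports it."""
    for data in records:
        response = session.post(url, json=data, timeout=30)
        # Surface failures instead of silently dropping records; a real
        # implementation would add retries with backoff here.
        response.raise_for_status()
```

In a real write() implementation you would also pass through state messages after records are successfully posted, so Airbyte can checkpoint progress.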

<@U04SE163WC8> This is part of my spec.json file. How do I access the option_title const value under the properties of the oneOf in the write function of my destination?

```json
        "order": 3,
        "description": "Determines which API endpoint to POST details.",
        "title": "Endpoint",
        "type": "object",
        "display_type": "dropdown",
        "oneOf": [
          {
            "title": "Customer",
            "type": "object",
            "display_type": "dropdown",
            "required": ["option_title"],
            "properties": {
              "option_title": {
                "type": "string",
                "const": "Customer"
              },
              "BillAddr" : {
                "title": "Address",
                "type" : "object",
                "properties": {
                  "Line1": {
                    "title": "Address Line 1",
                    "type": "string"
                  },
                  "Line2": {
                    "title": "Address Line 2",
                    "type": "string"
                  },
                  "City": {
                    "title": "City",
                    "type": "string"
                  },
                  "Country": {
                    "title": "Country",
                    "type": "string"
                  },
                  "PostalCode": {
                    "title": "PostalCode",
                    "type": "integer"
                  }
                }
              },
              "Notes": {
                "title": "Notes",
                "type": "string"
              },

              "DisplayName": {
                "title": "DisplayName",
                "type": "string"
              },
              "PrimaryPhone": {
                "title": "PhoneNumber",
                "type": "integer"
              },
              "PrimaryEmailAddr": {
                "title": "Email Address",
                "type": "string"
              }
            }
          },
          {
            "title": "Employee",
            "type": "object",
            "display_type": "dropdown",
            "required": ["option_title"],
            "properties": {
              "option_title": {
                "type": "string",
                "const": "Employee"
              },
              "SSN": {
                "title": "Social Security Number",
                "type": "string"
              },
              "PrimaryAddr": {
                "title": "Primary Address",
                "type": "object",
                "properties": {
                  "Line1": {
                    "title": "Address Line 1",
                    "type": "string"
                  },
                  "Line2": {
                    "title": "Address Line 2",
                    "type": "string"
                  },
                  "City": {
                    "title": "City",
                    "type": "string"
                  },
                  "PostalCode": {
                    "title": "Postal Code",
                    "type": "integer"
                  }
                }
              },
              "GivenName": {
                "title": "Given Name",
                "type": "string"
              },
              "FamilyName": {
                "title": "Family Name",
                "type": "string"
              },
              "PrimaryPhone": {
                "title": "Phone Number",
                "type": "integer"
              }
            }
          }
        ]
      }
```

A message from kapa.ai

In your Python destination, you can access the option_title const value from the config parameter of the write function. The config parameter is a dictionary that contains the configuration options specified in the spec.json file. Here is a simplified version of how you can access the option_title const value:

```python
def write(self, config: Dict[str, Any], ...):
    endpoint = config.get('endpoint')
    if endpoint:
        option_title = endpoint.get('option_title')
        # Now you can use option_title in your code
```

In this code, config.get('endpoint') returns the endpoint object from the configuration, and endpoint.get('option_title') returns the option_title value. Please note that this is a simplified version of the code and you might need to handle other aspects depending on your specific use case, such as the case where endpoint or option_title is not present in the configuration.

Also, please be aware that the option_title value will be either “Customer” or “Employee” depending on the option chosen by the user in the Airbyte UI, as specified by the oneOf keyword in your spec.json file.
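One natural next step is to map the selected option to the API path the POST should target. The sketch below assumes the oneOf lives under a top-level `endpoint` property (the outer key is not visible in the spec fragment above, so adjust it to match your spec.json), and the URL follows the QuickBooks Online API convention — verify it against Intuit's documentation for your environment:

```python
from typing import Any, Dict, Optional

# Hypothetical mapping from the oneOf option_title to an API resource path.
_PATHS = {
    "Customer": "customer",
    "Employee": "employee",
}


def resolve_endpoint(config: Dict[str, Any], realm_id: str) -> Optional[str]:
    """Map the selected oneOf option to a QuickBooks Online API URL.

    Returns None when no recognized option is present in the config.
    """
    option_title = config.get("endpoint", {}).get("option_title")
    path = _PATHS.get(option_title)
    if path is None:
        return None
    # Base URL follows the QuickBooks Online API convention; check Intuit's
    # docs for sandbox vs. production hosts before relying on it.
    return f"https://quickbooks.api.intuit.com/v3/company/{realm_id}/{path}"
```

Keeping this mapping in one place means the write function only has to look up the URL once per sync instead of re-inspecting the config for every record.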

<@U04SE163WC8> How do I test the write function of my Python destination?

To test the write function of your Python destination, you can follow these steps:

1. Generate a sample messages.jsonl file that contains the data records you want to write to the destination. Each line in the file should be a JSON object that represents an AirbyteMessage.

2. Run your Python destination with the write command, passing the config.json and catalog.json files as arguments. You can do this from the command line as follows:

```bash
cat messages.jsonl | python main.py write --config config.json --catalog catalog.json
```

In this command, cat messages.jsonl outputs the contents of the messages.jsonl file, and the | operator pipes this output to the python main.py write command. The --config config.json and --catalog catalog.json options specify the configuration and catalog files.

3. Check the output of the command to see if the data records were written to the destination as expected.

This method of testing is mentioned in the GitHub issue in the extracts. Please note that this is a simplified version of the testing process and you might need to handle other aspects depending on your specific use case. For example, you might need to set up a test environment or use a testing framework.
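For step 1, the snippet below generates a minimal messages.jsonl with a single RECORD message. The stream name and field values are placeholders (the field names echo the Employee spec above but are illustrative, not real QuickBooks data), and emitted_at is an epoch-milliseconds timestamp as the Airbyte protocol expects:

```python
import json

# Illustrative RECORD message for an assumed "employees" stream.
message = {
    "type": "RECORD",
    "record": {
        "stream": "employees",
        "data": {"GivenName": "Jane", "FamilyName": "Doe"},
        "emitted_at": 1700000000000,  # epoch milliseconds
    },
}

# One JSON object per line, as the jsonl format requires.
with open("messages.jsonl", "w") as f:
    f.write(json.dumps(message) + "\n")
```

You can then pipe the generated file into your connector with the command from step 2 and watch the logs for the POST requests your write function issues.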