Acceptance test is failing on a schema with a "nested" cursor

Given a schema like so:

    "$schema": "",
    "name": "events",
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "icon": {"type": "string"},
        "file": {"type": "string"},
        "item": {"type": "object"},
        "location": {"type": "string"},
        "created_at": {
            "type": "object",
            "properties": {
                "datetime": {"type": "string"},
                "formatted": {"type": "string"}
        "updated_at": {
            "type": "object",
            "properties": {
                "datetime": {"type": "string"},
                "formatted": {"type": "string"}
        "next_audit_date": {"type": "object"},
        "days_to_next_audit": {"type": "integer"},
        "action_type": {"type": "string"},
        "admin": {"type": "object"}

and a cursor field of “updated_at/datetime”, the test_defined_cursors_exist_in_schema() test in the acceptance test suite is failing, and it doesn’t look like it could ever succeed in my case:

    def test_defined_cursors_exist_in_schema(self, connector_config, discovered_catalog):
        """Check if all of the source defined cursor fields are exists on stream's json schema."""
        for stream_name, stream in discovered_catalog.items():
            if stream.default_cursor_field:
                schema = stream.json_schema
                assert "properties" in schema, f"Top level item should have an 'object' type for {stream_name} stream schema"
                properties = schema["properties"]    # <-- right here
                cursor_path = "/properties/".join(stream.default_cursor_field)
                assert dpath.util.search(
                    properties, cursor_path
                ), f"Some of defined cursor fields {stream.default_cursor_field} are not specified in discover schema properties for {stream_name} stream"

As it’s written, the test extracts the top-level “properties” key, but the “properties” keys inside nested schemas remain in place, so the subsequent dpath lookup fails: the cursor path is missing the second “properties” segment.
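The mismatch can be reproduced outside the test harness. The snippet below is a minimal sketch using a tiny stand-in for the dict lookup (not the real dpath API) to show how the joined cursor path only works when the cursor field is declared as a list, one element per level of nesting:

```python
# Minimal stand-in for a dpath-style lookup on plain dicts (illustration
# only): walk the '/'-separated path one segment at a time.
def search(obj, path):
    node = obj
    for segment in path.strip("/").split("/"):
        if not isinstance(node, dict) or segment not in node:
            return {}
        node = node[segment]
    return node

properties = {
    "updated_at": {
        "type": "object",
        "properties": {"datetime": {"type": "string"}},
    }
}

# Cursor declared as a single "updated_at/datetime" string becomes a
# one-element list, so the "/properties/" join has nothing to join:
path = "/properties/".join(["updated_at/datetime"])
assert path == "updated_at/datetime"
assert search(properties, path) == {}  # nested "properties" key is skipped

# Cursor declared as ["updated_at", "datetime"] gets the extra segment:
path = "/properties/".join(["updated_at", "datetime"])
assert path == "updated_at/properties/datetime"
assert search(properties, path) == {"type": "string"}
```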

Check out the following pdb post-mortem:

    (Pdb++) stream.default_cursor_field
    ['updated_at/datetime']
    (Pdb++) cursor_path
    'updated_at/datetime'
    (Pdb++) dpath.util.search(properties, "/updated_at/datetime")
    {}
    (Pdb++) properties
    {'id': {'type': 'integer'}, 'icon': {'type': 'string'}, 'file': {'type': 'string'}, 'item': {'type': 'object'}, 'location': {'type': 'string'}, 'created_at': {'type': 'object'}, 'updated_at': {'type': 'object', 'properties': {'datetime': {'type': 'string'}, 'formatted': {'type': 'string'}}}, 'next_audit_date': {'type': 'object'}, 'days_to_next_audit': {'type': 'integer'}, 'action_type': {'type': 'string'}, 'admin': {'type': 'object'}}
    (Pdb++) dpath.util.search(properties, "/updated_at/properties/datetime")
    {'updated_at': {'properties': {'datetime': {'type': 'string'}}}}

Assuming that the test is correct, how exactly are we supposed to specify the cursor field in an incremental stream?

Airbyte doesn’t support nested cursor fields for incremental syncs (see issue).

For nested cursor fields, did you read Airbyte’s docs about the cursor?
See the implementation for the Jira issues stream or some of the TikTok streams.
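If the list form from the docs is used, the join in the acceptance test lines up with the nested JSON schema. A minimal sketch (the class name is made up, not from any real connector; only cursor_field matters here):

```python
# Illustrative sketch: declaring the nested cursor as a list, with one
# element per level of nesting in the record.
class EventsStream:
    @property
    def cursor_field(self):
        return ["updated_at", "datetime"]

# The acceptance test joins the elements with "/properties/", which now
# matches the nested schema's structure:
cursor_path = "/properties/".join(EventsStream().cursor_field)
print(cursor_path)  # -> updated_at/properties/datetime
```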

I think the best way would be to transform the record, extracting the nested field to a higher level.
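That transform might look something like the sketch below, assuming records follow the schema above; the flattened field name updated_at_datetime is made up for illustration:

```python
# Hypothetical record transform: copy the nested cursor value to the top
# level so a flat cursor field can point at it (field name is made up).
def flatten_cursor(record: dict) -> dict:
    flat = dict(record)
    flat["updated_at_datetime"] = record["updated_at"]["datetime"]
    return flat

record = {
    "id": 1,
    "updated_at": {"datetime": "2023-01-01T10:00:00Z", "formatted": "Jan 1, 2023"},
}
assert flatten_cursor(record)["updated_at_datetime"] == "2023-01-01T10:00:00Z"
```

The stream’s JSON schema would then also declare updated_at_datetime as a plain top-level string, and the cursor field would simply be "updated_at_datetime".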

For nested cursor fields, did you read Airbyte’s docs about the cursor?

I did see that, yeah, but the list style wasn’t working for me at the time, and when I asked about it in the Airbyte Slack someone suggested the format I used above, and it’s been working ever since.

The connector is working, but only the acceptance tests are not passing?

Yes, that’s right. The connector’s been working fine but the acceptance tests are not passing.

I created an issue about this on GitHub.
