2023-01-13 23:53:24 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:24 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $.method: must be a constant value Standard
2023-01-13 23:53:24 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:24 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:24 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/1/0/logs.log
2023-01-13 23:53:24 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: dev
2023-01-13 23:53:24 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
2023-01-13 23:53:24 INFO i.a.c.EnvConfigs(getEnvOrDefault):1173 - Using default value for environment variable METRIC_CLIENT: ''
2023-01-13 23:53:24 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
2023-01-13 23:53:24 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword example - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:24 INFO i.a.w.g.DefaultReplicationWorker(run):150 - start sync worker. job id: 1 attempt id: 0
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-13 23:53:24 INFO i.a.w.g.DefaultReplicationWorker(run):165 - configured sync modes: {public.deployments=incremental - append_dedup, public.projects=incremental - append_dedup, public.teams=incremental - append_dedup, public.prewarmed_instances=incremental - append_dedup}
2023-01-13 23:53:24 INFO i.a.w.i.DefaultAirbyteDestination(start):78 - Running destination...
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-convex:dev exists...
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-convex:dev was found locally.
2023-01-13 23:53:24 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = destination-convex-write-1-0-xorml with resources io.airbyte.config.ResourceRequirements@74f2aada[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-01-13 23:53:24 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/0 --log-driver none --name destination-convex-write-1-0-xorml --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/destination-convex:dev -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/destination-convex:dev write --config destination_config.json --catalog destination_catalog.json
2023-01-13 23:53:24 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):33 - Writing messages to protocol version 0.2.0
2023-01-13 23:53:24 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-postgres:1.0.36 exists...
2023-01-13 23:53:24 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-postgres:1.0.36 was found locally.
2023-01-13 23:53:24 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = source-postgres-read-1-0-ofprh with resources io.airbyte.config.ResourceRequirements@3f41d1dd[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-01-13 23:53:24 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/0 --log-driver none --name source-postgres-read-1-0-ofprh --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:1.0.36 -e WORKER_JOB_ATTEMPT=0 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/source-postgres:1.0.36 read --config source_config.json --catalog source_catalog.json
2023-01-13 23:53:24 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0
2023-01-13 23:53:24 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):273 - Destination output thread started.
2023-01-13 23:53:24 INFO i.a.w.g.DefaultReplicationWorker(replicate):245 - Waiting for source and destination threads to complete.
2023-01-13 23:53:24 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):335 - Replication thread started.
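Both `docker run` invocations above carry the worker's settings as `-e NAME=value` environment assignments. As a toy illustration (a hypothetical helper, not part of Airbyte), those assignments can be pulled out of such a command line with the standard library:

```python
import shlex

def docker_env(command: str) -> dict:
    """Collect the -e NAME=value assignments from a `docker run` command line."""
    tokens = shlex.split(command)
    env = {}
    for flag, arg in zip(tokens, tokens[1:]):
        if flag == "-e" and "=" in arg:
            name, _, value = arg.partition("=")  # handles empty values like AIRBYTE_ROLE=
            env[name] = value
    return env

# Abbreviated version of the source command from the log above:
cmd = ("docker run --rm --init -i -e WORKER_JOB_ID=1 "
       "-e AIRBYTE_VERSION=dev airbyte/source-postgres:1.0.36 read")
print(docker_env(cmd))  # → {'WORKER_JOB_ID': '1', 'AIRBYTE_VERSION': 'dev'}
```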
2023-01-13 23:53:24 source > Running source under deployment mode: OSS
2023-01-13 23:53:24 source > Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-13 23:53:24 source > integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2023-01-13 23:53:24 source > Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-13 23:53:24 source > Command: READ
2023-01-13 23:53:24 source > Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2023-01-13 23:53:25 source > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:25 source > Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:25 source > Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:25 destination > Begin writing to the destination...
2023-01-13 23:53:25 source > Starting connection with method: NO_TUNNEL
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > HikariPool-1 - Starting...
2023-01-13 23:53:25 source > HikariPool-1 - Start completed.
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > Attempting to get metadata from the database to see if we can connect.
2023-01-13 23:53:25 source > Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@151515831 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot' AND plugin = 'pgoutput' AND database = 'convex'
2023-01-13 23:53:25 source > Set initial fetch size: 10 rows
2023-01-13 23:53:25 source > Attempting to find the publication using the query: HikariProxyPreparedStatement@327840833 wrapping SELECT * FROM pg_publication WHERE pubname = 'airbyte_publication'
2023-01-13 23:53:25 source > Set initial fetch size: 10 rows
2023-01-13 23:53:25 source > HikariPool-1 - Shutdown initiated...
2023-01-13 23:53:25 source > HikariPool-1 - Shutdown completed.
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > Global state manager selected to manage state object with type GLOBAL.
2023-01-13 23:53:25 source > No cursor field set in catalog but not present in state. Stream: public_prewarmed_instances, New Cursor Field: null. Resetting cursor value
2023-01-13 23:53:25 source > No cursor field set in catalog but not present in state. Stream: public_projects, New Cursor Field: null. Resetting cursor value
2023-01-13 23:53:25 source > No cursor field set in catalog but not present in state. Stream: public_deployments, New Cursor Field: null. Resetting cursor value
2023-01-13 23:53:25 source > No cursor field set in catalog but not present in state. Stream: public_teams, New Cursor Field: null. Resetting cursor value
2023-01-13 23:53:25 source > Initialized CDC state with: io.airbyte.integrations.source.relationaldb.models.CdcState@412c995d[state=,additionalProperties={}]
2023-01-13 23:53:25 source > HikariPool-2 - Starting...
2023-01-13 23:53:25 source > HikariPool-2 - Start completed.
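The slot and publication lookups above are the source's CDC prerequisite checks. A sketch of the same queries as parameterized SQL strings, with the one-time setup statements (standard Postgres commands run by an operator, not by the sync itself) shown as comments; the slot, publication, and database names are the ones from this log:

```python
# The two prerequisite checks visible in the log, parameterized for a DB-API driver.
SLOT_CHECK = (
    "SELECT * FROM pg_replication_slots "
    "WHERE slot_name = %s AND plugin = %s AND database = %s"
)
PUBLICATION_CHECK = "SELECT * FROM pg_publication WHERE pubname = %s"

# One-time setup (standard Postgres DDL, not executed by the connector):
#   SELECT pg_create_logical_replication_slot('airbyte_slot', 'pgoutput');
#   CREATE PUBLICATION airbyte_publication FOR TABLE
#       public.deployments, public.projects, public.teams, public.prewarmed_instances;

def slot_check_params() -> tuple:
    """Bind values matching this log's configuration."""
    return ("airbyte_slot", "pgoutput", "convex")
```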
2023-01-13 23:53:25 source > Checking schema: public
2023-01-13 23:53:25 source > Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal]
2023-01-13 23:53:25 source > Set initial fetch size: 10 rows
2023-01-13 23:53:25 source > Max memory limit: 9403629568, JDBC buffer size: 1073741824
2023-01-13 23:53:25 source > Table promotion_redemptions column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_redemptions column promotion_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_redemptions column actor (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_redemptions column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_redemptions column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_redemptions column redemption_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table promotion_redemptions column expiration_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table promotion_redemptions column expired (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table invitations column code (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table invitations column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table invitations column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table invitations column issued (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table invitations column expiration (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table invitations column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table plans column id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plans column friendly_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table members column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table members column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table members column tos_accept_version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table members column tos_accept_timestamp (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table members column auth0_subject (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table members column auth0_email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table current_backend_version_journal column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table current_backend_version_journal column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table current_backend_version_journal column ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-13 23:53:25 source > Table promotions column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotions column start_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table promotions column end_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table promotions column code (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table promotions column entitlement_duration_sec (type int4[10], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotions column team_promotion (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table old_instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table old_instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table old_instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table old_instances column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table old_instances column project_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table old_instances column active (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table old_instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table old_instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table old_instances column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table old_instances column creation_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-13 23:53:25 source > Table deployments column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table deployments column project_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table deployments column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table deployments column state (type deployment_state[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table deployments column dtype (type deployment_type[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table deployments column instance_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table deployments column creation_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-13 23:53:25 source > Table deployments column need_backend_info_refresh (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table plan_subscription_log column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table plan_subscription_log column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table plan_subscription_log column plan_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_subscription_log column ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table plan_subscription_log column stripe_subscription_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_subscription_log column stripe_customer_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_subscription_log column stripe_event_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_subscription_log column reason (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlement_grants column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table entitlement_grants column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlement_grants column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table entitlement_grants column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table entitlement_grants column creation_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table entitlement_grants column expiration_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table entitlement_grants column expired (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table entitlement_grants column value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlement_grants column operator (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlement_grants column reason (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table promotion_entitlements column promotion_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table promotion_entitlements column entitlement_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table promotion_entitlements column entitlement_value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table teams column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table teams column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table teams column slug (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table teams column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table deprecated_beta_keys column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table deprecated_beta_keys column body (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table deprecated_beta_keys column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column state (type instance_state[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table instances column modification_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-13 23:53:25 source > Table db_clusters column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table db_clusters column db_driver (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table db_clusters column url (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table db_clusters column weight (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table db_clusters column replicas (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table authorized_devices column token (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table authorized_devices column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table authorized_devices column device_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table authorized_devices column creation_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table team_member column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table team_member column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table projects column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table projects column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table projects column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table projects column slug (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table projects column deleted (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table plan_entitlements column plan_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_entitlements column entitlement_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table plan_entitlements column entitlement_value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table prewarmed_instances column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table prewarmed_instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table prewarmed_instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table prewarmed_instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table prewarmed_instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table prewarmed_instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlements_snapshot column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table entitlements_snapshot column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table entitlements_snapshot column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table entitlements_snapshot column update_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table entitlements_snapshot column value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table _sqlx_migrations column version (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Table _sqlx_migrations column description (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table _sqlx_migrations column installed_on (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-13 23:53:25 source > Table _sqlx_migrations column success (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-13 23:53:25 source > Table _sqlx_migrations column checksum (type bytea[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-13 23:53:25 source > Table _sqlx_migrations column execution_time (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-13 23:53:25 source > Found table: public.promotion_redemptions
2023-01-13 23:53:25 source > Found table: public.invitations
2023-01-13 23:53:25 source > Found table: public.plans
2023-01-13 23:53:25 source > Found table: public.members
2023-01-13 23:53:25 source > Found table: public.current_backend_version_journal
2023-01-13 23:53:25 source > Found table: public.promotions
2023-01-13 23:53:25 source > Found table: public.old_instances
2023-01-13 23:53:25 source > Found table: public.deployments
2023-01-13 23:53:25 source > Found table: public.plan_subscription_log
2023-01-13 23:53:25 source > Found table: public.entitlement_grants
2023-01-13 23:53:25 source > Found table: public.promotion_entitlements
2023-01-13 23:53:25 source > Found table: public.teams
2023-01-13 23:53:25 source > Found table: public.deprecated_beta_keys
2023-01-13 23:53:25 source > Found table: public.instances
2023-01-13 23:53:25 source > Found table: public.db_clusters
2023-01-13 23:53:25 source > Found table: public.authorized_devices
2023-01-13 23:53:25 source > Found table: public.team_member
2023-01-13 23:53:25 source > Found table: public.projects
2023-01-13 23:53:25 source > Found table: public.plan_entitlements
2023-01-13 23:53:25 source > Found table: public.prewarmed_instances
2023-01-13 23:53:25 source > Found table: public.entitlements_snapshot
2023-01-13 23:53:25 source > Found table: public._sqlx_migrations
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > Set initial fetch size: 10 rows
2023-01-13 23:53:25 source > For CDC, only tables in publication airbyte_publication will be included in the sync: [public.prewarmed_instances, public.projects, public.teams, public.deployments]
2023-01-13 23:53:25 source > using CDC: true
2023-01-13 23:53:25 source > First record waiting time: 300 seconds
2023-01-13 23:53:25 source > First record waiting time: 300 seconds
2023-01-13 23:53:25 source > Should flush after sync: true
2023-01-13 23:53:25 source > StandaloneConfig values:
    access.control.allow.methods =
    access.control.allow.origin =
    admin.listeners = null
    bootstrap.servers = [localhost:9092]
    client.dns.lookup = use_all_dns_ips
    config.providers = []
    connector.client.config.override.policy = All
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    listeners = [http://:8083]
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 1000
    offset.flush.timeout.ms = 5000
    offset.storage.file.filename = /tmp/cdc-state-offset10664696055430554159/offset.dat
    plugin.path = null
    response.http.headers.config =
    rest.advertised.host.name = null
    rest.advertised.listener = null
    rest.advertised.port = null
    rest.extension.classes = []
    ssl.cipher.suites = null
    ssl.client.auth = none
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    task.shutdown.graceful.timeout.ms = 5000
    topic.creation.enable = true
    topic.tracking.allow.reset = true
    topic.tracking.enable = true
    value.converter = class org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results.
2023-01-13 23:53:25 source > Starting FileOffsetBackingStore with file /tmp/cdc-state-offset10664696055430554159/offset.dat
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-13 23:53:25 source > No previous offsets found
2023-01-13 23:53:25 source > Closing offsetStorageReader and fileOffsetBackingStore
2023-01-13 23:53:25 source > Stopped FileOffsetBackingStore
2023-01-13 23:53:25 source > Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@862152124 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot' AND plugin = 'pgoutput' AND database = 'convex'
2023-01-13 23:53:25 source > Set initial fetch size: 10 rows
2023-01-13 23:53:25 source > Should flush after sync: true
2023-01-13 23:53:25 source > identified target lsn: PgLsn{lsn=24248618880}
2023-01-13 23:53:25 source > Should flush after sync: true
2023-01-13 23:53:25 source > Using CDC: true
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-13 23:53:25 source > EmbeddedConfig values:
    access.control.allow.methods =
    access.control.allow.origin =
    admin.listeners = null
    bootstrap.servers = [localhost:9092]
    client.dns.lookup = use_all_dns_ips
    config.providers = []
    connector.client.config.override.policy = All
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    listeners = [http://:8083]
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 1000
    offset.flush.timeout.ms = 5000
    offset.storage.file.filename = /tmp/cdc-state-offset8634395537370491861/offset.dat
    offset.storage.partitions = null
    offset.storage.replication.factor = null
    offset.storage.topic =
    plugin.path = null
    response.http.headers.config =
    rest.advertised.host.name = null
    rest.advertised.listener = null
    rest.advertised.port = null
    rest.extension.classes = []
    ssl.cipher.suites = null
    ssl.client.auth = none
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    task.shutdown.graceful.timeout.ms = 5000
    topic.creation.enable = true
    topic.tracking.allow.reset = true
    topic.tracking.enable = true
    value.converter = class org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > The worker has been configured with one or more internal converter properties ([internal.key.converter, internal.value.converter]). Support for these properties was deprecated in version 2.0 and removed in version 3.0, and specifying them will have no effect. Instead, an instance of the JsonConverter with schemas.enable set to false will be used. For more information, please visit http://kafka.apache.org/documentation/#upgrade and consult the upgrade notes for the 3.0 release.
2023-01-13 23:53:25 source > Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results.
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-13 23:53:25 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-13 23:53:25 source > Starting FileOffsetBackingStore with file /tmp/cdc-state-offset8634395537370491861/offset.dat
2023-01-13 23:53:25 source > Configuration property 'truncate.handling.mode' is deprecated and will be removed soon. If you wish to retain skipped truncate functionality, please configure 'skipped.operations' with "t".
2023-01-13 23:53:25 source > Property 'flush.lsn.source' is set to 'false', the LSN will not be flushed to the database source and WAL logs will not be cleared. User is expected to handle this outside Debezium.
2023-01-13 23:53:25 source > Configuration property 'truncate.handling.mode' is deprecated and will be removed in future versions. Please use 'skipped.operations' instead.
2023-01-13 23:53:25 source > Configuration property 'toasted.value.placeholder' is deprecated and will be removed in future versions. Please use 'unavailable.value.placeholder' instead.
2023-01-13 23:53:25 source > Starting PostgresConnectorTask with configuration:
2023-01-13 23:53:25 source > connector.class = io.debezium.connector.postgresql.PostgresConnector
2023-01-13 23:53:25 source > max.queue.size = 8192
2023-01-13 23:53:25 source > slot.name = airbyte_slot
2023-01-13 23:53:25 source > publication.name = airbyte_publication
2023-01-13 23:53:25 source > offset.storage.file.filename = /tmp/cdc-state-offset8634395537370491861/offset.dat
2023-01-13 23:53:25 source > decimal.handling.mode = string
2023-01-13 23:53:25 source > flush.lsn.source = false
2023-01-13 23:53:25 source > converters = datetime
2023-01-13 23:53:25 source > datetime.type = io.airbyte.integrations.debezium.internals.PostgresConverter
2023-01-13 23:53:25 source > value.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > key.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > publication.autocreate.mode = disabled
2023-01-13 23:53:25 source > database.user = emmaling
2023-01-13 23:53:25 source > database.dbname = convex
2023-01-13 23:53:25 source > offset.storage = org.apache.kafka.connect.storage.FileOffsetBackingStore
2023-01-13 23:53:25 source > database.server.name = convex
2023-01-13 23:53:25 source > offset.flush.timeout.ms = 5000
2023-01-13 23:53:25 source > heartbeat.interval.ms = 10000
2023-01-13 23:53:25 source > column.include.list = \Qpublic.deployments\E\.(\Qid\E|\Qdtype\E|\Qstate\E|\Qcreator\E|\Qproject_id\E|\Q_ab_cdc_lsn\E|\Qcreation_ts\E|\Qinstance_name\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E|\Qneed_backend_info_refresh\E),\Qpublic.teams\E\.(\Qid\E|\Qname\E|\Qslug\E|\Qcreator\E|\Q_ab_cdc_lsn\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E),\Qpublic.projects\E\.(\Qid\E|\Qname\E|\Qslug\E|\Qdeleted\E|\Qteam_id\E|\Q_ab_cdc_lsn\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E),\Qpublic.prewarmed_instances\E\.(\Qid\E|\Qname\E|\Qversion\E|\Qdb_cluster\E|\Q_ab_cdc_lsn\E|\Qdb_password\E|\Qinstance_secret\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E)
2023-01-13 23:53:25 source > plugin.name = pgoutput
2023-01-13 23:53:25 source > database.port = 5432
2023-01-13 23:53:25 source > offset.flush.interval.ms = 1000
2023-01-13 23:53:25 source > key.converter.schemas.enable = false
2023-01-13 23:53:25 source > internal.key.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > include.unknown.datatypes = true
2023-01-13 23:53:25 source > database.hostname = host.docker.internal
2023-01-13 23:53:25 source > name = convex
2023-01-13 23:53:25 source > value.converter.schemas.enable = false
2023-01-13 23:53:25 source > internal.value.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-13 23:53:25 source > max.batch.size = 2048
2023-01-13 23:53:25 source > table.include.list = \Qpublic.deployments\E,\Qpublic.teams\E,\Qpublic.projects\E,\Qpublic.prewarmed_instances\E
2023-01-13 23:53:25 source > snapshot.mode = initial
2023-01-13 23:53:25 source > Connection gracefully closed
2023-01-13 23:53:25 source > No previous offsets found
2023-01-13 23:53:25 source > user 'emmaling' connected to database 'convex' on PostgreSQL 14.6 (Homebrew) on aarch64-apple-darwin22.1.0, compiled by Apple clang version 14.0.0 (clang-1400.0.29.202), 64-bit with roles: role 'impolite_fish_831' [superuser: false, replication: false, inherit: true, create role: false, create db: false,
can log in: true] role 'peaceful_horse_578' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_knowing_partridge_253' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_formal_llama_678' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_outgoing_chimpanzee_418' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'fleet_eagle_631' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_brave_cormorant_724' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'adventurous_heron_271' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_little_dunlin_209' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'secret_hare_976' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'judicious_lapwing_404' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'clear_kookabura_918' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_coordinated_goose_728' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'third_sheep_819' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'merry_tiger_926' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_fine_flamingo_545' [superuser: 
false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'optimistic_quelea_685' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'lazy_lobster_517' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'enchanted_pony_474' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] role 'kindred_leopard_137' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_zany_monkey_512' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'helpful_dunlin_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'late_louse_209' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_jaded_caribou_306' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'lovable_hawk_746' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'agile_quail_902' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'false_gazelle_324' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_nautical_antelope_43' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'marvelous_reindeer_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can 
log in: true] role 'test_imperfect_cassowary_499' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_flippant_capybara_855' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'innocent_worm_737' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'uncommon_crow_369' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'wandering_bee_475' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'fiery_kangaroo_814' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'superb_spider_100' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_colorless_finch_234' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_woozy_snail_615' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_giddy_crow_472' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'fortunate_elk_646' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'standing_echidna_736' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_brave_beaver_478' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_cagey_turtle_651' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'dapper_grasshopper_973' [superuser: false, 
replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'fortunate_curlew_905' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'jovial_shrew_827' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'hushed_mandrill_786' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_dashing_antelope_221' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'formal_kookabura_682' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'bizarre_coyote_440' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'judicious_llama_288' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'beloved_porcupine_444' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'outgoing_leopard_800' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_avid_nightingale_29' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_canny_cobra_785' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_keen_gnu_542' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'glad_pig_33' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_third_wolverine_683' [superuser: false, replication: false, inherit: true, create role: false, create db: false, 
can log in: true] role 'uncommon_elephant_829' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'abundant_hedgehog_159' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'standing_capybara_955' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'famous_scorpion_641' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_next_guanaco_86' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_tangible_kingfisher_906' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_tangible_dunlin_929' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'rosy_zebra_28' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_rapid_lobster_434' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'obsolete_gnu_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'charming_porpoise_845' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'trustworthy_chicken_266' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'standing_frog_145' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'pg_read_all_data' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false] role 'test_shocking_bat_673' [superuser: 
false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_avid_raven_375' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'mild_leopard_957' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_resolute_mouse_303' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'bleak_dragonfly_28' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'vivid_sandpiper_368' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_kindly_crab_276' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_tricky_gnat_993' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_quick_dotterel_122' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'whimsical_otter_157' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'dynamic_hamster_90' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'charming_sheep_976' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'knowing_capybara_249' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_descriptive_coyote_8' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'lovable_mouse_933' [superuser: false, replication: false, inherit: true, create role: false, create db: 
false, can log in: true] role 'nautical_meerkat_841' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'bright_chinchilla_839' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_admired_shark_340' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'intent_tarsier_75' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_naive_porcupine_95' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_robust_chinchilla_208' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_childlike_wolverine_269_dev' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'uncommon_herring_177' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'quiet_seahorse_147' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'doubtful_goshawk_423' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'silent_mallard_368' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'compassionate_albatross_711' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_vibrant_anteater_455' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'brainy_boar_354' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 
'pastel_woodpecker_988' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'canny_dragonfly_581' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'superb_albatross_558' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'reminiscent_lobster_88' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_disagreeable_stingray_78' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_confused_quail_841' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_naive_rat_268' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_different_chicken_55_dev' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_opulent_giraffe_226' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'dusty_meerkat_32' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_shiny_chough_534' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'test_woozy_eagle_243' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'upbeat_locust_160' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'mild_hawk_285' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true] role 'focused_lark_318' [superuser: false, replicat… 
2023-01-13 23:53:25 source > Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{5/A3E3A208}, catalogXmin=111632588]
2023-01-13 23:53:25 source > No previous offset found
2023-01-13 23:53:25 source > Taking initial snapshot for new datasource
2023-01-13 23:53:25 source > Requested thread factory for connector PostgresConnector, id = convex named = change-event-source-coordinator
2023-01-13 23:53:25 source > Creating thread debezium-postgresconnector-convex-change-event-source-coordinator
2023-01-13 23:53:25 source > Metrics registered
2023-01-13 23:53:25 source > Context created
2023-01-13 23:53:25 source > Taking initial snapshot for new datasource
2023-01-13 23:53:25 source > According to the connector configuration data will be snapshotted
2023-01-13 23:53:25 source > Snapshot step 1 - Preparing
2023-01-13 23:53:25 source > Snapshot step 2 - Determining captured tables
2023-01-13 23:53:25 source > Adding table public.deployments to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.promotion_redemptions to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.promotions to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.promotion_entitlements to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.plan_entitlements to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.old_instances to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public._sqlx_migrations to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.invitations to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.prewarmed_instances to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.team_member to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.authorized_devices to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.entitlement_grants to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.projects to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.deprecated_beta_keys to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.members to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.current_backend_version_journal to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.entitlements_snapshot to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.instances to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.db_clusters to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.teams to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.plan_subscription_log to the list of capture schema tables
2023-01-13 23:53:25 source > Adding table public.plans to the list of capture schema tables
2023-01-13 23:53:25 source > Snapshot step 3 - Locking captured tables [public.deployments, public.teams, public.projects, public.prewarmed_instances]
2023-01-13 23:53:25 source > Snapshot step 4 - Determining snapshot offset
2023-01-13 23:53:25 source > Creating initial offset context
2023-01-13 23:53:25 source > Read xlogStart at 'LSN{5/A5548F80}' from transaction '111644692'
2023-01-13 23:53:25 source > Read xlogStart at 'LSN{5/A5548F80}' from transaction '111644692'
2023-01-13 23:53:25 source > Snapshot step 5 - Reading structure of captured tables
2023-01-13 23:53:25 source > Reading structure of schema 'public' of catalog 'convex'
2023-01-13 23:53:25 source > Snapshot step 6 - Persisting schema history
2023-01-13 23:53:25 source > Snapshot step 7 - Snapshotting data
2023-01-13 23:53:25 source > Snapshotting contents of 4 tables while still in transaction
2023-01-13 23:53:25 source > Exporting data from table 'public.deployments' (1 of 4 tables)
2023-01-13 23:53:25 source > For table 'public.deployments' using select statement: 'SELECT "id", "project_id", "creator", "state", "dtype", "instance_name", "creation_ts", "need_backend_info_refresh" FROM "public"."deployments"'
2023-01-13 23:53:25 source > Finished exporting 5 records for table 'public.deployments'; total duration '00:00:00.01'
2023-01-13 23:53:25 source > Exporting data from table 'public.teams' (2 of 4 tables)
2023-01-13 23:53:25 source > For table 'public.teams' using select statement: 'SELECT "id", "name", "slug", "creator" FROM "public"."teams"'
2023-01-13 23:53:25 source > Finished exporting 1 records for table 'public.teams'; total duration '00:00:00.003'
2023-01-13 23:53:25 source > Exporting data from table 'public.projects' (3 of 4 tables)
2023-01-13 23:53:25 source > For table 'public.projects' using select statement: 'SELECT "id", "name", "team_id", "slug", "deleted" FROM "public"."projects"'
2023-01-13 23:53:25 source > Finished exporting 5 records for table 'public.projects'; total duration '00:00:00.003'
2023-01-13 23:53:25 source > Exporting data from table 'public.prewarmed_instances' (4 of 4 tables)
2023-01-13 23:53:25 source > For table 'public.prewarmed_instances' using select statement: 'SELECT "id", "name", "db_password", "instance_secret", "version", "db_cluster" FROM "public"."prewarmed_instances"'
2023-01-13 23:53:25 source > Finished exporting 0 records for table 'public.prewarmed_instances'; total duration '00:00:00.002'
2023-01-13 23:53:25 source > Snapshot - Final stage
2023-01-13 23:53:25 source > Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='convex'db='convex', lsn=LSN{5/A5548F80}, txId=111644692, timestamp=2023-01-13T23:53:25.581352Z, snapshot=FALSE, schema=public, table=projects], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]
2023-01-13 23:53:25 source > Connected metrics set to 'true'
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.deployments' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.projects' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.prewarmed_instances' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.teams' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > Starting streaming
2023-01-13 23:53:25 source > Retrieved latest position from stored offset 'LSN{5/A5548F80}'
2023-01-13 23:53:25 source > Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{5/A5548F80}'
2023-01-13 23:53:25 source > Initializing PgOutput logical decoder publication
2023-01-13 23:53:25 source > Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{5/A3E3A208}, catalogXmin=111632588]
2023-01-13 23:53:25 source > Connection gracefully closed
2023-01-13 23:53:25 source > Requested thread factory for connector PostgresConnector, id = convex named = keep-alive
2023-01-13 23:53:25 source > Creating thread debezium-postgresconnector-convex-keep-alive
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.deployments'
is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.projects' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.prewarmed_instances' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > REPLICA IDENTITY for 'public.teams' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-13 23:53:25 source > Searching for WAL resume position
2023-01-13 23:53:26 WARN c.n.s.JsonMetaSchema(newValidator):278 - Unknown keyword airbyte_type - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-13 23:53:26 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-13 23:53:26 source > Signalling close because Snapshot is complete
2023-01-13 23:53:26 source > Closing: Change event reached target position
2023-01-13 23:53:26 source > Stopping the embedded engine
2023-01-13 23:53:26 source > Waiting for PT5M for connector to stop
2023-01-13 23:53:26 source > Stopping the task and engine
2023-01-13 23:53:26 source > Stopping down connector
2023-01-13 23:53:26 source > WAL resume position 'null' discovered
2023-01-13 23:53:26 source > Connection gracefully closed
2023-01-13 23:53:26 source > Connection gracefully closed
2023-01-13 23:53:36 source > Producer failure Stack Trace: org.postgresql.util.PSQLException: The connection attempt failed.
    at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331)
    at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
    at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
    at org.postgresql.Driver.makeConnection(Driver.java:402)
    at org.postgresql.Driver.connect(Driver.java:261)
    at io.debezium.jdbc.JdbcConnection.lambda$patternBasedFactory$1(JdbcConnection.java:244)
    at io.debezium.jdbc.JdbcConnection.connection(JdbcConnection.java:888)
    at io.debezium.connector.postgresql.connection.PostgresReplicationConnection.reconnect(PostgresReplicationConnection.java:660)
    at io.debezium.connector.postgresql.PostgresStreamingChangeEventSource.execute(PostgresStreamingChangeEventSource.java:172)
    at io.debezium.connector.postgresql.PostgresStreamingChangeEventSource.execute(PostgresStreamingChangeEventSource.java:41)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.streamEvents(ChangeEventSourceCoordinator.java:174)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.executeChangeEventSources(ChangeEventSourceCoordinator.java:141)
    at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:109)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: java.net.SocketTimeoutException: Connect timed out
    at java.base/sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:546)
    at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:597)
    at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
    at java.base/java.net.Socket.connect(Socket.java:633)
    at org.postgresql.core.PGStream.createSocket(PGStream.java:241)
    at org.postgresql.core.PGStream.<init>(PGStream.java:98)
    at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:109)
    at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
    ... 17 more
2023-01-13 23:53:36 source > Connection gracefully closed
2023-01-13 23:53:36 source > Finished streaming
2023-01-13 23:53:36 source > Connected metrics set to 'false'
2023-01-13 23:53:36 source > Stopped FileOffsetBackingStore
2023-01-13 23:53:36 source > Debezium engine shutdown.
2023-01-13 23:53:36 source > Closing: Heartbeat indicates sync is done
2023-01-13 23:53:36 source > Closing: Iterator closing
2023-01-13 23:53:36 source > debezium state: {"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248618880,\"txId\":111644692,\"ts_usec\":1673654005581352}"}
2023-01-13 23:53:36 source > Closing database connection pool.
2023-01-13 23:53:36 source > HikariPool-2 - Shutdown initiated...
2023-01-13 23:53:36 source > HikariPool-2 - Shutdown completed.
2023-01-13 23:53:36 source > Closed database connection pool.
2023-01-13 23:53:36 source > Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-13 23:53:36 source > Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-13 23:53:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):383 - Source has no more messages, closing connection.
2023-01-13 23:53:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):392 - Total records read: 12 (2 KB)
2023-01-13 23:53:36 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_projects. Error messages: [$._ab_cdc_deleted_at is of an incorrect type.
Expected it to be string]
2023-01-13 23:53:36 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_deployments. Error messages: [$.creation_ts is of an incorrect type. Expected it to be date-time, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string]
2023-01-13 23:53:36 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_teams. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string]
2023-01-13 23:53:36 INFO i.a.w.g.DefaultReplicationWorker(replicate):250 - One of source or destination thread complete. Waiting on the other.
2023-01-13 23:53:36 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):284 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@3c9ba20f[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@6591f813[type=GLOBAL,stream=,global=,data={"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248618880,\"txId\":111644692,\"ts_usec\":1673654005581352}"}},"streams":[{"stream_name":"deployments","stream_namespace":"public","cursor_field":[]},{"stream_name":"prewarmed_instances","stream_namespace":"public","cursor_field":[]},{"stream_name":"projects","stream_namespace":"public","cursor_field":[]},{"stream_name":"teams","stream_namespace":"public","cursor_field":[]}]},additionalProperties={global_={shared_state={state={{"schema":null,"payload":["convex",{"server":"convex"}]}={"transaction_id":null,"lsn":24248618880,"txId":111644692,"ts_usec":1673654005581352}}}, stream_states=[{stream_descriptor={name=deployments, namespace=public}, stream_state={stream_name=deployments, stream_namespace=public, cursor_field=[]}},
{stream_descriptor={name=prewarmed_instances, namespace=public}, stream_state={stream_name=prewarmed_instances, stream_namespace=public, cursor_field=[]}}, {stream_descriptor={name=projects, namespace=public}, stream_state={stream_name=projects, stream_namespace=public, cursor_field=[]}}, {stream_descriptor={name=teams, namespace=public}, stream_state={stream_name=teams, stream_namespace=public, cursor_field=[]}}]}}],trace=,control=,additionalProperties={}]
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):226 - The message tracker encountered an issue that prevents committed record counts from being reliably computed.
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):227 - This only impacts metadata and does not indicate a problem with actual sync data.
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):228 - Delta was not stored for state hash 1727035794
io.airbyte.workers.internal.book_keeping.StateDeltaTracker$StateDeltaTrackerException: Delta was not stored for state hash 1727035794
    at io.airbyte.workers.internal.book_keeping.StateDeltaTracker.commitStateHash(StateDeltaTracker.java:126) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.handleDestinationEmittedState(AirbyteMessageTracker.java:223) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.acceptFromDestination(AirbyteMessageTracker.java:146) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:286) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):237 - The state message tracker was unable to match the destination state message to a corresponding source state message.
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):238 - This only impacts metrics and does not indicate a problem with actual sync data.
2023-01-13 23:53:36 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):239 - Destination state message cannot be matched to corresponding Source state message.
io.airbyte.workers.internal.book_keeping.StateMetricsTracker$StateMetricsTrackerNoStateMatchException: Destination state message cannot be matched to corresponding Source state message.
    at io.airbyte.workers.internal.book_keeping.StateMetricsTracker.findStartingTimeStampAndRemoveOlderEntries(StateMetricsTracker.java:147) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.internal.book_keeping.StateMetricsTracker.updateStates(StateMetricsTracker.java:82) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.handleDestinationEmittedState(AirbyteMessageTracker.java:234) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.acceptFromDestination(AirbyteMessageTracker.java:146) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:286) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
2023-01-13 23:53:36 destination > Writing complete.
2023-01-13 23:53:37 INFO i.a.w.g.DefaultReplicationWorker(replicate):252 - Source and destination threads complete.
2023-01-13 23:53:37 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):550 - Source output at least one state message
2023-01-13 23:53:37 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):556 - State capture: Updated state to: Optional[io.airbyte.config.State@2a6882bb[state=[{"type":"GLOBAL","global_":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248618880,\"txId\":111644692,\"ts_usec\":1673654005581352}"}},"stream_states":[{"stream_descriptor":{"name":"deployments","namespace":"public"},"stream_state":{"stream_name":"deployments","stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"prewarmed_instances","namespace":"public"},"stream_state":{"stream_name":"prewarmed_instances","stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"projects","namespace":"public"},"stream_state":{"stream_name":"projects","stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"teams","namespace":"public"},"stream_state":{"stream_name":"teams","stream_namespace":"public","cursor_field":[]}}]}}]]]
2023-01-13 23:53:37 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):483 - sync summary: {
  "status" : "completed",
  "recordsSynced" : 11,
  "bytesSynced" : 2480,
  "startTime" : 1673654004371,
  "endTime" : 1673654017084,
  "totalStats" : {
    "bytesEmitted" : 2480,
    "destinationStateMessagesEmitted" : 1,
    "destinationWriteEndTime" : 1673654017081,
    "destinationWriteStartTime" : 1673654004427,
    "meanSecondsBeforeSourceStateMessageEmitted" : 10,
    "maxSecondsBeforeSourceStateMessageEmitted" : 10,
    "recordsEmitted" : 11,
    "recordsCommitted" : 11,
    "replicationEndTime" : 1673654017082,
    "replicationStartTime" : 1673654004371,
    "sourceReadEndTime" : 1673654016913,
    "sourceReadStartTime" : 1673654004400,
    "sourceStateMessagesEmitted" : 1
  },
  "streamStats" : [ {
    "streamName" : "deployments",
    "stats" : {
      "bytesEmitted" : 1407,
      "recordsEmitted" : 5,
      "recordsCommitted" : 5
    }
  }, {
    "streamName" : "teams",
    "stats" : {
      "bytesEmitted" : 167,
      "recordsEmitted" : 1,
      "recordsCommitted" : 1
    }
  }, {
    "streamName" : "projects",
    "stats" : {
      "bytesEmitted" : 906,
      "recordsEmitted" : 5,
      "recordsCommitted" : 5
    }
  } ]
}
2023-01-13 23:53:37 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):484 - failures: [ ]
2023-01-13 23:53:37 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
2023-01-13 23:53:37 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-13 23:53:37 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END REPLICATION -----
2023-01-13 23:53:37 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-13 23:53:37 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):209 - sync summary:
io.airbyte.config.StandardSyncOutput@1085f1b0[standardSyncSummary=io.airbyte.config.StandardSyncSummary@78c0c214[status=completed,recordsSynced=11,bytesSynced=2480,startTime=1673654004371,endTime=1673654017084,totalStats=io.airbyte.config.SyncStats@786dbbe5[bytesEmitted=2480,destinationStateMessagesEmitted=1,destinationWriteEndTime=1673654017081,destinationWriteStartTime=1673654004427,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=10,maxSecondsBeforeSourceStateMessageEmitted=10,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=11,recordsCommitted=11,replicationEndTime=1673654017082,replicationStartTime=1673654004371,sourceReadEndTime=1673654016913,sourceReadStartTime=1673654004400,sourceStateMessagesEmitted=1,additionalProperties={}],streamStats=[io.airbyte.config.StreamSyncStats@6e09fe3c[streamName=deployments,streamNamespace=,stats=io.airbyte.config.SyncStats@51be9a3c[bytesEmitted=1407,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=5,recordsCommitted=5,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]], 
io.airbyte.config.StreamSyncStats@2f10227d[streamName=teams,streamNamespace=,stats=io.airbyte.config.SyncStats@3a3706e9[bytesEmitted=167,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=1,recordsCommitted=1,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]], io.airbyte.config.StreamSyncStats@1b6ef78d[streamName=projects,streamNamespace=,stats=io.airbyte.config.SyncStats@3ea4e1a6[bytesEmitted=906,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=5,recordsCommitted=5,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]]]],normalizationSummary=,webhookOperationSummary=,state=io.airbyte.config.State@2a6882bb[state=[{"type":"GLOBAL","global_":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248618880,\"txId\":111644692,\"ts_usec\":1673654005581352}"}},"stream_states":[{"stream_descriptor":{"name":"deployments","namespace":"public"},"stream_state":{"stream_name":"deployments","stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"prewarmed_instances","namespace":"public"},"stream_state":{"stream_name":"prewarmed_instances","stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"projects","namespace":"public"},"stream_state":{"stream_name":"projects"
,"stream_namespace":"public","cursor_field":[]}},{"stream_descriptor":{"name":"teams","namespace":"public"},"stream_state":{"stream_name":"teams","stream_namespace":"public","cursor_field":[]}}]}}]],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@10093fc5[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@120b7363[stream=io.airbyte.protocol.models.AirbyteStream@7eb53c30[name=deployments,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"dtype":{"type":"string"},"state":{"type":"string"},"creator":{"type":"number","airbyte_type":"integer"},"project_id":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"creation_ts":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"instance_name":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"need_backend_info_refresh":{"type":"boolean"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@67854b91[stream=io.airbyte.protocol.models.AirbyteStream@3e2bb7b3[name=teams,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"slug":{"type":"string"},"creator":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], 
io.airbyte.protocol.models.ConfiguredAirbyteStream@77f25b7e[stream=io.airbyte.protocol.models.AirbyteStream@2071ad0c[name=projects,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"slug":{"type":"string"},"deleted":{"type":"boolean"},"team_id":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@6d65b350[stream=io.airbyte.protocol.models.AirbyteStream@e101760[name=prewarmed_instances,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"version":{"type":"string"},"db_cluster":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"db_password":{"type":"string"},"instance_secret":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]] 2023-01-13 23:53:37 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):214 - Sync summary length: 7291 2023-01-13 23:53:37 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal heartbeating... 
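One plausible reading of the createOrUpdateState 500 that follows (`StateWrapper.getGlobal()` returning null on the server) is a field-name mismatch: the state captured above is serialized under the key `global_` (with a trailing underscore), while the Airbyte protocol field is named `global`. This is a hypothesis drawn from the log, not a confirmed root cause; a minimal Python sketch of the mismatch, with the payload trimmed to the relevant keys:

```python
# Hypothetical sketch: the state blob from the log carries "global_",
# so a reader keyed on the protocol field name "global" finds nothing
# and can then dereference null.
state_message = {
    "type": "GLOBAL",
    "global_": {  # key as it appears in the "State capture" log entry
        "shared_state": {"state": {}},
        "stream_states": [],
    },
}

def read_global(msg: dict):
    # Mirrors a lookup on the protocol field name "global".
    return msg.get("global")

# The lookup misses: the payload only carries "global_".
assert read_global(state_message) is None
assert "global_" in state_message
```

If this reading is right, the fix belongs in the serializer (emit `global`) or in the server's deserializer accepting the alternate key, not in the sync itself.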
2023-01-13 23:53:37 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to get state 2023-01-13 23:53:37 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to create or update state 2023-01-13 23:53:37 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 0 to create or update state error: io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","\tat io.airbyte.config.persistence.StatePersistence.lambda$updateOrCreateState$1(StatePersistence.java:103)","\tat io.airbyte.db.Database.lambda$transaction$0(Database.java:27)","\tat org.jooq.impl.DefaultDSLContext.lambda$transactionResult0$3(DefaultDSLContext.java:549)","\tat org.jooq.impl.Tools$4$1.block(Tools.java:5282)","\tat java.base/java.util.concurrent.ForkJoinPool.unmanagedBlock(ForkJoinPool.java:3744)","\tat java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3689)","\tat org.jooq.impl.Tools$4.get(Tools.java:5279)","\tat org.jooq.impl.DefaultDSLContext.transactionResult0(DefaultDSLContext.java:597)","\tat org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:521)","\tat io.airbyte.db.Database.transaction(Database.java:27)","\tat io.airbyte.db.ExceptionWrappingDatabase.transaction(ExceptionWrappingDatabase.java:31)","\tat io.airbyte.config.persistence.StatePersistence.updateOrCreateState(StatePersistence.java:98)","\tat io.airbyte.server.handlers.StateHandler.createOrUpdateState(StateHandler.java:37)","\tat 
io.airbyte.server.apis.StateApiController.lambda$createOrUpdateState$0(StateApiController.java:30)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:18)","\tat io.airbyte.server.apis.StateApiController.createOrUpdateState(StateApiController.java:30)","\tat io.airbyte.server.apis.$StateApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:378)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.Flux.subscribe(Flux.java:8660)","\tat reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:426)","\tat io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57)","\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172)","\tat io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat java.base/java.util.Optional.ifPresent(Optional.java:178)","\tat io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48)","\tat 
io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94)","\tat io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418)","\tat io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122)","\tat io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:483)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:319)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:282)","\tat io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat 
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:99)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:200)","\tat io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837)","\tat io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:273)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494)","\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)","\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)","\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)","\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)","\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)","\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)","\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]} 2023-01-13 23:53:38 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 1 to create or update state 2023-01-13 23:53:38 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 1 to create or update state error: 
io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","\tat io.airbyte.config.persistence.StatePersistence.lambda$updateOrCreateState$1(StatePersistence.java:103)","\tat io.airbyte.db.Database.lambda$transaction$0(Database.java:27)","\tat org.jooq.impl.DefaultDSLContext.lambda$transactionResult0$3(DefaultDSLContext.java:549)","\tat org.jooq.impl.Tools$4$1.block(Tools.java:5282)","\tat java.base/java.util.concurrent.ForkJoinPool.unmanagedBlock(ForkJoinPool.java:3744)","\tat java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3689)","\tat org.jooq.impl.Tools$4.get(Tools.java:5279)","\tat org.jooq.impl.DefaultDSLContext.transactionResult0(DefaultDSLContext.java:597)","\tat org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:521)","\tat io.airbyte.db.Database.transaction(Database.java:27)","\tat io.airbyte.db.ExceptionWrappingDatabase.transaction(ExceptionWrappingDatabase.java:31)","\tat io.airbyte.config.persistence.StatePersistence.updateOrCreateState(StatePersistence.java:98)","\tat io.airbyte.server.handlers.StateHandler.createOrUpdateState(StateHandler.java:37)","\tat io.airbyte.server.apis.StateApiController.lambda$createOrUpdateState$0(StateApiController.java:30)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:18)","\tat io.airbyte.server.apis.StateApiController.createOrUpdateState(StateApiController.java:30)","\tat io.airbyte.server.apis.$StateApiController$Definition$Exec.dispatch(Unknown Source)","\tat 
io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:378)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.Flux.subscribe(Flux.java:8660)","\tat reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:426)","\tat io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57)","\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172)","\tat io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat java.base/java.util.Optional.ifPresent(Optional.java:178)","\tat io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48)","\tat io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94)","\tat io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161)","\tat 
io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418)","\tat io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122)","\tat io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:483)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:319)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:282)","\tat io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:99)","\tat 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:200)","\tat 
io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837)","\tat io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:273)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494)","\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)","\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)","\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)","\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)","\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)","\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)","\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]}
2023-01-13 23:53:45 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 2 to create or update state
2023-01-13 23:53:45 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 2 to create or update state error: io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":[... identical to the attempt 1 stack trace above ...]}
2023-01-14 00:03:45 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 3 to create or update state
2023-01-14 00:03:45 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 3 to create or update state error: io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":[... identical to the attempt 1 stack trace above ...]}
2023-01-14 00:03:47 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server.
ActivityId = 97afff30-ff77-3e16-9c99-344162ace86c, ActivityType = Persist, WorkflowId=sync_1, WorkflowType=SyncWorkflow, RunId=fc7fbd1d-4a6b-4b84-91db-dcd858907474
io.grpc.StatusRuntimeException: NOT_FOUND: workflow execution already completed
	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.51.1.jar:1.51.1]
	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.51.1.jar:1.51.1]
	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.51.1.jar:1.51.1]
	at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.respondActivityTaskCompleted(WorkflowServiceGrpc.java:3840) ~[temporal-serviceclient-1.17.0.jar:?]
	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.lambda$sendReply$0(ActivityWorker.java:303) ~[temporal-sdk-1.17.0.jar:?]
	at io.temporal.internal.retryer.GrpcRetryer.lambda$retry$0(GrpcRetryer.java:52) ~[temporal-serviceclient-1.17.0.jar:?]
	at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:67) ~[temporal-serviceclient-1.17.0.jar:?]
	at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:60) ~[temporal-serviceclient-1.17.0.jar:?]
	at io.temporal.internal.retryer.GrpcRetryer.retry(GrpcRetryer.java:50) ~[temporal-serviceclient-1.17.0.jar:?]
	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.sendReply(ActivityWorker.java:298) ~[temporal-sdk-1.17.0.jar:?]
	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:252) ~[temporal-sdk-1.17.0.jar:?]
	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
	at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
2023-01-14 00:03:37 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed. errors: $.method: must be a constant value Standard
2023-01-14 00:03:37 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/1/1/logs.log
2023-01-14 00:03:37 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: dev
2023-01-14 00:03:37 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
2023-01-14 00:03:37 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:37 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2023-01-14 00:03:37 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:37 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-postgres:1.0.36 exists...
2023-01-14 00:03:37 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-postgres:1.0.36 was found locally.
2023-01-14 00:03:37 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = source-postgres-check-1-1-levtn with resources io.airbyte.config.ResourceRequirements@2a28db0e[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=] 2023-01-14 00:03:37 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/1 --log-driver none --name source-postgres-check-1-1-levtn --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:1.0.36 -e WORKER_JOB_ATTEMPT=1 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/source-postgres:1.0.36 check --config source_config.json 2023-01-14 00:03:37 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Running source under deployment mode: OSS 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - integration args: {check=null, config=source_config.json} 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Command: CHECK 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'} 2023-01-14 00:03:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):164 - Unknown keyword order - you should define your own Meta Schema. 
If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-01-14 00:03:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):164 - Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-01-14 00:03:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):164 - Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Starting connection with method: NO_TUNNEL 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - using CDC: true 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - HikariPool-1 - Starting... 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - HikariPool-1 - Start completed. 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - using CDC: true 2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Attempting to get metadata from the database to see if we can connect. 
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@2005706991 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot' AND plugin = 'pgoutput' AND database = 'convex'
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Set initial fetch size: 10 rows
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Attempting to find the publication using the query: HikariProxyPreparedStatement@1821010113 wrapping SELECT * FROM pg_publication WHERE pubname = 'airbyte_publication'
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Set initial fetch size: 10 rows
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - HikariPool-1 - Shutdown initiated...
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - HikariPool-1 - Shutdown completed.
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-14 00:03:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):167 - Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-14 00:03:38 INFO i.a.w.g.DefaultCheckConnectionWorker(run):110 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@7741d626[status=succeeded,message=]
2023-01-14 00:03:38 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
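The CHECK above validates two CDC prerequisites: the `airbyte_slot` replication slot (created with the `pgoutput` plugin for the `convex` database) and the `airbyte_publication` publication. A minimal sketch of the same validation, decoupled from any driver (the `run_query` callable and function name are hypothetical helpers, not Airbyte code):

```python
# Sketch of the two CDC prerequisite checks logged above.
# run_query(sql, params) is an assumed helper returning a list of rows,
# e.g. a thin wrapper over a psycopg2 cursor.
SLOT_QUERY = (
    "SELECT * FROM pg_replication_slots "
    "WHERE slot_name = %s AND plugin = 'pgoutput' AND database = %s"
)
PUBLICATION_QUERY = "SELECT * FROM pg_publication WHERE pubname = %s"

def check_cdc_prerequisites(run_query, slot_name, publication, database):
    """Return (slot_ok, publication_ok) for the given replication setup."""
    slot_ok = len(run_query(SLOT_QUERY, (slot_name, database))) > 0
    publication_ok = len(run_query(PUBLICATION_QUERY, (publication,))) > 0
    return slot_ok, publication_ok
```

Both checks returning rows is what lets the connection test above finish with status=succeeded.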
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:38 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/1/1/logs.log
2023-01-14 00:03:38 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: dev
2023-01-14 00:03:38 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-convex:dev exists...
2023-01-14 00:03:38 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-convex:dev was found locally.
2023-01-14 00:03:38 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = destination-convex-check-1-1-theem with resources io.airbyte.config.ResourceRequirements@2a28db0e[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-01-14 00:03:38 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/1 --log-driver none --name destination-convex-check-1-1-theem --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/destination-convex:dev -e WORKER_JOB_ATTEMPT=1 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/destination-convex:dev check --config source_config.json
2023-01-14 00:03:38 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0
2023-01-14 00:03:39 INFO i.a.w.g.DefaultCheckConnectionWorker(run):110 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@7aad4997[status=succeeded,message=]
2023-01-14 00:03:39 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
2023-01-14 00:03:39 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:39 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
2023-01-14 00:03:39 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:47 INFO i.a.w.t.TemporalAttemptExecution(get):136 - Docker volume job log path: /tmp/workspace/1/1/logs.log
2023-01-14 00:03:47 INFO i.a.w.t.TemporalAttemptExecution(get):141 - Executing worker wrapper. Airbyte version: dev
2023-01-14 00:03:47 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to save workflow id for cancellation
2023-01-14 00:03:47 INFO i.a.c.EnvConfigs(getEnvOrDefault):1173 - Using default value for environment variable METRIC_CLIENT: ''
2023-01-14 00:03:47 WARN i.a.m.l.MetricClientFactory(initialize):60 - Metric client is already initialized to
2023-01-14 00:03:47 INFO i.a.w.g.DefaultReplicationWorker(run):150 - start sync worker. job id: 1 attempt id: 1
2023-01-14 00:03:47 INFO i.a.w.g.DefaultReplicationWorker(run):165 - configured sync modes: {public.deployments=incremental - append_dedup, public.projects=incremental - append_dedup, public.teams=incremental - append_dedup, public.prewarmed_instances=incremental - append_dedup}
2023-01-14 00:03:47 INFO i.a.w.i.DefaultAirbyteDestination(start):78 - Running destination...
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START REPLICATION -----
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/destination-convex:dev exists...
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/destination-convex:dev was found locally.
2023-01-14 00:03:47 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = destination-convex-write-1-1-pbtbd with resources io.airbyte.config.ResourceRequirements@22ef9b5a[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-01-14 00:03:47 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/1 --log-driver none --name destination-convex-write-1-1-pbtbd --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/destination-convex:dev -e WORKER_JOB_ATTEMPT=1 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/destination-convex:dev write --config destination_config.json --catalog destination_catalog.json
2023-01-14 00:03:47 INFO i.a.w.i.VersionedAirbyteMessageBufferedWriterFactory(createWriter):33 - Writing messages to protocol version 0.2.0
2023-01-14 00:03:47 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-postgres:1.0.36 exists...
2023-01-14 00:03:47 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-postgres:1.0.36 was found locally.
2023-01-14 00:03:47 INFO i.a.w.p.DockerProcessFactory(create):120 - Creating docker container = source-postgres-read-1-1-xvnyt with resources io.airbyte.config.ResourceRequirements@2ea19ae2[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=]
2023-01-14 00:03:47 INFO i.a.w.p.DockerProcessFactory(create):164 - Preparing command: docker run --rm --init -i -w /data/1/1 --log-driver none --name source-postgres-read-1-1-xvnyt --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e FIELD_SELECTION_WORKSPACES= -e USE_STREAM_CAPABLE_STATE=true -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_CONNECTOR_IMAGE=airbyte/source-postgres:1.0.36 -e WORKER_JOB_ATTEMPT=1 -e AUTO_DETECT_SCHEMA=false -e AIRBYTE_VERSION=dev -e WORKER_JOB_ID=1 airbyte/source-postgres:1.0.36 read --config source_config.json --catalog source_catalog.json
2023-01-14 00:03:47 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):102 - Reading messages from protocol version 0.2.0
2023-01-14 00:03:47 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):273 - Destination output thread started.
2023-01-14 00:03:47 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):335 - Replication thread started.
2023-01-14 00:03:47 INFO i.a.w.g.DefaultReplicationWorker(replicate):245 - Waiting for source and destination threads to complete.
2023-01-14 00:03:48 source > Running source under deployment mode: OSS
2023-01-14 00:03:48 source > Starting source: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-14 00:03:48 source > integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2023-01-14 00:03:48 source > Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
2023-01-14 00:03:48 source > Command: READ
2023-01-14 00:03:48 source > Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2023-01-14 00:03:48 destination > Begin writing to the destination...
2023-01-14 00:03:48 source > Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-14 00:03:48 source > Unknown keyword min - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-14 00:03:48 source > Unknown keyword max - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
2023-01-14 00:03:48 source > Starting connection with method: NO_TUNNEL
2023-01-14 00:03:48 source > using CDC: true
2023-01-14 00:03:48 source > HikariPool-1 - Starting...
2023-01-14 00:03:48 source > HikariPool-1 - Start completed.
2023-01-14 00:03:48 source > using CDC: true
2023-01-14 00:03:48 source > Attempting to get metadata from the database to see if we can connect.
2023-01-14 00:03:48 source > Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@151515831 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot' AND plugin = 'pgoutput' AND database = 'convex'
2023-01-14 00:03:48 source > Set initial fetch size: 10 rows
2023-01-14 00:03:48 source > Attempting to find the publication using the query: HikariProxyPreparedStatement@327840833 wrapping SELECT * FROM pg_publication WHERE pubname = 'airbyte_publication'
2023-01-14 00:03:48 source > Set initial fetch size: 10 rows
2023-01-14 00:03:48 source > HikariPool-1 - Shutdown initiated...
2023-01-14 00:03:48 source > HikariPool-1 - Shutdown completed.
2023-01-14 00:03:48 source > using CDC: true
2023-01-14 00:03:48 source > using CDC: true
2023-01-14 00:03:48 source > Global state manager selected to manage state object with type GLOBAL.
2023-01-14 00:03:48 source > No cursor field set in catalog but not present in state. Stream: public_prewarmed_instances, New Cursor Field: null. Resetting cursor value
2023-01-14 00:03:48 source > No cursor field set in catalog but not present in state. Stream: public_projects, New Cursor Field: null. Resetting cursor value
2023-01-14 00:03:48 source > No cursor field set in catalog but not present in state. Stream: public_deployments, New Cursor Field: null. Resetting cursor value
2023-01-14 00:03:48 source > No cursor field set in catalog but not present in state. Stream: public_teams, New Cursor Field: null. Resetting cursor value
2023-01-14 00:03:48 source > Initialized CDC state with: io.airbyte.integrations.source.relationaldb.models.CdcState@30b9eadd[state=,additionalProperties={}]
2023-01-14 00:03:48 source > HikariPool-2 - Starting...
2023-01-14 00:03:48 source > HikariPool-2 - Start completed.
2023-01-14 00:03:48 source > Checking schema: public
2023-01-14 00:03:48 source > Internal schemas to exclude: [catalog_history, information_schema, pg_catalog, pg_internal]
2023-01-14 00:03:48 source > Set initial fetch size: 10 rows
2023-01-14 00:03:48 source > Max memory limit: 9403629568, JDBC buffer size: 1073741824
2023-01-14 00:03:48 source > Table promotion_redemptions column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_redemptions column promotion_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_redemptions column actor (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_redemptions column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_redemptions column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_redemptions column redemption_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table promotion_redemptions column expiration_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table promotion_redemptions column expired (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table invitations column code (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table invitations column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table invitations column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table invitations column issued (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table invitations column expiration (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table invitations column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table plans column id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plans column friendly_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table members column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table members column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table members column tos_accept_version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table members column tos_accept_timestamp (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table members column auth0_subject (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table members column auth0_email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table current_backend_version_journal column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table current_backend_version_journal column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table current_backend_version_journal column ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-14 00:03:48 source > Table promotions column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotions column start_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table promotions column end_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table promotions column code (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table promotions column entitlement_duration_sec (type int4[10], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotions column team_promotion (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table old_instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table old_instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table old_instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table old_instances column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table old_instances column project_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table old_instances column active (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table old_instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table old_instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table old_instances column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table old_instances column creation_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-14 00:03:48 source > Table deployments column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table deployments column project_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table deployments column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table deployments column state (type deployment_state[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table deployments column dtype (type deployment_type[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table deployments column instance_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table deployments column creation_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-14 00:03:48 source > Table deployments column need_backend_info_refresh (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table plan_subscription_log column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table plan_subscription_log column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table plan_subscription_log column plan_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_subscription_log column ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table plan_subscription_log column stripe_subscription_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_subscription_log column stripe_customer_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_subscription_log column stripe_event_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_subscription_log column reason (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlement_grants column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table entitlement_grants column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlement_grants column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table entitlement_grants column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table entitlement_grants column creation_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table entitlement_grants column expiration_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table entitlement_grants column expired (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table entitlement_grants column value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlement_grants column operator (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlement_grants column reason (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table promotion_entitlements column promotion_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table promotion_entitlements column entitlement_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table promotion_entitlements column entitlement_value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table teams column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table teams column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table teams column slug (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table teams column creator (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table deprecated_beta_keys column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table deprecated_beta_keys column body (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table deprecated_beta_keys column email (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column state (type instance_state[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table instances column modification_ts (type timestamp[29], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_without_timezone})
2023-01-14 00:03:48 source > Table db_clusters column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table db_clusters column db_driver (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table db_clusters column url (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table db_clusters column weight (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table db_clusters column replicas (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table authorized_devices column token (type uuid[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table authorized_devices column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table authorized_devices column device_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table authorized_devices column creation_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table team_member column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table team_member column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table projects column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table projects column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table projects column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table projects column slug (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table projects column deleted (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table plan_entitlements column plan_id (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_entitlements column entitlement_name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table plan_entitlements column entitlement_value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table prewarmed_instances column id (type bigserial[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table prewarmed_instances column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table prewarmed_instances column db_password (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table prewarmed_instances column instance_secret (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table prewarmed_instances column version (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table prewarmed_instances column db_cluster (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlements_snapshot column name (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table entitlements_snapshot column team_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table entitlements_snapshot column member_id (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table entitlements_snapshot column update_ts (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table entitlements_snapshot column value (type jsonb[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table _sqlx_migrations column version (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Table _sqlx_migrations column description (type text[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table _sqlx_migrations column installed_on (type timestamptz[35], nullable false) -> JsonSchemaType({type=string, format=date-time, airbyte_type=timestamp_with_timezone})
2023-01-14 00:03:48 source > Table _sqlx_migrations column success (type bool[1], nullable false) -> JsonSchemaType({type=boolean})
2023-01-14 00:03:48 source > Table _sqlx_migrations column checksum (type bytea[2147483647], nullable false) -> JsonSchemaType({type=string})
2023-01-14 00:03:48 source > Table _sqlx_migrations column execution_time (type int8[19], nullable false) -> JsonSchemaType({type=number, airbyte_type=integer})
2023-01-14 00:03:48 source > Found table: public.promotion_redemptions
2023-01-14 00:03:48 source > Found table: public.invitations
2023-01-14 00:03:48 source > Found table: public.plans
2023-01-14 00:03:48 source > Found table: public.members
2023-01-14 00:03:48 source > Found table: public.current_backend_version_journal
2023-01-14 00:03:48 source > Found table: public.promotions
2023-01-14 00:03:48 source > Found table: public.old_instances
2023-01-14 00:03:48 source > Found table: public.deployments
2023-01-14 00:03:48 source > Found table: public.plan_subscription_log
2023-01-14 00:03:48 source > Found table: public.entitlement_grants
2023-01-14 00:03:48 source > Found table: public.promotion_entitlements
2023-01-14 00:03:48 source > Found table: public.teams
2023-01-14 00:03:48 source > Found table: public.deprecated_beta_keys
2023-01-14 00:03:48 source > Found table: public.instances
2023-01-14 00:03:48 source > Found table: public.db_clusters
2023-01-14 00:03:48 source > Found table: public.authorized_devices
2023-01-14 00:03:48 source > Found table: public.team_member
2023-01-14 00:03:48 source > Found table: public.projects
2023-01-14 00:03:48 source > Found table: public.plan_entitlements
2023-01-14 00:03:48 source > Found table: public.prewarmed_instances
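The column-discovery lines above apply a fixed Postgres-to-JSON-schema type mapping. A condensed, illustrative reconstruction of the mapping observed in this log (the authoritative version lives in the source-postgres connector itself; this sketch covers only the types that appear here):

```python
# Illustrative reconstruction of the Postgres type -> JsonSchemaType mapping
# seen in the discovery log above. Not the connector's actual code.
PG_TO_JSON_SCHEMA = {
    "int8": {"type": "number", "airbyte_type": "integer"},
    "int4": {"type": "number", "airbyte_type": "integer"},
    "bigserial": {"type": "number", "airbyte_type": "integer"},
    "bool": {"type": "boolean"},
    "text": {"type": "string"},
    "uuid": {"type": "string"},
    "jsonb": {"type": "string"},  # JSON values are emitted as serialized strings
    "bytea": {"type": "string"},
    "timestamp": {"type": "string", "format": "date-time",
                  "airbyte_type": "timestamp_without_timezone"},
    "timestamptz": {"type": "string", "format": "date-time",
                    "airbyte_type": "timestamp_with_timezone"},
}

def json_schema_for(pg_type: str) -> dict:
    # User-defined enum types (e.g. deployment_state, instance_state) also
    # map to plain strings in the log, so string is a reasonable fallback.
    return PG_TO_JSON_SCHEMA.get(pg_type, {"type": "string"})
```

For example, `timestamptz` columns like `redemption_ts` become `{type=string, format=date-time, airbyte_type=timestamp_with_timezone}`, matching the lines above.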
2023-01-14 00:03:48 source > Found table: public.entitlements_snapshot 2023-01-14 00:03:48 source > Found table: public._sqlx_migrations 2023-01-14 00:03:48 source > using CDC: true 2023-01-14 00:03:48 source > Set initial fetch size: 10 rows 2023-01-14 00:03:48 source > For CDC, only tables in publication airbyte_publication will be included in the sync: [public.prewarmed_instances, public.projects, public.teams, public.deployments] 2023-01-14 00:03:48 source > using CDC: true 2023-01-14 00:03:48 source > First record waiting time: 300 seconds 2023-01-14 00:03:48 source > First record waiting time: 300 seconds 2023-01-14 00:03:48 source > Should flush after sync: true 2023-01-14 00:03:48 source > StandaloneConfig values: access.control.allow.methods = access.control.allow.origin = admin.listeners = null bootstrap.servers = [localhost:9092] client.dns.lookup = use_all_dns_ips config.providers = [] connector.client.config.override.policy = All header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter key.converter = class org.apache.kafka.connect.json.JsonConverter listeners = [http://:8083] metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 offset.flush.interval.ms = 1000 offset.flush.timeout.ms = 5000 offset.storage.file.filename = /tmp/cdc-state-offset606042491209816197/offset.dat plugin.path = null response.http.headers.config = rest.advertised.host.name = null rest.advertised.listener = null rest.advertised.port = null rest.extension.classes = [] ssl.cipher.suites = null ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null 
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    task.shutdown.graceful.timeout.ms = 5000
    topic.creation.enable = true
    topic.tracking.allow.reset = true
    topic.tracking.enable = true
    value.converter = class org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:48 source > Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results.
2023-01-14 00:03:48 source > Starting FileOffsetBackingStore with file /tmp/cdc-state-offset606042491209816197/offset.dat
2023-01-14 00:03:48 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
2023-01-14 00:03:48 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-14 00:03:48 source > No previous offsets found
2023-01-14 00:03:48 source > Closing offsetStorageReader and fileOffsetBackingStore
2023-01-14 00:03:48 source > Stopped FileOffsetBackingStore
2023-01-14 00:03:48 source > Attempting to find the named replication slot using the query: HikariProxyPreparedStatement@1536478396 wrapping SELECT * FROM pg_replication_slots WHERE slot_name = 'airbyte_slot' AND plugin = 'pgoutput' AND database = 'convex'
2023-01-14 00:03:48 source > Set initial fetch size: 10 rows
2023-01-14 00:03:48 source > Should flush after sync: true
2023-01-14 00:03:49 source > identified target lsn: PgLsn{lsn=24248789040}
2023-01-14 00:03:49 source > Should flush after sync: true
2023-01-14 00:03:49 source > Using CDC: true
2023-01-14 00:03:49 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = true
2023-01-14 00:03:49 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-14 00:03:49 source > EmbeddedConfig values:
    access.control.allow.methods =
    access.control.allow.origin =
    admin.listeners = null
    bootstrap.servers = [localhost:9092]
    client.dns.lookup = use_all_dns_ips
    config.providers = []
    connector.client.config.override.policy = All
    header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    listeners = [http://:8083]
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 1000
    offset.flush.timeout.ms = 5000
    offset.storage.file.filename = /tmp/cdc-state-offset16632706261773842656/offset.dat
    offset.storage.partitions = null
    offset.storage.replication.factor = null
    offset.storage.topic =
    plugin.path = null
    response.http.headers.config =
    rest.advertised.host.name = null
    rest.advertised.listener = null
    rest.advertised.port = null
    rest.extension.classes = []
    ssl.cipher.suites = null
    ssl.client.auth = none
    ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
    ssl.endpoint.identification.algorithm = https
    ssl.engine.factory.class = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.certificate.chain = null
    ssl.keystore.key = null
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.3
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.certificates = null
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    task.shutdown.graceful.timeout.ms = 5000
    topic.creation.enable = true
    topic.tracking.allow.reset = true
    topic.tracking.enable = true
    value.converter = class org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:49 source > The worker has been configured with one or more internal converter properties ([internal.key.converter, internal.value.converter]). Support for these properties was deprecated in version 2.0 and removed in version 3.0, and specifying them will have no effect. Instead, an instance of the JsonConverter with schemas.enable set to false will be used. For more information, please visit http://kafka.apache.org/documentation/#upgrade and consult the upgrade notes for the 3.0 release.
2023-01-14 00:03:49 source > Variables cannot be used in the 'plugin.path' property, since the property is used by plugin scanning before the config providers that replace the variables are initialized. The raw value 'null' was used for plugin scanning, as opposed to the transformed value 'null', and this may cause unexpected results.
2023-01-14 00:03:49 source > JsonConverterConfig values:
    converter.type = key
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-14 00:03:49 source > JsonConverterConfig values:
    converter.type = value
    decimal.format = BASE64
    schemas.cache.size = 1000
    schemas.enable = false
2023-01-14 00:03:49 source > Starting FileOffsetBackingStore with file /tmp/cdc-state-offset16632706261773842656/offset.dat
2023-01-14 00:03:49 source > Configuration property 'truncate.handling.mode' is deprecated and will be removed soon. If you wish to retain skipped truncate functionality, please configure 'skipped.operations' with "t".
2023-01-14 00:03:49 source > Property 'flush.lsn.source' is set to 'false', the LSN will not be flushed to the database source and WAL logs will not be cleared. User is expected to handle this outside Debezium.
2023-01-14 00:03:49 source > Configuration property 'truncate.handling.mode' is deprecated and will be removed in future versions. Please use 'skipped.operations' instead.
2023-01-14 00:03:49 source > Configuration property 'toasted.value.placeholder' is deprecated and will be removed in future versions. Please use 'unavailable.value.placeholder' instead.
2023-01-14 00:03:49 source > Starting PostgresConnectorTask with configuration:
2023-01-14 00:03:49 source >    connector.class = io.debezium.connector.postgresql.PostgresConnector
2023-01-14 00:03:49 source >    max.queue.size = 8192
2023-01-14 00:03:49 source >    slot.name = airbyte_slot
2023-01-14 00:03:49 source >    publication.name = airbyte_publication
2023-01-14 00:03:49 source >    offset.storage.file.filename = /tmp/cdc-state-offset16632706261773842656/offset.dat
2023-01-14 00:03:49 source >    decimal.handling.mode = string
2023-01-14 00:03:49 source >    flush.lsn.source = false
2023-01-14 00:03:49 source >    converters = datetime
2023-01-14 00:03:49 source >    datetime.type = io.airbyte.integrations.debezium.internals.PostgresConverter
2023-01-14 00:03:49 source >    value.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:49 source >    key.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:49 source >    publication.autocreate.mode = disabled
2023-01-14 00:03:49 source >    database.user = emmaling
2023-01-14 00:03:49 source >    database.dbname = convex
2023-01-14 00:03:49 source >    offset.storage = org.apache.kafka.connect.storage.FileOffsetBackingStore
2023-01-14 00:03:49 source >    database.server.name = convex
2023-01-14 00:03:49 source >    offset.flush.timeout.ms = 5000
2023-01-14 00:03:49 source >    heartbeat.interval.ms = 10000
2023-01-14 00:03:49 source >    column.include.list = \Qpublic.deployments\E\.(\Qid\E|\Qdtype\E|\Qstate\E|\Qcreator\E|\Qproject_id\E|\Q_ab_cdc_lsn\E|\Qcreation_ts\E|\Qinstance_name\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E|\Qneed_backend_info_refresh\E),\Qpublic.teams\E\.(\Qid\E|\Qname\E|\Qslug\E|\Qcreator\E|\Q_ab_cdc_lsn\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E),\Qpublic.projects\E\.(\Qid\E|\Qname\E|\Qslug\E|\Qdeleted\E|\Qteam_id\E|\Q_ab_cdc_lsn\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E),\Qpublic.prewarmed_instances\E\.(\Qid\E|\Qname\E|\Qversion\E|\Qdb_cluster\E|\Q_ab_cdc_lsn\E|\Qdb_password\E|\Qinstance_secret\E|\Q_ab_cdc_deleted_at\E|\Q_ab_cdc_updated_at\E)
2023-01-14 00:03:49 source >    plugin.name = pgoutput
2023-01-14 00:03:49 source >    database.port = 5432
2023-01-14 00:03:49 source >    offset.flush.interval.ms = 1000
2023-01-14 00:03:49 source >    key.converter.schemas.enable = false
2023-01-14 00:03:49 source >    internal.key.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:49 source >    include.unknown.datatypes = true
2023-01-14 00:03:49 source >    database.hostname = host.docker.internal
2023-01-14 00:03:49 source >    name = convex
2023-01-14 00:03:49 source >    value.converter.schemas.enable = false
2023-01-14 00:03:49 source >    internal.value.converter = org.apache.kafka.connect.json.JsonConverter
2023-01-14 00:03:49 source >    max.batch.size = 2048
2023-01-14 00:03:49 source >    table.include.list = \Qpublic.deployments\E,\Qpublic.teams\E,\Qpublic.projects\E,\Qpublic.prewarmed_instances\E
2023-01-14 00:03:49 source >    snapshot.mode = initial
2023-01-14 00:03:49 source > Connection gracefully closed
2023-01-14 00:03:49 source > No previous offsets found
2023-01-14 00:03:49 source > user 'emmaling' connected to database 'convex' on PostgreSQL 14.6 (Homebrew) on aarch64-apple-darwin22.1.0, compiled by Apple clang version 14.0.0 (clang-1400.0.29.202), 64-bit with roles:
    role 'impolite_fish_831' [superuser: false, replication: false, inherit: true, create role: false, create db: false,
can log in: true]
    role 'peaceful_horse_578' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_knowing_partridge_253' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_formal_llama_678' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_outgoing_chimpanzee_418' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'fleet_eagle_631' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_brave_cormorant_724' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'adventurous_heron_271' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_little_dunlin_209' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'secret_hare_976' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'judicious_lapwing_404' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'clear_kookabura_918' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_coordinated_goose_728' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'third_sheep_819' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'merry_tiger_926' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_fine_flamingo_545' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'optimistic_quelea_685' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'lazy_lobster_517' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'enchanted_pony_474' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'pg_read_all_stats' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]
    role 'kindred_leopard_137' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_zany_monkey_512' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'helpful_dunlin_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'late_louse_209' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_jaded_caribou_306' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'lovable_hawk_746' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'agile_quail_902' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'false_gazelle_324' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_nautical_antelope_43' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'marvelous_reindeer_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_imperfect_cassowary_499' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_flippant_capybara_855' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'innocent_worm_737' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'uncommon_crow_369' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'wandering_bee_475' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'fiery_kangaroo_814' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'superb_spider_100' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_colorless_finch_234' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_woozy_snail_615' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_giddy_crow_472' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'fortunate_elk_646' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'standing_echidna_736' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_brave_beaver_478' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_cagey_turtle_651' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'dapper_grasshopper_973' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'fortunate_curlew_905' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'jovial_shrew_827' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'hushed_mandrill_786' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_dashing_antelope_221' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'formal_kookabura_682' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'bizarre_coyote_440' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'judicious_llama_288' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'beloved_porcupine_444' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'outgoing_leopard_800' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_avid_nightingale_29' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_canny_cobra_785' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_keen_gnu_542' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'glad_pig_33' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_third_wolverine_683' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'uncommon_elephant_829' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'abundant_hedgehog_159' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'standing_capybara_955' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'famous_scorpion_641' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_next_guanaco_86' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_tangible_kingfisher_906' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_tangible_dunlin_929' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'rosy_zebra_28' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_rapid_lobster_434' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'obsolete_gnu_171' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'charming_porpoise_845' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'trustworthy_chicken_266' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'standing_frog_145' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'pg_read_all_data' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: false]
    role 'test_shocking_bat_673' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_avid_raven_375' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'mild_leopard_957' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_resolute_mouse_303' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'bleak_dragonfly_28' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'vivid_sandpiper_368' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_kindly_crab_276' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_tricky_gnat_993' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_quick_dotterel_122' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'whimsical_otter_157' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'dynamic_hamster_90' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'charming_sheep_976' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'knowing_capybara_249' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_descriptive_coyote_8' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'lovable_mouse_933' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'nautical_meerkat_841' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'bright_chinchilla_839' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_admired_shark_340' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'intent_tarsier_75' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_naive_porcupine_95' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_robust_chinchilla_208' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_childlike_wolverine_269_dev' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'uncommon_herring_177' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'quiet_seahorse_147' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'doubtful_goshawk_423' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'silent_mallard_368' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'compassionate_albatross_711' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_vibrant_anteater_455' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'brainy_boar_354' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'pastel_woodpecker_988' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'canny_dragonfly_581' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'superb_albatross_558' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'reminiscent_lobster_88' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_disagreeable_stingray_78' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_confused_quail_841' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_naive_rat_268' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_different_chicken_55_dev' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_opulent_giraffe_226' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'dusty_meerkat_32' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_shiny_chough_534' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'test_woozy_eagle_243' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'upbeat_locust_160' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'mild_hawk_285' [superuser: false, replication: false, inherit: true, create role: false, create db: false, can log in: true]
    role 'focused_lark_318' [superuser: false, replicat…
2023-01-14 00:03:49 source > Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{5/A3E3A208}, catalogXmin=111632588]
2023-01-14 00:03:49 source > No previous offset found
2023-01-14 00:03:49 source > Taking initial snapshot for new datasource
2023-01-14 00:03:49 source > Requested thread factory for connector PostgresConnector, id = convex named = change-event-source-coordinator
2023-01-14 00:03:49 source > Creating thread debezium-postgresconnector-convex-change-event-source-coordinator
2023-01-14 00:03:49 source > Metrics registered
2023-01-14 00:03:49 source > Context created
2023-01-14 00:03:49 source > Taking initial snapshot for new datasource
2023-01-14 00:03:49 source > According to the connector configuration data will be snapshotted
2023-01-14 00:03:49 source > Snapshot step 1 - Preparing
2023-01-14 00:03:49 source > Snapshot step 2 - Determining captured tables
2023-01-14 00:03:49 source > Adding table public.deployments to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.promotion_redemptions to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.promotions to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.promotion_entitlements to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.plan_entitlements to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.old_instances to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public._sqlx_migrations to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.invitations to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.prewarmed_instances to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.team_member to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.authorized_devices to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.entitlement_grants to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.projects to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.deprecated_beta_keys to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.members to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.current_backend_version_journal to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.entitlements_snapshot to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.instances to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.db_clusters to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.teams to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.plan_subscription_log to the list of capture schema tables
2023-01-14 00:03:49 source > Adding table public.plans to the list of capture schema tables
2023-01-14 00:03:49 source > Snapshot step 3 - Locking captured tables [public.deployments, public.teams, public.projects, public.prewarmed_instances]
2023-01-14 00:03:49 source > Snapshot step 4 - Determining snapshot offset
2023-01-14 00:03:49 source > Creating initial offset context
2023-01-14 00:03:49 source > Read xlogStart at 'LSN{5/A5572830}' from transaction '111644811'
2023-01-14 00:03:49 source > Read xlogStart at 'LSN{5/A5572830}' from transaction '111644811'
2023-01-14 00:03:49 source > Snapshot step 5 - Reading structure of captured tables
2023-01-14 00:03:49 source > Reading structure of schema 'public' of catalog 'convex'
2023-01-14 00:03:49 source > Snapshot step 6 - Persisting schema history
2023-01-14 00:03:49 source > Snapshot step 7 - Snapshotting data
2023-01-14 00:03:49 source > Snapshotting contents of 4 tables while still in transaction
2023-01-14 00:03:49 source > Exporting data from table 'public.deployments' (1 of 4 tables)
2023-01-14 00:03:49 source > For table 'public.deployments' using select statement: 'SELECT "id", "project_id", "creator", "state", "dtype", "instance_name", "creation_ts", "need_backend_info_refresh" FROM "public"."deployments"'
2023-01-14 00:03:49 source > Finished exporting 5 records for table 'public.deployments'; total duration '00:00:00.018'
2023-01-14 00:03:49 source > Exporting data from table 'public.teams' (2 of 4 tables)
2023-01-14 00:03:49 source > For table 'public.teams' using select statement: 'SELECT "id", "name", "slug", "creator" FROM "public"."teams"'
2023-01-14 00:03:49 source > Finished exporting 1 records for table 'public.teams'; total duration '00:00:00.014'
2023-01-14 00:03:49 source > Exporting data from table 'public.projects' (3 of 4 tables)
2023-01-14 00:03:49 source > For table 'public.projects' using select statement: 'SELECT "id", "name", "team_id", "slug", "deleted" FROM "public"."projects"'
2023-01-14 00:03:49 source > Finished exporting 5 records for table 'public.projects'; total duration '00:00:00.014'
2023-01-14 00:03:49 source > Exporting data from table 'public.prewarmed_instances' (4 of 4 tables)
2023-01-14 00:03:49 source > For table 'public.prewarmed_instances' using select statement: 'SELECT "id", "name", "db_password", "instance_secret", "version", "db_cluster" FROM "public"."prewarmed_instances"'
2023-01-14 00:03:49 source > Finished exporting 0 records for table 'public.prewarmed_instances'; total duration '00:00:00.003'
2023-01-14 00:03:49 source > Snapshot - Final stage
2023-01-14 00:03:49 source > Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.postgresql.Source:STRUCT}, sourceInfo=source_info[server='convex'db='convex', lsn=LSN{5/A5572830}, txId=111644811, timestamp=2023-01-14T00:03:49.184412Z, snapshot=FALSE, schema=public, table=projects], lastSnapshotRecord=true, lastCompletelyProcessedLsn=null, lastCommitLsn=null, streamingStoppingLsn=null, transactionContext=TransactionContext [currentTransactionId=null, perTableEventCount={}, totalEventCount=0], incrementalSnapshotContext=IncrementalSnapshotContext [windowOpened=false, chunkEndPosition=null, dataCollectionsToSnapshot=[], lastEventKeySent=null, maximumKey=null]]]
2023-01-14 00:03:49 source > Connected metrics set to 'true'
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.deployments' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.projects' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.prewarmed_instances' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.teams' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > Starting streaming
2023-01-14 00:03:49 source > Retrieved latest position from stored offset 'LSN{5/A5572830}'
2023-01-14 00:03:49 source > Looking for WAL restart position for last commit LSN 'null' and last change LSN 'LSN{5/A5572830}'
2023-01-14 00:03:49 source > Initializing PgOutput logical decoder publication
2023-01-14 00:03:49 source > Obtained valid replication slot ReplicationSlot [active=false, latestFlushedLsn=LSN{5/A3E3A208}, catalogXmin=111632588]
2023-01-14 00:03:49 source > Connection gracefully closed
2023-01-14 00:03:49 source > Requested thread factory for connector PostgresConnector, id = convex named = keep-alive
2023-01-14 00:03:49 source > Creating thread debezium-postgresconnector-convex-keep-alive
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.deployments' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.projects' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.prewarmed_instances' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > REPLICA IDENTITY for 'public.teams' is 'DEFAULT'; UPDATE and DELETE events will contain previous values only for PK columns
2023-01-14 00:03:49 source > Searching for WAL resume position
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2022-12-19T22:43:41.719905
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-06T18:20:09.594287
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T20:56:14.138516
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-10T21:01:21.983729
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-14 00:03:49 ERROR c.n.s.DateTimeValidator(tryParse):82 - Invalid date-time: No timezone information: 2023-01-12T00:20:51.605565
2023-01-14 00:03:49 source > Signalling close because Snapshot is complete
2023-01-14 00:03:49 source > Closing: Change event reached target position
2023-01-14 00:03:49 source > Stopping the embedded engine
2023-01-14 00:03:49 source > Waiting for PT5M for connector to stop
2023-01-14 00:03:50 source > Stopping the task and engine
2023-01-14 00:03:50 source > Stopping down connector
2023-01-14 00:03:50 source > WAL resume position 'null' discovered
2023-01-14 00:03:50 source > Connection gracefully closed
2023-01-14 00:03:50 source > Connection gracefully closed
2023-01-14 00:03:50 source > Initializing PgOutput logical decoder publication
2023-01-14 00:03:50 source > Requested thread factory for connector PostgresConnector, id = convex named = keep-alive
2023-01-14 00:03:50 source > Creating thread debezium-postgresconnector-convex-keep-alive
2023-01-14 00:03:50 source > Processing messages
2023-01-14 00:03:50 source > Connection gracefully closed
2023-01-14 00:03:50 source > Connection gracefully closed
2023-01-14 00:03:50 source > Finished streaming
2023-01-14 00:03:50 source > Connected metrics set to 'false'
2023-01-14 00:03:50 source > Stopped FileOffsetBackingStore
2023-01-14 00:03:50 source > Debezium engine shutdown.
2023-01-14 00:03:50 source > Closing: Heartbeat indicates sync is done 2023-01-14 00:03:50 source > Closing: Iterator closing 2023-01-14 00:03:50 source > debezium state: {"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248789040,\"txId\":111644811,\"ts_usec\":1673654629184412}"} 2023-01-14 00:03:50 source > Closing database connection pool. 2023-01-14 00:03:50 source > HikariPool-2 - Shutdown initiated... 2023-01-14 00:03:50 source > HikariPool-2 - Shutdown completed. 2023-01-14 00:03:50 source > Closed database connection pool. 2023-01-14 00:03:50 source > Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource 2023-01-14 00:03:50 source > Completed source: io.airbyte.integrations.base.ssh.SshWrappedSource 2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):383 - Source has no more messages, closing connection. 2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):392 - Total records read: 12 (2 KB) 2023-01-14 00:03:50 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_projects. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2023-01-14 00:03:50 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_deployments. Error messages: [$.creation_ts is of an incorrect type. Expected it to be date-time, $._ab_cdc_deleted_at is of an incorrect type. Expected it to be string] 2023-01-14 00:03:50 WARN i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$5):395 - Schema validation errors found for stream public_teams. Error messages: [$._ab_cdc_deleted_at is of an incorrect type. 
Expected it to be string] 2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(replicate):250 - One of source or destination thread complete. Waiting on the other. 2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromDstRunnable$4):284 - State in DefaultReplicationWorker from destination: io.airbyte.protocol.models.AirbyteMessage@5f99ef94[type=STATE,log=,spec=,connectionStatus=,catalog=,record=,state=io.airbyte.protocol.models.AirbyteStateMessage@515e0a17[type=GLOBAL,stream=,global=,data={"cdc":true,"cdc_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248789040,\"txId\":111644811,\"ts_usec\":1673654629184412}"}},"streams":[{"stream_name":"deployments","stream_namespace":"public","cursor_field":[]},{"stream_name":"prewarmed_instances","stream_namespace":"public","cursor_field":[]},{"stream_name":"projects","stream_namespace":"public","cursor_field":[]},{"stream_name":"teams","stream_namespace":"public","cursor_field":[]}]},additionalProperties={global_={shared_state={state={{"schema":null,"payload":["convex",{"server":"convex"}]}={"transaction_id":null,"lsn":24248789040,"txId":111644811,"ts_usec":1673654629184412}}}, stream_states=[{stream_descriptor={name=deployments, namespace=public}, stream_state={stream_namespace=public, stream_name=deployments, cursor_field=[]}}, {stream_descriptor={name=prewarmed_instances, namespace=public}, stream_state={stream_namespace=public, stream_name=prewarmed_instances, cursor_field=[]}}, {stream_descriptor={name=projects, namespace=public}, stream_state={stream_namespace=public, stream_name=projects, cursor_field=[]}}, {stream_descriptor={name=teams, namespace=public}, stream_state={stream_namespace=public, stream_name=teams, cursor_field=[]}}]}}],trace=,control=,additionalProperties={}] 2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):226 - The message tracker encountered an issue that prevents 
committed record counts from being reliably computed. 2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):227 - This only impacts metadata and does not indicate a problem with actual sync data. 2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):228 - Delta was not stored for state hash 1727035794 io.airbyte.workers.internal.book_keeping.StateDeltaTracker$StateDeltaTrackerException: Delta was not stored for state hash 1727035794 at io.airbyte.workers.internal.book_keeping.StateDeltaTracker.commitStateHash(StateDeltaTracker.java:126) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.handleDestinationEmittedState(AirbyteMessageTracker.java:223) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.acceptFromDestination(AirbyteMessageTracker.java:146) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:286) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?] 2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):237 - The state message tracker was unable to match the destination state message to a corresponding source state message. 2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):238 - This only impacts metrics and does not indicate a problem with actual sync data. 
2023-01-14 00:03:50 WARN i.a.w.i.b.AirbyteMessageTracker(handleDestinationEmittedState):239 - Destination state message cannot be matched to corresponding Source state message. io.airbyte.workers.internal.book_keeping.StateMetricsTracker$StateMetricsTrackerNoStateMatchException: Destination state message cannot be matched to corresponding Source state message. at io.airbyte.workers.internal.book_keeping.StateMetricsTracker.findStartingTimeStampAndRemoveOlderEntries(StateMetricsTracker.java:147) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.internal.book_keeping.StateMetricsTracker.updateStates(StateMetricsTracker.java:82) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.handleDestinationEmittedState(AirbyteMessageTracker.java:234) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.internal.book_keeping.AirbyteMessageTracker.acceptFromDestination(AirbyteMessageTracker.java:146) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromDstRunnable$4(DefaultReplicationWorker.java:286) ~[io.airbyte-airbyte-commons-worker-0.40.28.jar:?] at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?] at java.lang.Thread.run(Thread.java:1589) ~[?:?] 2023-01-14 00:03:50 destination > Writing complete. 2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(replicate):252 - Source and destination threads complete. 
2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):550 - Source output at least one state message
2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):556 - State capture: Updated state to: Optional[io.airbyte.config.State@75bf9dae[state=[{"type":"GLOBAL","global_":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248789040,\"txId\":111644811,\"ts_usec\":1673654629184412}"}},"stream_states":[{"stream_descriptor":{"name":"deployments","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"deployments","cursor_field":[]}},{"stream_descriptor":{"name":"prewarmed_instances","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"prewarmed_instances","cursor_field":[]}},{"stream_descriptor":{"name":"projects","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"projects","cursor_field":[]}},{"stream_descriptor":{"name":"teams","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"teams","cursor_field":[]}}]}}]]]
2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):483 - sync summary: { "status" : "completed", "recordsSynced" : 11, "bytesSynced" : 2480, "startTime" : 1673654627539, "endTime" : 1673654630811, "totalStats" : { "bytesEmitted" : 2480, "destinationStateMessagesEmitted" : 1, "destinationWriteEndTime" : 1673654630810, "destinationWriteStartTime" : 1673654627598, "meanSecondsBeforeSourceStateMessageEmitted" : 0, "maxSecondsBeforeSourceStateMessageEmitted" : 0, "recordsEmitted" : 11, "recordsCommitted" : 11, "replicationEndTime" : 1673654630811, "replicationStartTime" : 1673654627539, "sourceReadEndTime" : 1673654630648, "sourceReadStartTime" : 1673654627567, "sourceStateMessagesEmitted" : 1 }, "streamStats" : [ { "streamName" : "deployments", "stats" : { "bytesEmitted" : 1407, "recordsEmitted" : 5, "recordsCommitted" : 5 } }, { "streamName" : "teams", "stats" : { "bytesEmitted" : 167, "recordsEmitted" : 1, "recordsCommitted" : 1 } }, { "streamName" : "projects", "stats" : { "bytesEmitted" : 906, "recordsEmitted" : 5, "recordsCommitted" : 5 } } ] }
2023-01-14 00:03:50 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):484 - failures: [ ]
2023-01-14 00:03:50 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
2023-01-14 00:03:50 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:50 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END REPLICATION -----
2023-01-14 00:03:50 INFO i.a.c.i.LineGobbler(voidCall):114 -
2023-01-14 00:03:50 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):209 - sync summary: io.airbyte.config.StandardSyncOutput@d5d6933[standardSyncSummary=io.airbyte.config.StandardSyncSummary@4a32f30c[status=completed,recordsSynced=11,bytesSynced=2480,startTime=1673654627539,endTime=1673654630811,totalStats=io.airbyte.config.SyncStats@3abfab5d[bytesEmitted=2480,destinationStateMessagesEmitted=1,destinationWriteEndTime=1673654630810,destinationWriteStartTime=1673654627598,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBeforeSourceStateMessageEmitted=0,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=11,recordsCommitted=11,replicationEndTime=1673654630811,replicationStartTime=1673654627539,sourceReadEndTime=1673654630648,sourceReadStartTime=1673654627567,sourceStateMessagesEmitted=1,additionalProperties={}],streamStats=[io.airbyte.config.StreamSyncStats@44f1b06d[streamName=deployments,streamNamespace=,stats=io.airbyte.config.SyncStats@22035d31[bytesEmitted=1407,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=5,recordsCommitted=5,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]], io.airbyte.config.StreamSyncStats@7f7470ee[streamName=teams,streamNamespace=,stats=io.airbyte.config.SyncStats@38d6d91e[bytesEmitted=167,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=1,recordsCommitted=1,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]], io.airbyte.config.StreamSyncStats@716d74b1[streamName=projects,streamNamespace=,stats=io.airbyte.config.SyncStats@493f5578[bytesEmitted=906,destinationStateMessagesEmitted=,destinationWriteEndTime=,destinationWriteStartTime=,estimatedBytes=,estimatedRecords=,meanSecondsBeforeSourceStateMessageEmitted=,maxSecondsBeforeSourceStateMessageEmitted=,maxSecondsBetweenStateMessageEmittedandCommitted=,meanSecondsBetweenStateMessageEmittedandCommitted=,recordsEmitted=5,recordsCommitted=5,replicationEndTime=,replicationStartTime=,sourceReadEndTime=,sourceReadStartTime=,sourceStateMessagesEmitted=,additionalProperties={}]]]],normalizationSummary=,webhookOperationSummary=,state=io.airbyte.config.State@75bf9dae[state=[{"type":"GLOBAL","global_":{"shared_state":{"state":{"{\"schema\":null,\"payload\":[\"convex\",{\"server\":\"convex\"}]}":"{\"transaction_id\":null,\"lsn\":24248789040,\"txId\":111644811,\"ts_usec\":1673654629184412}"}},"stream_states":[{"stream_descriptor":{"name":"deployments","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"deployments","cursor_field":[]}},{"stream_descriptor":{"name":"prewarmed_instances","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"prewarmed_instances","cursor_field":[]}},{"stream_descriptor":{"name":"projects","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"projects","cursor_field":[]}},{"stream_descriptor":{"name":"teams","namespace":"public"},"stream_state":{"stream_namespace":"public","stream_name":"teams","cursor_field":[]}}]}}]],outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@1510afb1[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@1b808a24[stream=io.airbyte.protocol.models.AirbyteStream@1d2bd19[name=deployments,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"dtype":{"type":"string"},"state":{"type":"string"},"creator":{"type":"number","airbyte_type":"integer"},"project_id":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"creation_ts":{"type":"string","format":"date-time","airbyte_type":"timestamp_without_timezone"},"instance_name":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"},"need_backend_info_refresh":{"type":"boolean"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@32f5acce[stream=io.airbyte.protocol.models.AirbyteStream@2288e006[name=teams,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"slug":{"type":"string"},"creator":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@3d469fd9[stream=io.airbyte.protocol.models.AirbyteStream@2968d71d[name=projects,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"slug":{"type":"string"},"deleted":{"type":"boolean"},"team_id":{"type":"number","airbyte_type":"integer"},"_ab_cdc_lsn":{"type":"number"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@36649d22[stream=io.airbyte.protocol.models.AirbyteStream@5389d338[name=prewarmed_instances,jsonSchema={"type":"object","properties":{"id":{"type":"number","airbyte_type":"integer"},"name":{"type":"string"},"version":{"type":"string"},"db_cluster":{"type":"string"},"_ab_cdc_lsn":{"type":"number"},"db_password":{"type":"string"},"instance_secret":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[]]
2023-01-14 00:03:50 INFO i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$3):214 - Sync summary length: 7288
2023-01-14 00:03:50 INFO i.a.c.t.TemporalUtils(withBackgroundHeartbeat):283 - Stopping temporal
heartbeating...
2023-01-14 00:03:50 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to get state
2023-01-14 00:03:50 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to create or update state
2023-01-14 00:03:50 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 0 to create or update state error: io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","\tat io.airbyte.config.persistence.StatePersistence.lambda$updateOrCreateState$1(StatePersistence.java:103)","\tat io.airbyte.db.Database.lambda$transaction$0(Database.java:27)","\tat org.jooq.impl.DefaultDSLContext.lambda$transactionResult0$3(DefaultDSLContext.java:549)","\tat org.jooq.impl.Tools$4$1.block(Tools.java:5282)","\tat java.base/java.util.concurrent.ForkJoinPool.unmanagedBlock(ForkJoinPool.java:3744)","\tat java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3689)","\tat org.jooq.impl.Tools$4.get(Tools.java:5279)","\tat org.jooq.impl.DefaultDSLContext.transactionResult0(DefaultDSLContext.java:597)","\tat org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:521)","\tat io.airbyte.db.Database.transaction(Database.java:27)","\tat io.airbyte.db.ExceptionWrappingDatabase.transaction(ExceptionWrappingDatabase.java:31)","\tat io.airbyte.config.persistence.StatePersistence.updateOrCreateState(StatePersistence.java:98)","\tat io.airbyte.server.handlers.StateHandler.createOrUpdateState(StateHandler.java:37)","\tat io.airbyte.server.apis.StateApiController.lambda$createOrUpdateState$0(StateApiController.java:30)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:18)","\tat io.airbyte.server.apis.StateApiController.createOrUpdateState(StateApiController.java:30)","\tat io.airbyte.server.apis.$StateApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:378)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.Flux.subscribe(Flux.java:8660)","\tat reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:426)","\tat io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57)","\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172)","\tat io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat java.base/java.util.Optional.ifPresent(Optional.java:178)","\tat io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48)","\tat io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94)","\tat io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418)","\tat io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122)","\tat io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:483)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:319)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:282)","\tat io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:99)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:200)","\tat io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837)","\tat io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:273)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494)","\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)","\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)","\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)","\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)","\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)","\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)","\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]}
2023-01-14 00:03:59 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 1 to create or update state
2023-01-14 00:03:59 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 1 to create or update state error:
io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","\tat io.airbyte.config.persistence.StatePersistence.lambda$updateOrCreateState$1(StatePersistence.java:103)","\tat io.airbyte.db.Database.lambda$transaction$0(Database.java:27)","\tat org.jooq.impl.DefaultDSLContext.lambda$transactionResult0$3(DefaultDSLContext.java:549)","\tat org.jooq.impl.Tools$4$1.block(Tools.java:5282)","\tat java.base/java.util.concurrent.ForkJoinPool.unmanagedBlock(ForkJoinPool.java:3744)","\tat java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3689)","\tat org.jooq.impl.Tools$4.get(Tools.java:5279)","\tat org.jooq.impl.DefaultDSLContext.transactionResult0(DefaultDSLContext.java:597)","\tat org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:521)","\tat io.airbyte.db.Database.transaction(Database.java:27)","\tat io.airbyte.db.ExceptionWrappingDatabase.transaction(ExceptionWrappingDatabase.java:31)","\tat io.airbyte.config.persistence.StatePersistence.updateOrCreateState(StatePersistence.java:98)","\tat io.airbyte.server.handlers.StateHandler.createOrUpdateState(StateHandler.java:37)","\tat io.airbyte.server.apis.StateApiController.lambda$createOrUpdateState$0(StateApiController.java:30)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:18)","\tat io.airbyte.server.apis.StateApiController.createOrUpdateState(StateApiController.java:30)","\tat io.airbyte.server.apis.$StateApiController$Definition$Exec.dispatch(Unknown Source)","\tat 
io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:378)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.Flux.subscribe(Flux.java:8660)","\tat reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:426)","\tat io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57)","\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172)","\tat io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat java.base/java.util.Optional.ifPresent(Optional.java:178)","\tat io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48)","\tat io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94)","\tat io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161)","\tat 
io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418)","\tat io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122)","\tat io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:483)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:319)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:282)","\tat io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:99)","\tat 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:200)","\tat 
io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837)","\tat io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:273)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494)","\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)","\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)","\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)","\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)","\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)","\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)","\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]} 2023-01-14 00:04:09 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 2 to create or update state 2023-01-14 00:04:09 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):177 - Attempt 2 to create or update state error: io.airbyte.api.client.invoker.generated.ApiException: createOrUpdateState call failed with: 500 - {"message":"Internal Server Error: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is 
null","exceptionClassName":"java.lang.NullPointerException","exceptionStack":["java.lang.NullPointerException: Cannot invoke \"io.airbyte.protocol.models.AirbyteStateMessage.getGlobal()\" because the return value of \"io.airbyte.config.StateWrapper.getGlobal()\" is null","\tat io.airbyte.config.persistence.StatePersistence.lambda$updateOrCreateState$1(StatePersistence.java:103)","\tat io.airbyte.db.Database.lambda$transaction$0(Database.java:27)","\tat org.jooq.impl.DefaultDSLContext.lambda$transactionResult0$3(DefaultDSLContext.java:549)","\tat org.jooq.impl.Tools$4$1.block(Tools.java:5282)","\tat java.base/java.util.concurrent.ForkJoinPool.unmanagedBlock(ForkJoinPool.java:3744)","\tat java.base/java.util.concurrent.ForkJoinPool.managedBlock(ForkJoinPool.java:3689)","\tat org.jooq.impl.Tools$4.get(Tools.java:5279)","\tat org.jooq.impl.DefaultDSLContext.transactionResult0(DefaultDSLContext.java:597)","\tat org.jooq.impl.DefaultDSLContext.transactionResult(DefaultDSLContext.java:521)","\tat io.airbyte.db.Database.transaction(Database.java:27)","\tat io.airbyte.db.ExceptionWrappingDatabase.transaction(ExceptionWrappingDatabase.java:31)","\tat io.airbyte.config.persistence.StatePersistence.updateOrCreateState(StatePersistence.java:98)","\tat io.airbyte.server.handlers.StateHandler.createOrUpdateState(StateHandler.java:37)","\tat io.airbyte.server.apis.StateApiController.lambda$createOrUpdateState$0(StateApiController.java:30)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:18)","\tat io.airbyte.server.apis.StateApiController.createOrUpdateState(StateApiController.java:30)","\tat io.airbyte.server.apis.$StateApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:378)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat 
io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.Flux.subscribe(Flux.java:8660)","\tat reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:426)","\tat io.micronaut.reactive.reactor.instrument.ReactorSubscriber.onNext(ReactorSubscriber.java:57)","\tat reactor.core.publisher.MonoCreate$DefaultMonoSink.success(MonoCreate.java:172)","\tat io.micronaut.http.server.netty.RoutingInBoundHandler$4.doOnComplete(RoutingInBoundHandler.java:965)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor$1.doOnComplete(JsonContentProcessor.java:136)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat java.base/java.util.Optional.ifPresent(Optional.java:178)","\tat io.micronaut.core.async.processor.SingleThreadedBufferingProcessor.doOnComplete(SingleThreadedBufferingProcessor.java:48)","\tat io.micronaut.jackson.core.parser.JacksonCoreProcessor.doOnComplete(JacksonCoreProcessor.java:94)","\tat io.micronaut.core.async.subscriber.SingleThreadedBufferingSubscriber.onComplete(SingleThreadedBufferingSubscriber.java:71)","\tat io.micronaut.http.server.netty.jackson.JsonContentProcessor.doOnComplete(JsonContentProcessor.java:161)","\tat io.micronaut.core.async.subscriber.CompletionAwareSubscriber.onComplete(CompletionAwareSubscriber.java:79)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessage(HandlerPublisher.java:383)","\tat 
io.micronaut.http.netty.reactive.HandlerPublisher.flushBuffer(HandlerPublisher.java:470)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.publishMessageLater(HandlerPublisher.java:360)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.complete(HandlerPublisher.java:423)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.handlerRemoved(HandlerPublisher.java:418)","\tat io.netty.channel.AbstractChannelHandlerContext.callHandlerRemoved(AbstractChannelHandlerContext.java:1122)","\tat io.netty.channel.DefaultChannelPipeline.callHandlerRemoved0(DefaultChannelPipeline.java:637)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:477)","\tat io.netty.channel.DefaultChannelPipeline.remove(DefaultChannelPipeline.java:423)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.removeHandlerIfActive(HttpStreamsHandler.java:483)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.handleReadHttpContent(HttpStreamsHandler.java:319)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler.channelRead(HttpStreamsHandler.java:282)","\tat io.micronaut.http.netty.stream.HttpStreamsServerHandler.channelRead(HttpStreamsServerHandler.java:134)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.websocketx.extensions.WebSocketServerExtensionHandler.channelRead(WebSocketServerExtensionHandler.java:99)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat 
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)","\tat io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:93)","\tat io.netty.handler.codec.http.HttpServerKeepAliveHandler.channelRead(HttpServerKeepAliveHandler.java:64)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)","\tat io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)","\tat io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)","\tat io.netty.handler.flow.FlowControlHandler.dequeue(FlowControlHandler.java:200)","\tat io.netty.handler.flow.FlowControlHandler.read(FlowControlHandler.java:139)","\tat 
io.netty.channel.AbstractChannelHandlerContext.invokeRead(AbstractChannelHandlerContext.java:837)","\tat io.netty.channel.AbstractChannelHandlerContext.read(AbstractChannelHandlerContext.java:814)","\tat io.micronaut.http.netty.reactive.HandlerPublisher.requestDemand(HandlerPublisher.java:165)","\tat io.micronaut.http.netty.stream.HttpStreamsHandler$2.requestDemand(HttpStreamsHandler.java:273)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.receivedDemand(HandlerPublisher.java:556)","\tat io.micronaut.http.netty.reactive.HandlerPublisher$ChannelSubscription.lambda$request$0(HandlerPublisher.java:494)","\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)","\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)","\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)","\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)","\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)","\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)","\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]}