
Namespace fields in catalog #1993

Closed
wants to merge 7 commits

Conversation

@ChristopheDuong (Contributor) commented on Feb 8, 2021

What

Implements #1921
(blocked/depends on #1934)

How

Add a namespace field to Catalog (optional) and ConfiguredCatalog (mandatory) to specify where the destination should write the final table.
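
For illustration, a minimal sketch of how a destination could use the new field to pick the target schema; the getNamespace() accessor and the fallback behavior are assumptions for this example, not the final generated API:

import io.airbyte.protocol.models.ConfiguredAirbyteStream;

public class NamespaceResolver {

  // Hypothetical helper: write to the stream's namespace when one is set,
  // otherwise fall back to the default schema from the destination config.
  public static String resolveTargetSchema(final ConfiguredAirbyteStream stream, final String defaultSchema) {
    final String namespace = stream.getNamespace(); // assumed accessor for the new (optional) field
    return (namespace == null || namespace.isBlank()) ? defaultSchema : namespace;
  }
}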

Pre-merge Checklist

  • Run integration tests
  • Publish Docker images

Recommended reading order

  1. airbyte-api/src/main/openapi/config.yaml
  2. airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_protocol.yaml
  3. airbyte-server/src/main/java/io/airbyte/server/converters/CatalogConverter.java
  4. the rest

@ChristopheDuong changed the base branch from master to chris/api_catalog on February 8, 2021
@ChristopheDuong mentioned this pull request on Feb 8, 2021
@ChristopheDuong (Contributor, Author) commented on Feb 8, 2021

With this PR:
There is a slight change in the Airbyte Protocol. Should I bump the versions of all connectors to 0.2.0?

As a result of this protocol change, it seems the code base of the core server & destination connectors using the new protocol is no longer backward compatible with 0.1.XX source connectors...

  1. Do we need to publish a migration script that upgrades all 0.1.XX connectors to at least 0.2.0 (at least the destination ones)?
  2. Should we add a version check in the server to warn if any non-compatible sources are still configured and should be updated to at least 0.2.0?
  3. Or iterate further and make this PR backward compatible with the old protocol somehow?
    a. Migrate (downgrade) the new AirbyteCatalog protocol at runtime to the old version so it stays compatible when passed to a 0.1.XX source? (A sketch of this idea follows after the list.)
    b. Separate ConfiguredCatalog into two distinct parts, one for the source and one for the destination, so that only the relevant part is forwarded to the appropriate connector by the SyncWorker?
    c. Another idea?
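
To make option 3.a concrete, here is a rough sketch of what the runtime downgrade could look like; it assumes alias_name and namespace are the only fields a 0.1.XX source would not recognize, and the class and method names are hypothetical:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class CatalogDowngrader {

  // Hypothetical: strip the fields introduced by this PR from a serialized
  // ConfiguredAirbyteCatalog before handing it to a 0.1.XX source, so the old
  // generated models can still deserialize it.
  public static JsonNode downgradeForOldSource(final JsonNode configuredCatalog) {
    final JsonNode copy = configuredCatalog.deepCopy();
    if (copy.has("streams")) {
      for (final JsonNode stream : copy.get("streams")) {
        ((ObjectNode) stream).remove("alias_name");      // new on ConfiguredAirbyteStream
        final JsonNode nested = stream.get("stream");
        if (nested instanceof ObjectNode) {
          ((ObjectNode) nested).remove("namespace");     // new on AirbyteStream
        }
      }
    }
    return copy;
  }
}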

Example of the logs if I try to set up:

  • a 0.1.XX source
  • with my new code base for server and destination in dev / 0.2.0 versions:
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) WorkerRun(call):58 - Executing worker wrapper...
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) DefaultSyncWorker(run):82 - configured sync modes: {public.id_and_name=full_refresh}
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) DefaultAirbyteDestination(start):67 - Running target...
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - Checking if airbyte/destination-postgres:dev exists...
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - airbyte/destination-postgres:dev was found locally.
2021-02-08 18:35:30 DEBUG (/tmp/workspace/8/0) DockerProcessBuilderFactory(create):104 - Preparing command: docker run --rm -i -v airbyte_workspace_dev:/data -v /tmp/airbyte_local_dev:/local -w /data/8/0 --network host airbyte/destination-postgres:dev write --config target_config.json --catalog catalog.json
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - Checking if airbyte/source-postgres:0.1.13 exists...
2021-02-08 18:35:30 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - airbyte/source-postgres:0.1.13 was found locally.
2021-02-08 18:35:30 DEBUG (/tmp/workspace/8/0) DockerProcessBuilderFactory(create):104 - Preparing command: docker run --rm -i -v airbyte_workspace_dev:/data -v /tmp/airbyte_local_dev:/local -w /data/8/0 --network host airbyte/source-postgres:0.1.13 read --config tap_config.json --catalog catalog.json
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:31 INFO i.a.i.d.p.PostgresDestination(main):68 - {} - starting destination: class io.airbyte.integrations.destination.postgres.PostgresDestination
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:31 INFO i.a.i.b.IntegrationRunner(run):78 - {} - Running integration: io.airbyte.integrations.destination.postgres.PostgresDestination
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:31 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {catalog=catalog.json, write=null, config=target_config.json}
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:31 INFO i.a.i.b.IntegrationRunner(run):82 - {} - Command: WRITE
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:31 INFO i.a.i.b.IntegrationRunner(run):83 - {} - Integration config: IntegrationConfig{command=WRITE, configPath='target_config.json', catalogPath='catalog.json', statePath='null'}
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-02-08 18:35:31 INFO i.a.i.s.p.PostgresSource(main):71 - {} - starting source: class io.airbyte.integrations.source.postgres.PostgresSource
2021-02-08 18:35:31 INFO (/tmp/workspace/8/0) DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-02-08 18:35:31 INFO i.a.i.b.IntegrationRunner(run):78 - {} - Running integration: io.airbyte.integrations.source.postgres.PostgresSource
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-02-08 18:35:32 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {read=null, catalog=catalog.json, config=tap_config.json}
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-02-08 18:35:32 INFO i.a.i.b.IntegrationRunner(run):82 - {} - Command: READ
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-02-08 18:35:32 INFO i.a.i.b.IntegrationRunner(run):83 - {} - Integration config: IntegrationConfig{command=READ, configPath='tap_config.json', catalogPath='catalog.json', statePath='null'}
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):118 - {} - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):120 - {} - Buffer creation started for 1 streams.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):123 - {} - Buffer creation for stream public.id_and_name.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):127 - {} - Buffer creation completed.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):90 - {} - Preparing tmp tables in destination started for 1 streams
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):94 - {} - Preparing tmp table in destination started for stream public.id_and_name. schema _airbyte_acceptancetestdb_2a0, tmp table name: _tmp_h7TF1612809332290_public_id_and_name
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - Exception in thread "main" java.lang.IllegalArgumentException: Unrecognized field "alias_name" (class io.airbyte.protocol.models.ConfiguredAirbyteStream), not marked as ignorable (3 known properties: "stream", "cursor_field", "sync_mode"])
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 -  at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: io.airbyte.protocol.models.ConfiguredAirbyteCatalog["streams"]->java.util.ArrayList[0]->io.airbyte.protocol.models.ConfiguredAirbyteStream["alias_name"])
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3938)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:3869)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at io.airbyte.commons.json.Jsons.object(Jsons.java:100)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at io.airbyte.integrations.base.IntegrationRunner.parseConfig(IntegrationRunner.java:146)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:101)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at io.airbyte.integrations.source.postgres.PostgresSource.main(PostgresSource.java:72)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - Caused by: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "alias_name" (class io.airbyte.protocol.models.ConfiguredAirbyteStream), not marked as ignorable (3 known properties: "stream", "cursor_field", "sync_mode"])
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 -  at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: io.airbyte.protocol.models.ConfiguredAirbyteCatalog["streams"]->java.util.ArrayList[0]->io.airbyte.protocol.models.ConfiguredAirbyteStream["alias_name"])
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:843)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1206)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1610)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownVanilla(BeanDeserializerBase.java:1588)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:294)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:286)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:27)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:129)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:288)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:151)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	at com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:3933)
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 	... 5 more
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onStartFunction$1):100 - {} - Preparing tables in destination completed.
2021-02-08 18:35:32 DEBUG (/tmp/workspace/8/0) DefaultAirbyteSource(close):109 - Closing tap process
2021-02-08 18:35:32 DEBUG (/tmp/workspace/8/0) DefaultAirbyteDestination(close):102 - Closing target process
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.b.FailureTrackingConsumer(close):64 - {} - hasFailed: false.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.b.BufferedStreamConsumer(close):164 - {} - executing on success close procedure.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):127 - {} - Finalizing tables in destination started for 1 streams
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):132 - {} - Finalizing stream public.id_and_name. schema _airbyte_acceptancetestdb_2a0, tmp table _tmp_h7TF1612809332290_public_id_and_name, final table public_id_and_name
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):144 - {} - Executing finalization of tables.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):146 - {} - Finalizing tables in destination completed.
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):149 - {} - Cleaning tmp tables in destination started for 1 streams
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):153 - {} - Cleaning tmp table in destination started for stream public.id_and_name. schema _airbyte_acceptancetestdb_2a0, tmp table name: _tmp_h7TF1612809332290_public_id_and_name
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.j.JdbcBufferedConsumerFactory(lambda$onCloseFunction$3):159 - {} - Cleaning tmp tables in destination completed.
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - WARNING: An illegal reflective access operation has occurred
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - WARNING: Illegal reflective access by com.leansoft.bigqueue.page.MappedPageImpl$Cleaner (file:/airbyte/lib/leansoft-bigqueue-0.7.3.jar) to method java.nio.DirectByteBuffer.cleaner()
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - WARNING: Please consider reporting this to the maintainers of com.leansoft.bigqueue.page.MappedPageImpl$Cleaner
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) LineGobbler(voidCall):69 - WARNING: All illegal access operations will be denied in a future release
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.b.IntegrationRunner(run):120 - {} - Completed integration: io.airbyte.integrations.destination.postgres.PostgresDestination
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LineGobbler(voidCall):69 - 2021-02-08 18:35:32 INFO i.a.i.d.p.PostgresDestination(main):70 - {} - completed destination: class io.airbyte.integrations.destination.postgres.PostgresDestination
2021-02-08 18:35:32 ERROR (/tmp/workspace/8/0) DefaultSyncWorker(run):104 - Sync worker failed.
io.airbyte.workers.WorkerException: Tap process wasn't successful
	at io.airbyte.workers.protocols.airbyte.DefaultAirbyteSource.close(DefaultAirbyteSource.java:112) ~[io.airbyte-airbyte-workers-0.14.1-alpha.jar:?]
	at io.airbyte.workers.DefaultSyncWorker.run(DefaultSyncWorker.java:103) [io.airbyte-airbyte-workers-0.14.1-alpha.jar:?]
	at io.airbyte.workers.DefaultSyncWorker.run(DefaultSyncWorker.java:48) [io.airbyte-airbyte-workers-0.14.1-alpha.jar:?]
	at io.airbyte.workers.wrappers.OutputConvertingWorker.run(OutputConvertingWorker.java:44) [io.airbyte-airbyte-workers-0.14.1-alpha.jar:?]
	at io.airbyte.workers.wrappers.JobOutputSyncWorker.run(JobOutputSyncWorker.java:32) [io.airbyte-airbyte-workers-0.14.1-alpha.jar:?]
	at io.airbyte.scheduler.WorkerRun.lambda$new$0(WorkerRun.java:53) [io.airbyte-airbyte-scheduler-0.14.1-alpha.jar:?]
	at io.airbyte.scheduler.WorkerRun.call(WorkerRun.java:61) [io.airbyte-airbyte-scheduler-0.14.1-alpha.jar:?]
	at io.airbyte.scheduler.WorkerRun.call(WorkerRun.java:42) [io.airbyte-airbyte-scheduler-0.14.1-alpha.jar:?]
	at io.airbyte.commons.concurrency.LifecycledCallable.execute(LifecycledCallable.java:114) [io.airbyte-airbyte-commons-0.14.1-alpha.jar:?]
	at io.airbyte.commons.concurrency.LifecycledCallable.call(LifecycledCallable.java:98) [io.airbyte-airbyte-commons-0.14.1-alpha.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
	at java.lang.Thread.run(Thread.java:832) [?:?]
2021-02-08 18:35:32 INFO (/tmp/workspace/8/0) LoggingTrackingClient(track):55 - track. userId: 7823c4bb-a250-40c2-9b9b-2a19c77ddf43 action: Connector Jobs, metadata: {job_type=sync, job_id=8, attempt_id=[], job_uuid=a09a43ff-bd25-38bd-b2cf-328edeb912c8, attempt_uuid=a09a43ff-bd25-38bd-b2cf-328edeb912c8, connection_id=1b51f2a2-a4d1-42d3-ad52-4052041b4e9f, connector_source=Postgres, connector_source_definition_id=decd338e-5647-4c0b-adf4-da0e75f5a750, connector_destination=Postgres, connector_destination_definition_id=25c5221d-dce2-4163-ade9-739ef790f503, frequency=manual, attempt_stage=ENDED, attempt_completion_status=FAILED}

It seems the catalog.json passed to the source (which still speaks the old protocol) during the sync's read phase is rejected because of the new fields intended for destinations.
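
For illustration, a small self-contained sketch of the failure mode: Jackson fails on unknown properties by default, so the 0.1.XX generated models reject the new field; annotating the generated models (or disabling FAIL_ON_UNKNOWN_PROPERTIES on the mapper) would let old connectors ignore fields they do not know. The TolerantStream class and the sample JSON below are stand-ins for this example, not the real generated model:

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class UnknownFieldDemo {

  // Stand-in for the old generated model: it only knows stream, cursor_field
  // and sync_mode. With ignoreUnknown = true it silently drops alias_name
  // instead of throwing UnrecognizedPropertyException.
  @JsonIgnoreProperties(ignoreUnknown = true)
  public static class TolerantStream {
    public JsonNode stream;
    public JsonNode cursor_field;
    public String sync_mode;
  }

  public static void main(final String[] args) throws Exception {
    final String json = "{\"stream\":{\"name\":\"id_and_name\"},\"sync_mode\":\"full_refresh\",\"alias_name\":\"public_id_and_name\"}";
    final TolerantStream parsed = new ObjectMapper().readValue(json, TolerantStream.class);
    System.out.println("Parsed despite unknown field, sync_mode=" + parsed.sync_mode);
  }
}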

@@ -171,6 +176,14 @@ definitions:
type: array
items:
type: string
alias_name:
Contributor:

Why did you decide to drop the destination prefix and go for alias & target? I have a preference for the destination prefix.

Contributor (Author):

No particular reason. I reflected on it and thought it would make it clearer what the field is used for afterward, but we can always revert to destination_name / destination_namespace instead if you think that's better.

@@ -144,6 +144,9 @@ definitions:
type: array
items:
type: string
namespace:
Contributor:

Can you move it close to name, since these two work together?

Contributor:

I wonder if we should create an object that contains both name & namespace, which we could then use for both the stream and the configured stream.

WDYT?

StreamName {
   string: name
   string: namespace
}

Base automatically changed from chris/api_catalog to master February 15, 2021 21:40
@swyxio deleted the chris/namespace-catalog branch on October 12, 2022