Discover worker starts to use API to write schema result #21875

Merged
merged 45 commits into from Feb 6, 2023
Changes from all commits (45 commits)
8fb6b7a
api changes for writing discover catalog
xiaohansong Jan 17, 2023
56606b5
Merge remote-tracking branch 'origin/master' into xiaohan/discover
xiaohansong Jan 17, 2023
bfdaef3
api changes
xiaohansong Jan 18, 2023
a7cf331
format
xiaohansong Jan 18, 2023
e11c17b
worker change 1
xiaohansong Jan 18, 2023
21ec7f1
change return type of the API to return catalogId
xiaohansong Jan 18, 2023
37080ff
worker to call api
xiaohansong Jan 18, 2023
8f5ec25
Merge remote-tracking branch 'origin/master' into xiaohan/disworker
xiaohansong Jan 18, 2023
3b8e233
Merge remote-tracking branch 'origin/master' into xiaohan/disworker
xiaohansong Jan 18, 2023
8700b1e
typo
xiaohansong Jan 18, 2023
26a7b10
🎉 Source GoogleSheets - migrated SAT to strictness level (#21399)
midavadim Jan 18, 2023
a85aef7
🐛Destination-Bigquery: Added an explicit error message if sync fails …
etsybaev Jan 18, 2023
ed2780c
ci-connector-ops: split workflows(#21474)
alafanechere Jan 18, 2023
d4556f1
CI: nightly build alpha sources and destinations (#21562)
alafanechere Jan 18, 2023
d54c842
Revert "Change main class in strict-encrypt destination and bump vers…
grishick Jan 19, 2023
fbf6501
Fixes webhook updating logic (#21519)
YatsukBogdan1 Jan 19, 2023
35c83ff
ci_credentials: disable tooling test run by tox (#21580)
alafanechere Jan 19, 2023
f0aebd8
Revert "CI: nightly build alpha sources and destinations (#21562)" (…
alafanechere Jan 19, 2023
a2b6932
Security update of default docker images (#21407)
vr Jan 19, 2023
6c46b65
📝 add docs for how to add normalization (#21563)
pedroslopez Jan 19, 2023
a2fab11
🪟 🚦 E2E tests: clean up matchers (#20887)
dizel852 Jan 19, 2023
3c3b935
🪟 🎨 [Free connectors] Update modal copy (#21600)
josephkmh Jan 19, 2023
37ebd3a
move start/end time options out of optional block (#21541)
lmossman Jan 19, 2023
05988b8
lingering fix
xiaohansong Jan 19, 2023
66ffbfd
reflecting api changes
xiaohansong Jan 20, 2023
3456a19
test fix
xiaohansong Jan 20, 2023
73bc4cc
Merge remote-tracking branch 'origin/master' into xiaohan/disworker
xiaohansong Jan 20, 2023
6d577c8
Merge remote-tracking branch 'origin/master' into xiaohan/disworker
xiaohansong Jan 25, 2023
cbeefb9
worker to call api to do discover work
xiaohansong Jan 25, 2023
544ebd3
recovered deleted html
xiaohansong Jan 25, 2023
9e5b359
self review
xiaohansong Jan 25, 2023
d2e47a7
more converters refactor
xiaohansong Jan 25, 2023
971982f
fix connector test
xiaohansong Jan 25, 2023
e877991
fix test
xiaohansong Jan 25, 2023
56e383b
fix
xiaohansong Jan 26, 2023
db559a2
fix integration test
xiaohansong Jan 26, 2023
c858453
Merge branch 'master' into xiaohan/disworker
xiaohansong Jan 26, 2023
94296aa
Merge branch 'master' into xiaohan/disworker
xiaohansong Jan 26, 2023
0036de8
add unit test for converter
xiaohansong Jan 27, 2023
a4296c8
static fix
xiaohansong Jan 27, 2023
d998cc2
Merge branch 'master' into xiaohan/disworker
xiaohansong Jan 27, 2023
c6aec13
Merge branch 'master' into xiaohan/disworker
xiaohansong Jan 30, 2023
2851158
Merge remote-tracking branch 'origin/master' into xiaohan/disworker
xiaohansong Jan 30, 2023
036db6c
api client needs to have a timeout in case request does not get respo…
xiaohansong Jan 31, 2023
947baeb
Merge branch 'master' into xiaohan/disworker
xiaohansong Feb 6, 2023
@@ -27,6 +27,7 @@
import java.net.http.HttpClient;
import java.net.http.HttpClient.Version;
import java.security.interfaces.RSAPrivateKey;
import java.time.Duration;
import java.util.Date;
import java.util.concurrent.TimeUnit;
import lombok.extern.slf4j.Slf4j;
@@ -53,6 +54,8 @@ public ApiClient apiClient(
.setPort(parsePort(airbyteApiHost))
.setBasePath("/api")
.setHttpClientBuilder(HttpClient.newBuilder().version(Version.HTTP_1_1))
.setConnectTimeout(Duration.ofSeconds(30))
Contributor:
nit: can we define these as `CONNECT_TIMEOUT_DURATION` and `READ_TIMEOUT_DURATION` constants?

Contributor Author:
I didn't see this comment - I made the fix in the next PR

.setReadTimeout(Duration.ofSeconds(30))
.setRequestInterceptor(builder -> {
builder.setHeader("User-Agent", "WorkerApp");
// internalApiAuthToken is in BeanProvider because we want to create a new token each
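For reference, the constants the reviewer suggested in the thread above might look like the following minimal sketch. The class name is illustrative and the constant names are taken from the nit comment; the actual fix landed in a later PR, so none of this is the code that was merged.

```java
import java.net.http.HttpClient;
import java.net.http.HttpClient.Version;
import java.time.Duration;

// Hypothetical sketch: hoist the inline Duration.ofSeconds(30) values
// into named constants so both timeouts are defined in one place.
public class ApiClientTimeouts {

  static final Duration CONNECT_TIMEOUT_DURATION = Duration.ofSeconds(30);
  static final Duration READ_TIMEOUT_DURATION = Duration.ofSeconds(30);

  public static void main(String[] args) {
    // The builder calls from the diff would then read:
    //   .setConnectTimeout(CONNECT_TIMEOUT_DURATION)
    //   .setReadTimeout(READ_TIMEOUT_DURATION)
    // For a plain java.net.http client, the connect timeout is set like this:
    HttpClient client = HttpClient.newBuilder()
        .version(Version.HTTP_1_1)
        .connectTimeout(CONNECT_TIMEOUT_DURATION)
        .build();
    System.out.println(client.connectTimeout().get().getSeconds()); // prints 30
  }
}
```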
@@ -11,13 +11,15 @@

import com.fasterxml.jackson.databind.JsonNode;
import datadog.trace.api.Trace;
import io.airbyte.api.client.AirbyteApiClient;
import io.airbyte.api.client.model.generated.DiscoverCatalogResult;
import io.airbyte.api.client.model.generated.SourceDiscoverSchemaWriteRequestBody;
import io.airbyte.commons.io.LineGobbler;
import io.airbyte.commons.json.Jsons;
import io.airbyte.config.ConnectorJobOutput;
import io.airbyte.config.ConnectorJobOutput.OutputType;
import io.airbyte.config.FailureReason;
import io.airbyte.config.StandardDiscoverCatalogInput;
import io.airbyte.config.persistence.ConfigRepository;
import io.airbyte.metrics.lib.ApmTraceUtils;
import io.airbyte.protocol.models.AirbyteCatalog;
import io.airbyte.protocol.models.AirbyteControlConnectorConfigMessage;
@@ -26,6 +28,7 @@
import io.airbyte.workers.WorkerConstants;
import io.airbyte.workers.WorkerUtils;
import io.airbyte.workers.exception.WorkerException;
import io.airbyte.workers.helper.CatalogClientConverters;
import io.airbyte.workers.helper.ConnectorConfigUpdater;
import io.airbyte.workers.internal.AirbyteStreamFactory;
import io.airbyte.workers.internal.DefaultAirbyteStreamFactory;
@@ -43,29 +46,28 @@
public class DefaultDiscoverCatalogWorker implements DiscoverCatalogWorker {

private static final Logger LOGGER = LoggerFactory.getLogger(DefaultDiscoverCatalogWorker.class);

private final ConfigRepository configRepository;
private static final String WRITE_DISCOVER_CATALOG_LOGS_TAG = "call to write discover schema result";

private final IntegrationLauncher integrationLauncher;
private final AirbyteStreamFactory streamFactory;
private final ConnectorConfigUpdater connectorConfigUpdater;

private final AirbyteApiClient airbyteApiClient;
private volatile Process process;

public DefaultDiscoverCatalogWorker(final ConfigRepository configRepository,
public DefaultDiscoverCatalogWorker(final AirbyteApiClient airbyteApiClient,
final IntegrationLauncher integrationLauncher,
final ConnectorConfigUpdater connectorConfigUpdater,
final AirbyteStreamFactory streamFactory) {
this.configRepository = configRepository;
this.airbyteApiClient = airbyteApiClient;
this.integrationLauncher = integrationLauncher;
this.streamFactory = streamFactory;
this.connectorConfigUpdater = connectorConfigUpdater;
}

public DefaultDiscoverCatalogWorker(final ConfigRepository configRepository,
public DefaultDiscoverCatalogWorker(final AirbyteApiClient airbyteApiClient,
final IntegrationLauncher integrationLauncher,
final ConnectorConfigUpdater connectorConfigUpdater) {
this(configRepository, integrationLauncher, connectorConfigUpdater, new DefaultAirbyteStreamFactory());
this(airbyteApiClient, integrationLauncher, connectorConfigUpdater, new DefaultAirbyteStreamFactory());
}

@Trace(operationName = WORKER_OPERATION_NAME)
@@ -108,14 +110,11 @@ public ConnectorJobOutput run(final StandardDiscoverCatalogInput discoverSchemaI
}

if (catalog.isPresent()) {
final UUID catalogId =
configRepository.writeActorCatalogFetchEvent(catalog.get(),
// NOTE: sourceId is marked required in the OpenAPI config but the code generator doesn't enforce
// it, so we check again here.
discoverSchemaInput.getSourceId() == null ? null : UUID.fromString(discoverSchemaInput.getSourceId()),
discoverSchemaInput.getConnectorVersion(),
discoverSchemaInput.getConfigHash());
jobOutput.setDiscoverCatalogId(catalogId);
final DiscoverCatalogResult result =
AirbyteApiClient.retryWithJitter(() -> airbyteApiClient.getSourceApi()
.writeDiscoverCatalogResult(buildSourceDiscoverSchemaWriteRequestBody(discoverSchemaInput, catalog.get())),
WRITE_DISCOVER_CATALOG_LOGS_TAG);
jobOutput.setDiscoverCatalogId(result.getCatalogId());
} else if (failureReason.isEmpty()) {
WorkerUtils.throwWorkerException("Integration failed to output a catalog struct and did not output a failure reason", process);
}
@@ -129,6 +128,19 @@ public ConnectorJobOutput run(final StandardDiscoverCatalogInput discoverSchemaI
}
}

private SourceDiscoverSchemaWriteRequestBody buildSourceDiscoverSchemaWriteRequestBody(final StandardDiscoverCatalogInput discoverSchemaInput,
final AirbyteCatalog catalog) {
return new SourceDiscoverSchemaWriteRequestBody().catalog(
CatalogClientConverters.toAirbyteCatalogClientApi(catalog)).sourceId(
// NOTE: sourceId is marked required in the OpenAPI config but the code generator doesn't enforce
// it, so we check again here.
discoverSchemaInput.getSourceId() == null ? null : UUID.fromString(discoverSchemaInput.getSourceId()))
.connectorVersion(
discoverSchemaInput.getConnectorVersion())
.configurationHash(
discoverSchemaInput.getConfigHash());
}

private Map<String, Object> generateTraceTags(final StandardDiscoverCatalogInput discoverSchemaInput, final Path jobRoot) {
final Map<String, Object> tags = new HashMap<>();

@@ -0,0 +1,66 @@
/*
* Copyright (c) 2022 Airbyte, Inc., all rights reserved.
*/

package io.airbyte.workers.helper;

import io.airbyte.commons.enums.Enums;
import io.airbyte.commons.text.Names;
import io.airbyte.protocol.models.AirbyteStream;
import java.util.stream.Collectors;

/**
 * Utilities to convert a protocol Catalog to the Catalog API client model. This class is similar
 * to existing logic in CatalogConverter.java, but the code can't be shared because the protocol
 * model is essentially converted to two different API models. Thus, if we need to change the
 * logic in either place, we have to take care of the other one too.
*/
public class CatalogClientConverters {
Contributor:

@mfsiega-airbyte curious if you have thoughts here on adding more 'catalog conversion' logic to a new module. These conversion functions are new in that they're converting to the client (i.e., 'api.client.model.generated') types as opposed to the API types that the existing CatalogConverter.java handles.

I think the 'airbyte-commons-worker' module probably makes sense for client-specific conversions (since the worker here is after all a client of the API) but I feel like I want to raise a discussion here to make sure we aren't fragmenting messy catalog conversion logic all over the place if we can avoid it. Thoughts?

Contributor Author:

I want to add that I can't put this logic into CatalogConverters because it is defined in airbyte-common-server, which already uses airbyte-common-workers as a dependency; introducing this would cause a dependency loop error.

I was also thinking about moving both pieces of code into a common library so that airbyte-common-server and airbyte-common-worker can both depend on it, but I'm not sure where to put it; plus, it seems the server-side code will never be used by the worker, and the client-side code will never be used by the server.

Contributor:

I think something like airbyte-commons-catalog could make sense. We could even take advantage of a refactor to improve the names (e.g., instead of toApi -> persistenceToApi) and document what all the various conversions mean.

Last but not least I'd really like to refactor the way catalogs are stored and handled: (1) we don't have a distinct "persistence" model, but rather re-use the model from the protocol, which is limiting; (2) we have a loose relationship between discovered source schemas and configured connection catalogs which causes no end of confusion.

All in all I do think having this logic fragmented all over is a bit of a problem, but I think I'd say it already is a problem, and I don't think this PR makes it much worse.

Contributor Author (@xiaohansong, Jan 27, 2023):

I feel this might be out of scope for this PR - I'll create a separate ticket to track this, and we probably should discuss the refactoring goal/plan in a refinement meeting: https://github.com/airbytehq/airbyte/issues/21999

Contributor:

All of the above makes sense to me, thanks for creating that follow-up ticket @xiaohansong!


/**
* Converts a protocol AirbyteCatalog to an OpenAPI client versioned AirbyteCatalog.
*/
public static io.airbyte.api.client.model.generated.AirbyteCatalog toAirbyteCatalogClientApi(
final io.airbyte.protocol.models.AirbyteCatalog catalog) {
return new io.airbyte.api.client.model.generated.AirbyteCatalog()
.streams(catalog.getStreams()
.stream()
.map(stream -> toAirbyteStreamClientApi(stream))
.map(s -> new io.airbyte.api.client.model.generated.AirbyteStreamAndConfiguration()
.stream(s)
.config(generateDefaultConfiguration(s)))
.collect(Collectors.toList()));
}

private static io.airbyte.api.client.model.generated.AirbyteStreamConfiguration generateDefaultConfiguration(
final io.airbyte.api.client.model.generated.AirbyteStream stream) {
final io.airbyte.api.client.model.generated.AirbyteStreamConfiguration result =
new io.airbyte.api.client.model.generated.AirbyteStreamConfiguration()
.aliasName(Names.toAlphanumericAndUnderscore(stream.getName()))
.cursorField(stream.getDefaultCursorField())
.destinationSyncMode(io.airbyte.api.client.model.generated.DestinationSyncMode.APPEND)
.primaryKey(stream.getSourceDefinedPrimaryKey())
.selected(true);
if (stream.getSupportedSyncModes().size() > 0) {
result.setSyncMode(Enums.convertTo(stream.getSupportedSyncModes().get(0),
io.airbyte.api.client.model.generated.SyncMode.class));
} else {
result.setSyncMode(io.airbyte.api.client.model.generated.SyncMode.INCREMENTAL);
}
return result;
}

private static io.airbyte.api.client.model.generated.AirbyteStream toAirbyteStreamClientApi(
final AirbyteStream stream) {
return new io.airbyte.api.client.model.generated.AirbyteStream()
.name(stream.getName())
.jsonSchema(stream.getJsonSchema())
.supportedSyncModes(Enums.convertListTo(stream.getSupportedSyncModes(),
io.airbyte.api.client.model.generated.SyncMode.class))
.sourceDefinedCursor(stream.getSourceDefinedCursor())
.defaultCursorField(stream.getDefaultCursorField())
.sourceDefinedPrimaryKey(stream.getSourceDefinedPrimaryKey())
.namespace(stream.getNamespace());
}

}
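The shared-module idea floated in the review thread above could be sketched as a Gradle change. This is purely hypothetical: the `airbyte-commons-catalog` module name was only proposed in the discussion and does not exist in this PR.

```groovy
// settings.gradle (hypothetical): register a shared module that both the
// server and the worker could depend on, breaking the dependency cycle
// described in the thread above.
include ':airbyte-commons-catalog'

// airbyte-commons-server/build.gradle and airbyte-commons-worker/build.gradle
// would then both declare:
dependencies {
    implementation project(':airbyte-commons-catalog')
}
```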
@@ -0,0 +1,68 @@
/*
* Copyright (c) 2022 Airbyte, Inc., all rights reserved.
*/

package io.airbyte.workers.helper;

import static org.junit.jupiter.api.Assertions.assertEquals;

import com.google.common.collect.Lists;
import io.airbyte.commons.text.Names;
import io.airbyte.protocol.models.AirbyteCatalog;
import io.airbyte.protocol.models.AirbyteStream;
import io.airbyte.protocol.models.CatalogHelpers;
import io.airbyte.protocol.models.Field;
import io.airbyte.protocol.models.JsonSchemaType;
import io.airbyte.protocol.models.SyncMode;
import java.util.Collections;
import java.util.List;
import org.junit.jupiter.api.Test;

class CatalogClientConvertersTest {

public static final String ID_FIELD_NAME = "id";
private static final String STREAM_NAME = "users-data";
private static final AirbyteStream STREAM = new AirbyteStream()
.withName(STREAM_NAME)
.withJsonSchema(
CatalogHelpers.fieldsToJsonSchema(Field.of(ID_FIELD_NAME, JsonSchemaType.STRING)))
.withDefaultCursorField(Lists.newArrayList(ID_FIELD_NAME))
.withSourceDefinedCursor(false)
.withSourceDefinedPrimaryKey(Collections.emptyList())
.withSupportedSyncModes(List.of(SyncMode.FULL_REFRESH, SyncMode.INCREMENTAL));

private static final io.airbyte.api.client.model.generated.AirbyteStream CLIENT_STREAM =
new io.airbyte.api.client.model.generated.AirbyteStream()
.name(STREAM_NAME)
.jsonSchema(CatalogHelpers.fieldsToJsonSchema(Field.of(ID_FIELD_NAME, JsonSchemaType.STRING)))
.defaultCursorField(Lists.newArrayList(ID_FIELD_NAME))
.sourceDefinedCursor(false)
.sourceDefinedPrimaryKey(Collections.emptyList())
.supportedSyncModes(List.of(io.airbyte.api.client.model.generated.SyncMode.FULL_REFRESH,
io.airbyte.api.client.model.generated.SyncMode.INCREMENTAL));
private static final io.airbyte.api.client.model.generated.AirbyteStreamConfiguration CLIENT_DEFAULT_STREAM_CONFIGURATION =
new io.airbyte.api.client.model.generated.AirbyteStreamConfiguration()
.syncMode(io.airbyte.api.client.model.generated.SyncMode.FULL_REFRESH)
.cursorField(Lists.newArrayList(ID_FIELD_NAME))
.destinationSyncMode(io.airbyte.api.client.model.generated.DestinationSyncMode.APPEND)
.primaryKey(Collections.emptyList())
.aliasName(Names.toAlphanumericAndUnderscore(STREAM_NAME))
.selected(true);

private static final AirbyteCatalog BASIC_MODEL_CATALOG = new AirbyteCatalog().withStreams(
Lists.newArrayList(STREAM));

private static final io.airbyte.api.client.model.generated.AirbyteCatalog EXPECTED_CLIENT_CATALOG =
new io.airbyte.api.client.model.generated.AirbyteCatalog()
.streams(Lists.newArrayList(
new io.airbyte.api.client.model.generated.AirbyteStreamAndConfiguration()
.stream(CLIENT_STREAM)
.config(CLIENT_DEFAULT_STREAM_CONFIGURATION)));

@Test
void testConvertToClientAPI() {
assertEquals(EXPECTED_CLIENT_CATALOG,
CatalogClientConverters.toAirbyteCatalogClientApi(BASIC_MODEL_CATALOG));
}

}
@@ -13,6 +13,7 @@ import org.jsoup.Jsoup;

dependencies {
implementation project(':airbyte-db:db-lib')
implementation project(':airbyte-api')
Contributor:

Could this be a testImplementation rather than implementation? Is it only a dependency because it is used in the test?

Contributor Author:

Yes exactly... removed
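The scoping change agreed in this thread, moving a test-only dependency off the main classpath, looks like the following generic Gradle sketch (illustrative only; the PR simply removed the line):

```groovy
dependencies {
    // Before: on the main compile/runtime classpath, wider than needed
    // implementation project(':airbyte-api')

    // After: visible only to test compilation and test runtime
    testImplementation project(':airbyte-api')
}
```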

implementation project(':airbyte-commons-worker')
implementation project(':airbyte-config:config-models')
implementation project(':airbyte-config:config-persistence')
@@ -6,11 +6,14 @@

import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import com.fasterxml.jackson.databind.JsonNode;
import io.airbyte.api.client.AirbyteApiClient;
import io.airbyte.api.client.generated.SourceApi;
import io.airbyte.api.client.model.generated.SourceDiscoverSchemaWriteRequestBody;
import io.airbyte.commons.features.EnvVariableFeatureFlags;
import io.airbyte.commons.json.Jsons;
import io.airbyte.config.EnvConfigs;
@@ -21,7 +24,6 @@
import io.airbyte.config.StandardDiscoverCatalogInput;
import io.airbyte.config.State;
import io.airbyte.config.WorkerSourceConfig;
import io.airbyte.config.persistence.ConfigRepository;
import io.airbyte.protocol.models.v0.AirbyteCatalog;
import io.airbyte.protocol.models.v0.AirbyteMessage;
import io.airbyte.protocol.models.v0.AirbyteMessage.Type;
@@ -112,7 +114,10 @@ public abstract class AbstractSourceConnectorTest {

private WorkerConfigs workerConfigs;

private ConfigRepository mConfigRepository;
private AirbyteApiClient mAirbyteApiClient;

private SourceApi mSourceApi;

private ConnectorConfigUpdater mConnectorConfigUpdater;

// This has to be using the protocol version of the platform in order to capture the arg
@@ -123,6 +128,9 @@ protected AirbyteCatalog getLastPersistedCatalog() {
return convertProtocolObject(lastPersistedCatalog.getValue(), AirbyteCatalog.class);
}

private final ArgumentCaptor<SourceDiscoverSchemaWriteRequestBody> discoverWriteRequest =
ArgumentCaptor.forClass(SourceDiscoverSchemaWriteRequestBody.class);

@BeforeEach
public void setUpInternal() throws Exception {
final Path testDir = Path.of("/tmp/airbyte_tests/");
@@ -133,7 +141,9 @@ public void setUpInternal() throws Exception {
environment = new TestDestinationEnv(localRoot);
setupEnvironment(environment);
workerConfigs = new WorkerConfigs(new EnvConfigs());
mConfigRepository = mock(ConfigRepository.class);
mAirbyteApiClient = mock(AirbyteApiClient.class);
mSourceApi = mock(SourceApi.class);
when(mAirbyteApiClient.getSourceApi()).thenReturn(mSourceApi);
mConnectorConfigUpdater = mock(ConnectorConfigUpdater.class);
processFactory = new DockerProcessFactory(
workerConfigs,
@@ -182,13 +192,13 @@ protected String runCheckAndGetStatusAsString(final JsonNode config) throws Exce

protected UUID runDiscover() throws Exception {
final UUID toReturn = new DefaultDiscoverCatalogWorker(
mConfigRepository,
mAirbyteApiClient,
new AirbyteIntegrationLauncher(JOB_ID, JOB_ATTEMPT, getImageName(), processFactory, workerConfigs.getResourceRequirements(), null, false,
new EnvVariableFeatureFlags()),
mConnectorConfigUpdater)
.run(new StandardDiscoverCatalogInput().withSourceId(SOURCE_ID.toString()).withConnectionConfiguration(getConfig()), jobRoot)
.getDiscoverCatalogId();
verify(mConfigRepository).writeActorCatalogFetchEvent(lastPersistedCatalog.capture(), any(), any(), any());
verify(mSourceApi).writeDiscoverCatalogResult(discoverWriteRequest.capture());
return toReturn;
}

1 change: 1 addition & 0 deletions airbyte-workers/build.gradle
@@ -95,6 +95,7 @@ dependencies {

testImplementation project(':airbyte-commons-docker')
testImplementation project(':airbyte-test-utils')
testImplementation project(':airbyte-api')

integrationTestJavaImplementation project(':airbyte-workers')
integrationTestJavaImplementation libs.bundles.micronaut.test
@@ -137,7 +137,7 @@ private CheckedSupplier<Worker<StandardDiscoverCatalogInput, ConnectorJobOutput>
Optional.empty());
final ConnectorConfigUpdater connectorConfigUpdater =
new ConnectorConfigUpdater(airbyteApiClient.getSourceApi(), airbyteApiClient.getDestinationApi());
return new DefaultDiscoverCatalogWorker(configRepository, integrationLauncher, connectorConfigUpdater, streamFactory);
return new DefaultDiscoverCatalogWorker(airbyteApiClient, integrationLauncher, connectorConfigUpdater, streamFactory);
};
}
