
Persistence Component - Big query support #1904

Merged
55 commits merged on Jul 4, 2023

Changes from 20 commits

Commits
110a3ee
Initial Code for big query module
prasar-ashutosh May 22, 2023
c732cc4
Added different Visitors for BigQuerySink
prasar-ashutosh May 26, 2023
a459ab5
Added BigQuery Data type mappings
shubham43 May 29, 2023
2ad0541
Add different tests for BigQuery
prasar-ashutosh May 30, 2023
dd14366
Added transactional sql executor for Big Query via Java API
shubham43 Jun 1, 2023
12b309e
Add support for ParseTime , Fix tests, format code and add e2e Tests
prasar-ashutosh Jun 5, 2023
a50db2b
Rebase with Master and fix the issues
prasar-ashutosh Jun 5, 2023
35c1955
Dedicated Visitor for Derived Dataset to staging filtering for all sc…
prasar-ashutosh Jun 7, 2023
c9ca920
Fix imports in the component module
shubham43 Jun 7, 2023
5c9979d
Fix execute statements for DDL statements
shubham43 Jun 7, 2023
18c2ef4
Moved Connection inside relational sink for executor
shubham43 Jun 7, 2023
d5ca20a
Fix Big Query Sink
prasar-ashutosh Jun 8, 2023
21f3dd8
Bump version for big-query module pom
prasar-ashutosh Jun 8, 2023
508370f
Moved Connection inside relational sinks
shubham43 Jun 8, 2023
6a23b0a
Clean up interfaces to use Wrapper of Connection
prasar-ashutosh Jun 8, 2023
6320261
Format all Sink Classes
prasar-ashutosh Jun 8, 2023
57ecb51
Initial Code for big query module
prasar-ashutosh May 22, 2023
c90acfa
Added different Visitors for BigQuerySink
prasar-ashutosh May 26, 2023
bcd4c60
Added BigQuery Data type mappings
shubham43 May 29, 2023
31e6d5b
Add different tests for BigQuery
prasar-ashutosh May 30, 2023
d2c13f1
Added transactional sql executor for Big Query via Java API
shubham43 Jun 1, 2023
28da86e
Add support for ParseTime , Fix tests, format code and add e2e Tests
prasar-ashutosh Jun 5, 2023
e68b762
Rebase with Master and fix the issues
prasar-ashutosh Jun 5, 2023
9345f66
Dedicated Visitor for Derived Dataset to staging filtering for all sc…
prasar-ashutosh Jun 7, 2023
68637bd
Fix imports in the component module
shubham43 Jun 7, 2023
ebf95c3
Fix execute statements for DDL statements
shubham43 Jun 7, 2023
1610805
Moved Connection inside relational sink for executor
shubham43 Jun 7, 2023
2782f10
Fix Big Query Sink
prasar-ashutosh Jun 8, 2023
54a2b34
Bump version for big-query module pom
prasar-ashutosh Jun 8, 2023
0ffd1a3
Moved Connection inside relational sinks
shubham43 Jun 8, 2023
5354b3b
Clean up interfaces to use Wrapper of Connection
prasar-ashutosh Jun 8, 2023
aea7b4d
Format all Sink Classes
prasar-ashutosh Jun 8, 2023
86eb3a7
Executor Tests for Big Query and bug Fix for Nontemporal Delta - upda…
prasar-ashutosh Jun 12, 2023
97d436b
Rebase with Master
prasar-ashutosh Jun 13, 2023
f3e3bec
Refactor and clean up code for Big Query Executor
prasar-ashutosh Jun 13, 2023
2486ae1
Add Alter Visitor for Big Query to provide support for Schema Evolution
prasar-ashutosh Jun 14, 2023
873b6ea
Add primary keys fetch and validation
shubham43 Jun 17, 2023
bae9c95
Add schema evolution tests and bitemporal delta end-to-end test
shubham43 Jun 28, 2023
b928328
Merge remote-tracking branch 'origin/big_query_support' into big_quer…
shubham43 Jun 28, 2023
0a2f9b0
Update alter table command after rebase
shubham43 Jun 28, 2023
a4f312c
Revert unintentional changes from rebase
shubham43 Jun 28, 2023
a3c43f3
Added Bitemporal Delta end-to-end test
shubham43 Jun 30, 2023
22aa57f
Fix insert sql test
shubham43 Jun 30, 2023
f831778
Reduced entries in implicit data type mapping, and renamed evolveFiel…
shubham43 Jun 30, 2023
462f444
Removed finished todos
shubham43 Jun 30, 2023
ce674f4
Bump version of parent in Big Query pom
prasar-ashutosh Jun 30, 2023
c7bb28e
Add documentation for getting service account credential path
shubham43 Jun 30, 2023
aa58fad
Merge remote-tracking branch 'origin/big_query_support' into big_quer…
shubham43 Jun 30, 2023
def4c96
Fixed tests post schema column rename
shubham43 Jun 30, 2023
e20c620
Merge branch 'finos:master' into big_query_support
prasar-ashutosh Jun 30, 2023
314a5af
Merge branch 'finos:master' into big_query_support
shubham43 Jun 30, 2023
4996f9d
Create logical plan to fetch primary keys in Big Query
shubham43 Jul 3, 2023
70eba5e
Merge remote-tracking branch 'origin/big_query_support' into big_quer…
shubham43 Jul 3, 2023
6372057
Upgrade pom to latest
shubham43 Jul 3, 2023
ae4b796
Handle code review comments
prasar-ashutosh Jul 4, 2023
@@ -33,7 +33,12 @@
import org.finos.legend.engine.persistence.components.ingestmode.validitymilestoning.derivation.SourceSpecifiesFromAndThruDateTimeAbstract;
import org.finos.legend.engine.persistence.components.ingestmode.validitymilestoning.derivation.SourceSpecifiesFromDateTimeAbstract;
import org.finos.legend.engine.persistence.components.ingestmode.validitymilestoning.derivation.ValidityDerivationVisitor;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.*;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.DataType;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.Dataset;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.DatasetDefinition;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.Field;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.FieldType;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.SchemaDefinition;

import java.util.ArrayList;
import java.util.List;
@@ -22,7 +22,8 @@
import java.util.List;
import java.util.stream.Collectors;

import static org.finos.legend.engine.persistence.components.util.LogicalPlanUtils.*;
import static org.finos.legend.engine.persistence.components.util.LogicalPlanUtils.SUPPORTED_DATA_TYPES_FOR_OPTIMIZATION_COLUMNS;
import static org.finos.legend.engine.persistence.components.util.LogicalPlanUtils.findCommonPrimaryFieldsBetweenMainAndStaging;

public class IngestModeOptimizationColumnHandler implements IngestModeVisitor<IngestMode>
{
@@ -92,9 +93,9 @@ private List<OptimizationFilter> deriveOptimizationFilters(UnitemporalDeltaAbstr
List<Field> primaryKeys = findCommonPrimaryFieldsBetweenMainAndStaging(datasets.mainDataset(), datasets.stagingDataset());
List<Field> comparablePrimaryKeys = primaryKeys.stream().filter(field -> SUPPORTED_DATA_TYPES_FOR_OPTIMIZATION_COLUMNS.contains(field.type().dataType())).collect(Collectors.toList());
optimizationFilters = new ArrayList<>();
for (Field field: comparablePrimaryKeys)
for (Field field : comparablePrimaryKeys)
{
OptimizationFilter filter = OptimizationFilter.of(field.name(),field.name().toUpperCase() + "_LOWER", field.name().toUpperCase() + "_UPPER");
OptimizationFilter filter = OptimizationFilter.of(field.name(), field.name().toUpperCase() + "_LOWER", field.name().toUpperCase() + "_UPPER");
optimizationFilters.add(filter);
}
}
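The hunk above derives one optimization filter per comparable primary key, naming the lower/upper bound placeholders after the upper-cased field name. A minimal self-contained sketch of that naming convention (the `Filter` record here is a stand-in for illustration, not the library's `OptimizationFilter` class):

```java
import java.util.ArrayList;
import java.util.List;

public class OptimizationFilterSketch
{
    // Stand-in for the component's OptimizationFilter: the field plus its two bound placeholders.
    record Filter(String fieldName, String lowerBoundName, String upperBoundName) {}

    // Mirrors the loop in the diff: FIELD -> FIELD_LOWER / FIELD_UPPER.
    static List<Filter> deriveFilters(List<String> comparablePrimaryKeys)
    {
        List<Filter> filters = new ArrayList<>();
        for (String field : comparablePrimaryKeys)
        {
            filters.add(new Filter(field, field.toUpperCase() + "_LOWER", field.toUpperCase() + "_UPPER"));
        }
        return filters;
    }

    public static void main(String[] args)
    {
        // For a primary key "id", the derived bound names are ID_LOWER and ID_UPPER.
        System.out.println(deriveFilters(List.of("id")).get(0).lowerBoundName());
    }
}
```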
@@ -18,39 +18,37 @@
import org.finos.legend.engine.persistence.components.logicalplan.conditions.Equals;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.Dataset;
import org.finos.legend.engine.persistence.components.logicalplan.datasets.Selection;
import org.finos.legend.engine.persistence.components.logicalplan.values.*;
import org.finos.legend.engine.persistence.components.util.LogicalPlanUtils;
import org.finos.legend.engine.persistence.components.logicalplan.values.FieldValue;
import org.finos.legend.engine.persistence.components.logicalplan.values.FunctionImpl;
import org.finos.legend.engine.persistence.components.logicalplan.values.FunctionName;
import org.finos.legend.engine.persistence.components.logicalplan.values.ObjectValue;
import org.finos.legend.engine.persistence.components.logicalplan.values.Order;
import org.finos.legend.engine.persistence.components.logicalplan.values.OrderedField;
import org.finos.legend.engine.persistence.components.logicalplan.values.Value;
import org.finos.legend.engine.persistence.components.logicalplan.values.WindowFunction;

import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;

public class DatasetFilterAndDeduplicator implements VersioningStrategyVisitor<Dataset>
public class DatasetDeduplicator implements VersioningStrategyVisitor<Dataset>
{

Dataset stagingDataset;
List<String> primaryKeys;
Optional<Condition> stagingDatasetFilter;

private static final String ROW_NUMBER = "legend_persistence_row_num";

public DatasetFilterAndDeduplicator(Dataset stagingDataset, List<String> primaryKeys)
public DatasetDeduplicator(Dataset stagingDataset, List<String> primaryKeys)
{
this.stagingDataset = stagingDataset;
this.primaryKeys = primaryKeys;
this.stagingDatasetFilter = LogicalPlanUtils.getDatasetFilterCondition(stagingDataset);
}

@Override
public Dataset visitNoVersioningStrategy(NoVersioningStrategyAbstract noVersioningStrategy)
{
Dataset enrichedStagingDataset = this.stagingDataset;
if (this.stagingDatasetFilter.isPresent())
{
enrichedStagingDataset = filterDataset();
}
return enrichedStagingDataset;
return this.stagingDataset;
}

@Override
@@ -78,7 +76,6 @@ public Dataset visitMaxVersionStrategy(MaxVersionStrategyAbstract maxVersionStra
Selection selectionWithRowNumber = Selection.builder()
.source(stagingDataset)
.addAllFields(allColumnsWithRowNumber)
.condition(stagingDatasetFilter)
.alias(stagingDataset.datasetReference().alias())
.build();

@@ -91,22 +88,6 @@ public Dataset visitMaxVersionStrategy(MaxVersionStrategyAbstract maxVersionStra
.alias(stagingDataset.datasetReference().alias())
.build();
}
else if (this.stagingDatasetFilter.isPresent())
{
enrichedStagingDataset = filterDataset();
}
return enrichedStagingDataset;
}

private Dataset filterDataset()
{
List<Value> allColumns = new ArrayList<>(stagingDataset.schemaReference().fieldValues());
Selection selection = Selection.builder()
.source(this.stagingDataset)
.addAllFields(allColumns)
.condition(this.stagingDatasetFilter.get())
.alias(stagingDataset.datasetReference().alias())
.build();
return selection;
}
}
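The renamed `DatasetDeduplicator` keeps, for each primary key, only the highest-version staging row, via a `ROW_NUMBER() OVER (PARTITION BY pk ORDER BY version DESC)` window. A plain-Java sketch of the same dedup semantics (the `Row` record is hypothetical, not the library API):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MaxVersionDedupSketch
{
    // Hypothetical staging row: primary key, version, payload.
    record Row(String pk, int version, String payload) {}

    // Same effect as filtering the windowed selection to row number 1:
    // keep only the highest-version row per primary key.
    static List<Row> deduplicate(List<Row> staging)
    {
        Map<String, Row> maxPerKey = new LinkedHashMap<>();
        for (Row row : staging)
        {
            maxPerKey.merge(row.pk(), row, (a, b) -> a.version() >= b.version() ? a : b);
        }
        return new ArrayList<>(maxPerKey.values());
    }

    public static void main(String[] args)
    {
        List<Row> deduped = deduplicate(List.of(
            new Row("k1", 1, "old"), new Row("k1", 3, "new"), new Row("k2", 1, "only")));
        System.out.println(deduped); // keeps k1 at version 3, and k2
    }
}
```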
@@ -105,17 +105,10 @@ public static LogicalPlan getLogicalPlanForNextBatchId(Datasets datasets)
public static LogicalPlan getLogicalPlanForMinAndMaxForField(Dataset dataset, String fieldName)
{
FieldValue field = FieldValue.builder().datasetRef(dataset.datasetReference()).fieldName(fieldName).build();
Selection.Builder selectionBuilder = Selection.builder()
Selection selection = Selection.builder()
.addFields(FunctionImpl.builder().functionName(FunctionName.MIN).addValue(field).alias(MIN_OF_FIELD).build())
.addFields(FunctionImpl.builder().functionName(FunctionName.MAX).addValue(field).alias(MAX_OF_FIELD).build())
.source(dataset);

Optional<Condition> filterCondition = LogicalPlanUtils.getDatasetFilterCondition(dataset);
if (filterCondition.isPresent())
{
selectionBuilder = selectionBuilder.condition(filterCondition);
}

return LogicalPlan.builder().addOps(selectionBuilder.build()).build();
.source(dataset).build();
return LogicalPlan.builder().addOps(selection).build();
}
}
@@ -56,6 +56,8 @@ public enum DataType
LONGNVARCHAR,
UNDEFINED,
INT64,
FLOAT64,
BYTES,
STRING,
BOOL,
LONGTEXT,
@@ -67,13 +69,13 @@

public static boolean isStringDatatype(DataType type)
{
List<DataType> stringDatatype = new ArrayList<DataType>(Arrays.asList(CHAR, CHARACTER, VARCHAR, LONGVARCHAR, NCHAR, NVARCHAR, LONGNVARCHAR, LONGTEXT, TEXT, JSON, STRING));
List<DataType> stringDatatype = new ArrayList<>(Arrays.asList(CHAR, CHARACTER, VARCHAR, LONGVARCHAR, NCHAR, NVARCHAR, LONGNVARCHAR, LONGTEXT, TEXT, JSON, STRING));
return stringDatatype.contains(type);
}

public static Set<DataType> getComparableDataTypes()
{
return new HashSet<>(Arrays.asList(INT, INTEGER, BIGINT, TINYINT, SMALLINT, INT64, REAL, DECIMAL, FLOAT, DOUBLE, NUMBER, NUMERIC,
return new HashSet<>(Arrays.asList(INT, INTEGER, BIGINT, TINYINT, SMALLINT, INT64, FLOAT64, REAL, DECIMAL, FLOAT, DOUBLE, NUMBER, NUMERIC,
TIME, TIMESTAMP, TIMESTAMP_NTZ, TIMESTAMP_TZ, TIMESTAMP_LTZ, DATETIME, TIMESTAMPTZ, DATE));
}
}
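The hunk above adds the BigQuery types `FLOAT64` and `BYTES` to the enum and makes `FLOAT64` comparable. The membership-check pattern can be sketched with a trimmed stand-in enum (not the component's full `DataType`); `EnumSet` is the idiomatic set type for this check:

```java
import java.util.EnumSet;
import java.util.Set;

public class DataTypeSketch
{
    // Trimmed stand-in for the component's DataType enum, including the new BigQuery types.
    enum DataType { INT64, FLOAT64, BYTES, STRING, BOOL, DATE }

    // Same set-membership pattern as getComparableDataTypes(): numeric and date/time
    // types support range comparison; BYTES and BOOL do not.
    static final Set<DataType> COMPARABLE = EnumSet.of(DataType.INT64, DataType.FLOAT64, DataType.DATE);

    static boolean isComparable(DataType type)
    {
        return COMPARABLE.contains(type);
    }

    public static void main(String[] args)
    {
        System.out.println(isComparable(DataType.FLOAT64)); // true
        System.out.println(isComparable(DataType.BYTES));   // false
    }
}
```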
@@ -0,0 +1,35 @@
// Copyright 2023 Goldman Sachs
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package org.finos.legend.engine.persistence.components.logicalplan.datasets;

import org.finos.legend.engine.persistence.components.logicalplan.LogicalPlanNode;
import org.finos.legend.engine.persistence.components.logicalplan.values.Value;
import org.immutables.value.Value.Immutable;
import org.immutables.value.Value.Parameter;
import org.immutables.value.Value.Style;

@Immutable
@Style(
typeAbstract = "*Abstract",
typeImmutable = "*",
jdkOnly = true,
optionalAcceptNullable = true,
strictBuilder = true
)
public interface PartitionKeyAbstract extends LogicalPlanNode
{
@Parameter(order = 0)
Value key();
}
@@ -36,6 +36,8 @@ public interface SchemaDefinitionAbstract extends Schema

List<ClusterKey> clusterKeys();

List<PartitionKey> partitionKeys();

Optional<ColumnStoreSpecification> columnStoreSpecification();

Optional<ShardSpecification> shardSpecification();
@@ -28,5 +28,12 @@ public enum FunctionName
UPPER,
ROW_NUMBER,
SUBSTRING,
PARSE_JSON;
PARSE_JSON,
DATE,
DATE_TRUNC,
DATETIME_TRUNC,
TIMESTAMP_TRUNC,
RANGE_BUCKET,
GENERATE_ARRAY,
PARSE_DATETIME;
}
@@ -20,7 +20,7 @@
import org.finos.legend.engine.persistence.components.common.StatisticName;
import org.finos.legend.engine.persistence.components.ingestmode.NontemporalDelta;
import org.finos.legend.engine.persistence.components.ingestmode.audit.AuditingVisitors;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.DatasetFilterAndDeduplicator;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.DatasetDeduplicator;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.VersioningConditionVisitor;
import org.finos.legend.engine.persistence.components.ingestmode.merge.MergeStrategyVisitors;
import org.finos.legend.engine.persistence.components.logicalplan.LogicalPlan;
@@ -93,7 +93,7 @@ class NontemporalDeltaPlanner extends Planner

// Perform Deduplication & Filtering of Staging Dataset
this.enrichedStagingDataset = ingestMode().versioningStrategy()
.accept(new DatasetFilterAndDeduplicator(stagingDataset(), primaryKeys));
.accept(new DatasetDeduplicator(stagingDataset(), primaryKeys));
}

@Override
@@ -194,6 +194,12 @@ private Merge getMergeOperation()
versioningCondition = this.versioningCondition;
}

if (ingestMode().auditing().accept(AUDIT_ENABLED))
{
String auditField = ingestMode().auditing().accept(AuditingVisitors.EXTRACT_AUDIT_FIELD).orElseThrow(IllegalStateException::new);
keyValuePairs.add(Pair.of(FieldValue.builder().datasetRef(mainDataset().datasetReference()).fieldName(auditField).build(), batchStartTimestamp));
}

Merge merge = Merge.builder()
.dataset(mainDataset())
.usingDataset(stagingDataset)
@@ -203,13 +209,6 @@ private Merge getMergeOperation()
.matchedCondition(versioningCondition)
.build();

if (ingestMode().auditing().accept(AUDIT_ENABLED))
{
String auditField = ingestMode().auditing().accept(AuditingVisitors.EXTRACT_AUDIT_FIELD).orElseThrow(IllegalStateException::new);
keyValuePairs.add(Pair.of(FieldValue.builder().datasetRef(mainDataset().datasetReference()).fieldName(auditField).build(), batchStartTimestamp));
merge = merge.withUnmatchedKeyValuePairs(keyValuePairs);
}

return merge;
}

@@ -210,7 +210,7 @@ protected void addPreRunStatsForRowsDeleted(Map<StatisticName, LogicalPlan> preR

protected void addPostRunStatsForIncomingRecords(Map<StatisticName, LogicalPlan> postRunStatisticsResult)
{
Optional<Condition> filterCondition = LogicalPlanUtils.getDatasetFilterCondition(stagingDataset());
Optional<Condition> filterCondition = Optional.empty();
if (dataSplitExecutionSupported())
{
Optional<Condition> dataSplitInRangeCondition = getDataSplitInRangeConditionForStatistics();
@@ -18,7 +18,7 @@
import org.finos.legend.engine.persistence.components.common.Resources;
import org.finos.legend.engine.persistence.components.common.StatisticName;
import org.finos.legend.engine.persistence.components.ingestmode.UnitemporalDelta;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.DatasetFilterAndDeduplicator;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.DatasetDeduplicator;
import org.finos.legend.engine.persistence.components.ingestmode.deduplication.VersioningConditionVisitor;
import org.finos.legend.engine.persistence.components.ingestmode.merge.MergeStrategyVisitors;
import org.finos.legend.engine.persistence.components.logicalplan.LogicalPlan;
@@ -84,7 +84,7 @@ class UnitemporalDeltaPlanner extends UnitemporalPlanner
this.dataSplitInRangeCondition = ingestMode.dataSplitField().map(field -> LogicalPlanUtils.getDataSplitInRangeCondition(stagingDataset(), field));
// Perform Deduplication & Filtering of Staging Dataset
this.enrichedStagingDataset = ingestMode().versioningStrategy()
.accept(new DatasetFilterAndDeduplicator(stagingDataset(), primaryKeys));
.accept(new DatasetDeduplicator(stagingDataset(), primaryKeys));
this.versioningCondition = ingestMode().versioningStrategy()
.accept(new VersioningConditionVisitor(mainDataset(), stagingDataset(), false, ingestMode().digestField()));
this.inverseVersioningCondition = ingestMode.versioningStrategy()
@@ -244,21 +244,15 @@ public static void replaceField(List<Value> fieldsList, String oldFieldName, Str
});
}

public static Optional<Condition> getDatasetFilterCondition(Dataset dataSet)
public static Condition getDatasetFilterCondition(DerivedDataset derivedDataset)
{
Optional<Condition> filter = Optional.empty();
if (dataSet instanceof DerivedDataset)
List<DatasetFilter> datasetFilters = derivedDataset.datasetFilters();
List<Condition> conditions = new ArrayList<>();
for (DatasetFilter datasetFilter: datasetFilters)
{
DerivedDataset derivedDataset = (DerivedDataset) dataSet;
List<DatasetFilter> datasetFilters = derivedDataset.datasetFilters();
List<Condition> conditions = new ArrayList<>();
for (DatasetFilter datasetFilter: datasetFilters)
{
conditions.add(datasetFilter.mapFilterToCondition(dataSet.datasetReference()));
}
filter = Optional.of(And.of(conditions));
conditions.add(datasetFilter.mapFilterToCondition(derivedDataset.datasetReference()));
}
return filter;
return And.of(conditions);
}
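The reworked `getDatasetFilterCondition` maps each `DatasetFilter` on a derived dataset to a condition and combines them with `And.of(...)`, dropping the old `Optional` wrapper. A self-contained sketch of that combine step, rendering conditions as SQL fragments (the `DatasetFilter` record and string rendering here are illustrative stand-ins, not the library's condition model):

```java
import java.util.List;
import java.util.stream.Collectors;

public class FilterConditionSketch
{
    // Hypothetical dataset filter: column, operator, literal value.
    record DatasetFilter(String column, String op, Object value) {}

    // Mirrors the diff's shape: map each filter to a condition, then AND them all together.
    static String toCondition(List<DatasetFilter> filters)
    {
        return filters.stream()
            .map(f -> f.column() + " " + f.op() + " " + f.value())
            .collect(Collectors.joining(") AND (", "(", ")"));
    }

    public static void main(String[] args)
    {
        System.out.println(toCondition(List.of(
            new DatasetFilter("batch_id", ">", 5),
            new DatasetFilter("source", "=", "'snapshot'"))));
        // (batch_id > 5) AND (source = 'snapshot')
    }
}
```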

public static List<DatasetFilter> getDatasetFilters(Dataset dataSet)
@@ -320,7 +314,7 @@ public static Selection getRecordCount(Dataset dataset, String alias)
public static Selection getRecordCount(Dataset dataset, String alias, Optional<Condition> condition)
{
return Selection.builder()
.source(dataset.datasetReference())
.source(dataset)
.addFields(FunctionImpl.builder().functionName(FunctionName.COUNT).alias(alias).addValue(All.INSTANCE).build())
.condition(condition)
.build();
@@ -14,6 +14,7 @@

package org.finos.legend.engine.persistence.components.relational.ansi;

import org.finos.legend.engine.persistence.components.executor.Executor;
import org.finos.legend.engine.persistence.components.logicalplan.conditions.And;
import org.finos.legend.engine.persistence.components.logicalplan.conditions.Equals;
import org.finos.legend.engine.persistence.components.logicalplan.conditions.Exists;
@@ -77,6 +78,7 @@
import org.finos.legend.engine.persistence.components.optimizer.Optimizer;
import org.finos.legend.engine.persistence.components.relational.CaseConversion;
import org.finos.legend.engine.persistence.components.relational.RelationalSink;
import org.finos.legend.engine.persistence.components.relational.SqlPlan;
import org.finos.legend.engine.persistence.components.relational.ansi.optimizer.LowerCaseOptimizer;
import org.finos.legend.engine.persistence.components.relational.ansi.optimizer.UpperCaseOptimizer;
import org.finos.legend.engine.persistence.components.relational.ansi.sql.visitors.AllQuantifierVisitor;
@@ -135,6 +137,9 @@
import org.finos.legend.engine.persistence.components.relational.ansi.sql.visitors.TableConstraintVisitor;
import org.finos.legend.engine.persistence.components.relational.ansi.sql.visitors.WindowFunctionVisitor;
import org.finos.legend.engine.persistence.components.relational.ansi.sql.visitors.ParseJsonFunctionVisitor;
import org.finos.legend.engine.persistence.components.relational.api.RelationalConnection;
import org.finos.legend.engine.persistence.components.relational.sql.TabularData;
import org.finos.legend.engine.persistence.components.relational.sqldom.SqlGen;
import org.finos.legend.engine.persistence.components.relational.sqldom.utils.SqlGenUtils;
import org.finos.legend.engine.persistence.components.transformer.LogicalPlanVisitor;
import org.finos.legend.engine.persistence.components.util.Capability;
@@ -286,6 +291,12 @@ public Optional<Optimizer> optimizerForCaseConversion(CaseConversion caseConvers
}
}

@Override
public Executor<SqlGen, TabularData, SqlPlan> getRelationalExecutor(RelationalConnection connection)
{
throw new UnsupportedOperationException("No executor supported for AnsiSql Sink");
}

// utility methods

private static Map<Class<?>, LogicalPlanVisitor<?>> rightBiasedUnion(Map<Class<?>, LogicalPlanVisitor<?>> map1, Map<Class<?>, LogicalPlanVisitor<?>> map2)