[KYUUBI #4218] Using DB and table name when checking Delta table schema.
### _Why are the changes needed?_

To close #4218.
This change ensures BI tools can list columns on Delta Lake tables in all schemas.

<img width="312" alt="image" src="https://user-images.githubusercontent.com/89149767/215793967-722eb5f9-ffe4-4ffb-b7f9-1ded06c146d7.png">

<img width="725" alt="image" src="https://user-images.githubusercontent.com/89149767/215794036-871f005f-1494-487d-90aa-1f99891177c2.png">

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [x] Add screenshots for manual tests if appropriate

- [ ] [Run test](https://kyuubi.readthedocs.io/en/master/develop_tools/testing.html#running-tests) locally before making a pull request

Closes #4219 from nousot-cloud-guy/feature/delta-db-schema.

Closes #4218

5698432 [Alex Wiss-Wolferding] Reversing match order in getColumnsByCatalog.
a6d973a [Alex Wiss-Wolferding] Revert "[KYUUBI #1458] Delta lake table columns won't show up in DBeaver."
20337dc [Alex Wiss-Wolferding] Revert "Using DB and table name when checking Delta table schema."
f7e4675 [Alex Wiss-Wolferding] Using DB and table name when checking Delta table schema.

Authored-by: Alex Wiss-Wolferding <alex@nousot.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
nousot-cloud-guy authored and pan3793 committed Feb 6, 2023
1 parent 8a99750 commit 2b958c6
Showing 2 changed files with 9 additions and 15 deletions.
```diff
@@ -139,13 +139,7 @@ class CatalogShim_v2_4 extends SparkCatalogShim {
     databases.flatMap { db =>
       val identifiers = catalog.listTables(db, tablePattern, includeLocalTempViews = true)
       catalog.getTablesByName(identifiers).flatMap { t =>
-        val tableSchema =
-          if (t.provider.getOrElse("").equalsIgnoreCase("delta")) {
-            spark.table(t.identifier.table).schema
-          } else {
-            t.schema
-          }
-        tableSchema.zipWithIndex.filter(f => columnPattern.matcher(f._1.name).matches())
+        t.schema.zipWithIndex.filter(f => columnPattern.matcher(f._1.name).matches())
           .map { case (f, i) => toColumnResult(catalogName, t.database, t.identifier.table, f, i) }
       }
     }
```
```diff
@@ -188,14 +188,6 @@ class CatalogShim_v3_0 extends CatalogShim_v2_4 {
     val catalog = getCatalog(spark, catalogName)
 
     catalog match {
-      case builtin if builtin.name() == SESSION_CATALOG =>
-        super.getColumnsByCatalog(
-          spark,
-          SESSION_CATALOG,
-          schemaPattern,
-          tablePattern,
-          columnPattern)
-
       case tc: TableCatalog =>
         val namespaces = listNamespacesWithPattern(catalog, schemaPattern)
         val tp = tablePattern.r.pattern
@@ -210,6 +202,14 @@ class CatalogShim_v3_0 extends CatalogShim_v2_4 {
           table.schema.zipWithIndex.filter(f => columnPattern.matcher(f._1.name).matches())
             .map { case (f, i) => toColumnResult(tc.name(), namespace, tableName, f, i) }
         }
+
+      case builtin if builtin.name() == SESSION_CATALOG =>
+        super.getColumnsByCatalog(
+          spark,
+          SESSION_CATALOG,
+          schemaPattern,
+          tablePattern,
+          columnPattern)
     }
   }
 }
```
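The core of the fix is the reordering in `getColumnsByCatalog`: Scala `match` cases are tried top to bottom, so when a catalog satisfies both branches, whichever case appears first wins. A minimal standalone sketch of this behavior, using hypothetical stand-in types rather than the actual Kyuubi/Spark classes:

```scala
// Sketch (hypothetical names, not Kyuubi code): demonstrates that match-case
// order decides which branch handles a value matching both cases.
trait Catalog { def name(): String }
class TableCatalog extends Catalog { override def name(): String = "spark_catalog" }

val SESSION_CATALOG = "spark_catalog"
val catalog: Catalog = new TableCatalog

// Before the fix: the name-based guard is listed first, so a TableCatalog
// whose name equals SESSION_CATALOG never reaches the TableCatalog branch.
val oldOrder = catalog match {
  case builtin if builtin.name() == SESSION_CATALOG => "delegated to v2.4 shim"
  case _: TableCatalog => "handled via TableCatalog API"
}

// After the fix: the TableCatalog case is tried first.
val newOrder = catalog match {
  case _: TableCatalog => "handled via TableCatalog API"
  case builtin if builtin.name() == SESSION_CATALOG => "delegated to v2.4 shim"
}

println(oldOrder) // delegated to v2.4 shim
println(newOrder) // handled via TableCatalog API
```

With the `TableCatalog` case first, column lookups go through the V2 catalog API, which resolves the table (and its schema) by namespace and name instead of relying on the session-catalog metadata that the original issue reported as incomplete for Delta tables.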
