Commit 134ce54

Update based on PR feedback
1 parent cfb3732 commit 134ce54

5 files changed (+22, -22 lines changed)

plugins/spark/README.md

Lines changed: 9 additions & 9 deletions
````diff
@@ -27,10 +27,10 @@ REST endpoints, and provides implementations for Apache Spark's
 
 Right now, the plugin only provides support for Spark 3.5, Scala version 2.12 and 2.13, and depends on iceberg-spark-runtime 1.9.1.
 
-The Polaris Spark client supports catalog management for both Iceberg and Delta Lake tables. It routes all Iceberg table
-requests to the Iceberg REST endpoints and routes all Delta Lake table requests to the Generic Table REST endpoints.
+The Polaris Spark client supports catalog management for both Iceberg and Delta tables. It routes all Iceberg table
+requests to the Iceberg REST endpoints and routes all Delta table requests to the Generic Table REST endpoints.
 
-The Spark Client requires at least delta 3.2.1 to work with Delta Lake tables, which requires at least Apache Spark 3.5.3.
+The Spark Client requires at least delta 3.2.1 to work with Delta tables, which requires at least Apache Spark 3.5.3.
 
 # Start Spark with local Polaris service using the Polaris Spark plugin
 The following command starts a Polaris server for local testing, it runs on localhost:8181 with default
@@ -116,16 +116,16 @@ bin/spark-shell \
   --conf spark.sql.sources.useV1SourceList=''
 ```
 
-# Limitations
+# Current Limitations
 The following describes the current limitations of the Polaris Spark client:
 
 ## General Limitations
-1. The Polaris Spark client only supports Iceberg and Delta Lake tables. It does not support other table formats like CSV, JSON, etc.
+1. The Polaris Spark client only supports Iceberg and Delta tables. It does not support other table formats like CSV, JSON, etc.
 2. Generic tables (non-Iceberg tables) do not currently support credential vending.
 
-## Delta Lake Limitations
-1. Create table as select (CTAS) is not supported for Delta Lake tables. As a result, the `saveAsTable` method of `Dataframe`
+## Delta Table Limitations
+1. Create table as select (CTAS) is not supported for Delta tables. As a result, the `saveAsTable` method of `Dataframe`
 is also not supported, since it relies on the CTAS support.
-2. Create a Delta Lake table without explicit location is not supported.
-3. Rename a Delta Lake table is not supported.
+2. Create a Delta table without explicit location is not supported.
+3. Rename a Delta table is not supported.
 4. ALTER TABLE ... SET LOCATION is not supported for DELTA table.
````
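The `bin/spark-shell` hunk above only shows the tail of the command, so here is a rough PySpark equivalent of that setup for readers following along. This is a hedged sketch, not part of the commit: the package coordinates, the catalog name `polaris`, the `org.apache.polaris.spark.SparkCatalog` class name, and the catalog option keys are assumptions drawn from the surrounding Polaris Spark client docs; only `spark.sql.sources.useV1SourceList=''` appears in the diff itself.

```python
# Hedged sketch: a PySpark session wired to a local Polaris server on
# localhost:8181, roughly mirroring the spark-shell flags referenced above.
# Placeholders (<...>) and the package versions must be filled in before running.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("polaris-spark-client-sketch")
    # Polaris Spark client plus Delta >= 3.2.1 (the README's stated minimum);
    # exact coordinates/versions are assumptions.
    .config(
        "spark.jars.packages",
        "org.apache.polaris:polaris-spark-3.5_2.12:<version>,io.delta:delta-spark_2.12:3.2.1",
    )
    .config(
        "spark.sql.extensions",
        "io.delta.sql.DeltaSparkSessionExtension,"
        "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
    )
    # A catalog backed by the Polaris Spark client: Iceberg tables go to the
    # Iceberg REST endpoints, Delta tables to the Generic Table REST endpoints.
    .config("spark.sql.catalog.polaris", "org.apache.polaris.spark.SparkCatalog")
    .config("spark.sql.catalog.polaris.uri", "http://localhost:8181/api/catalog")
    .config("spark.sql.catalog.polaris.warehouse", "<catalog-name>")
    .config("spark.sql.catalog.polaris.credential", "<client-id>:<client-secret>")
    # From the diff above: clear the V1 source fallback list.
    .config("spark.sql.sources.useV1SourceList", "")
    .getOrCreate()
)
```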

plugins/spark/v3.5/getting-started/README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -17,12 +17,12 @@
 under the License.
 -->
 
-# Getting Started with Apache Spark and Apache Polaris With Delta Lake and Iceberg
+# Getting Started with Apache Spark and Apache Polaris With Delta and Iceberg
 
 This getting started guide provides a `docker-compose` file to set up [Apache Spark](https://spark.apache.org/) with Apache Polaris using
 the new Polaris Spark Client.
 
-The Polaris Spark Client enables manage of both Delta Lake and Iceberg tables using Apache Polaris.
+The Polaris Spark Client enables manage of both Delta and Iceberg tables using Apache Polaris.
 
 A Jupyter notebook is started to run PySpark, and Polaris Python client is also installed to call Polaris APIs
 directly through Python Client.
```
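Since this guide centers on the Jupyter/PySpark notebook, here is a short sketch of what the notebook interaction described above might look like. Everything is illustrative: it assumes a SparkSession `spark` already configured with the Polaris Spark client (for instance, as in the earlier sketch) and a catalog registered as `polaris`; the namespace and table names are hypothetical.

```python
# Hypothetical notebook cells against a Polaris-backed catalog named `polaris`;
# assumes `spark` is already configured with the Polaris Spark client.
spark.sql("CREATE NAMESPACE IF NOT EXISTS polaris.quickstart")

# Iceberg table: routed to the Iceberg REST endpoints.
spark.sql("""
    CREATE TABLE IF NOT EXISTS polaris.quickstart.events_iceberg (id BIGINT, name STRING)
    USING iceberg
""")

# Delta table: routed to the Generic Table REST endpoints. The explicit LOCATION
# matters, since creating a Delta table without one is listed as unsupported.
spark.sql("""
    CREATE TABLE IF NOT EXISTS polaris.quickstart.events_delta (id BIGINT, name STRING)
    USING delta
    LOCATION 'file:///tmp/polaris/quickstart/events_delta'
""")

spark.sql("SHOW TABLES IN polaris.quickstart").show()
```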

plugins/spark/v3.5/spark/src/main/java/org/apache/polaris/spark/utils/DeltaHelper.java

Lines changed: 3 additions & 3 deletions
```diff
@@ -29,12 +29,12 @@
 import org.slf4j.LoggerFactory;
 
 /**
- * Helper class for integrating Delta Lake table functionality with Polaris Spark Catalog.
+ * Helper class for integrating Delta table functionality with Polaris Spark Catalog.
  *
  * <p>This class is responsible for dynamically loading and configuring a Delta Catalog
  * implementation to work with Polaris. It sets up the Delta Catalog as a delegating catalog
- * extension with Polaris Spark Catalog as the delegate, enabling Delta Lake table operations
- * through Polaris.
+ * extension with Polaris Spark Catalog as the delegate, enabling Delta table operations through
+ * Polaris.
  *
  * <p>The class uses reflection to configure the Delta Catalog to behave identically to Unity
  * Catalog, as the current Delta Catalog implementation is hardcoded for Unity Catalog. This is a
```
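For readers unfamiliar with the delegation arrangement this Javadoc describes, here is a conceptual toy in Python (not the Java implementation, and all names are hypothetical): a Delta-aware catalog handles Delta-specific work itself and forwards the rest to a delegate standing in for the Polaris Spark Catalog.

```python
# Toy illustration of a delegating catalog: Delta-specific behavior stays in the
# wrapper, while actual catalog calls are forwarded to the Polaris delegate.
# Purely conceptual; class and method names are hypothetical.
class PolarisSparkCatalogStub:
    def create_table(self, name, fmt, location):
        print(f"Polaris catalog: registering {fmt} table {name} at {location}")

    def load_table(self, name):
        print(f"Polaris catalog: loading table {name}")


class DelegatingDeltaCatalog:
    """Handles Delta specifics, delegates the rest to the Polaris catalog."""

    def __init__(self, delegate):
        # Configured once up front, analogous to DeltaHelper wiring the Delta
        # Catalog with the Polaris Spark Catalog as its delegate.
        self.delegate = delegate

    def create_table(self, name, fmt, location):
        if fmt == "delta":
            print(f"Delta catalog: preparing Delta metadata for {name}")
        self.delegate.create_table(name, fmt, location)

    def load_table(self, name):
        return self.delegate.load_table(name)


catalog = DelegatingDeltaCatalog(PolarisSparkCatalogStub())
catalog.create_table("quickstart.events", "delta", "file:///tmp/events")
```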

site/content/in-dev/unreleased/generic-table.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -22,7 +22,7 @@ type: docs
 weight: 435
 ---
 
-The generic tables framework provides support for non-Iceberg table formats including Delta Lake, CSV, etc. With this framework, you can:
+The generic tables are non-Iceberg tables. Table can be multiple formats including Delta, CSV, etc. With this framework, you can:
 - Create a generic table under a namespace
 - Load a generic table
 - Drop a generic table
@@ -162,6 +162,6 @@ For the complete and up-to-date API specification, see the [Catalog API Spec](ht
 
 There are some known limitations for the generic table support:
 1. Generic tables provide limited spec information. For example, there is no spec for Schema or Partition.
-2. There is no commit coordination provided by Polaris. It is the responsibility of the engine to coordinate commits.
+2. There is no commit coordination provided by Polaris. It is the responsibility of the engine to coordinate loading and committing data. The catalog is only aware of the generic table fields above.
 3. There is no update capability provided by Polaris. Any update to a generic table must be done through a drop and create.
 4. Generic tables do not support credential vending.
```
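To make limitations 2 and 3 above concrete, here is a hedged sketch of what an "update" of a generic (here Delta) table looks like through the Polaris Spark client: a drop followed by a re-create, with the engine responsible for the underlying data. The catalog, namespace, table name, and location are hypothetical, and a configured SparkSession `spark` is assumed.

```python
# Limitation 3: Polaris exposes no update call for generic tables, so changing a
# registered table's metadata means dropping the entry and re-creating it.
# Identifiers and the location are hypothetical; `spark` is a configured session.
table = "polaris.sales.orders_delta"
new_location = "s3://my-bucket/sales/orders_delta_v2"

spark.sql(f"DROP TABLE IF EXISTS {table}")
spark.sql(f"""
    CREATE TABLE {table} (id BIGINT, amount DOUBLE)
    USING delta
    LOCATION '{new_location}'
""")

# Limitation 2: Polaris does not coordinate commits to the data at the new
# location; the engine (Spark/Delta here) owns loading and committing.
```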

site/content/in-dev/unreleased/polaris-spark-client.md

Lines changed: 6 additions & 6 deletions
```diff
@@ -119,16 +119,16 @@ If you would like to use a version of the Spark client that is currently not yet
 build a Spark client jar locally from source. Please check out the Polaris repo and refer to the Spark plugin
 [README](https://github.com/apache/polaris/blob/main/plugins/spark/README.md) for detailed instructions.
 
-## Limitations
+## Known Limitations
 The following describes the current limitations of the Polaris Spark client:
 
 ### General Limitations
-1. The Polaris Spark client only supports Iceberg and Delta Lake tables. It does not support other table formats like CSV, JSON, etc.
+1. The Polaris Spark client only supports Iceberg and Delta tables. It does not support other table formats like CSV, JSON, etc.
 2. Generic tables (non-Iceberg tables) do not currently support credential vending.
 
-### Delta Lake Limitations
-1. Create table as select (CTAS) is not supported for Delta Lake tables. As a result, the `saveAsTable` method of `Dataframe`
+### Delta Table Limitations
+1. Create table as select (CTAS) is not supported for Delta tables. As a result, the `saveAsTable` method of `Dataframe`
 is also not supported, since it relies on the CTAS support.
-2. Create a Delta Lake table without explicit location is not supported.
-3. Rename a Delta Lake table is not supported.
+2. Create a Delta table without explicit location is not supported.
+3. Rename a Delta table is not supported.
 4. ALTER TABLE ... SET LOCATION is not supported for DELTA table.
```
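To round out the Delta limitations above, here is a hedged sketch of the write pattern they leave open: create the table up front with an explicit location (limitations 1 and 2), then append to it. The identifiers are hypothetical, and the INSERT-based append is itself an assumption; the list above only rules out CTAS/`saveAsTable`, missing locations, renames, and SET LOCATION.

```python
# Hedged sketch around limitation 1 (no CTAS, hence no saveAsTable): create the
# Delta table first with an explicit LOCATION, then append. All names are
# hypothetical; a configured SparkSession `spark` is assumed.
spark.sql("""
    CREATE TABLE IF NOT EXISTS polaris.analytics.clicks_delta (user_id BIGINT, url STRING)
    USING delta
    LOCATION 's3://my-bucket/analytics/clicks_delta'
""")

df = spark.createDataFrame([(1, "https://example.com/a")], ["user_id", "url"])
df.createOrReplaceTempView("clicks_staging")

# Assumption: plain INSERT INTO is available, since it is not among the listed
# unsupported operations for Delta tables.
spark.sql("INSERT INTO polaris.analytics.clicks_delta SELECT user_id, url FROM clicks_staging")
```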
