
The Polaris Spark plugin provides a SparkCatalog class, which communicates with the Polaris
REST endpoints and provides implementations for Apache Spark's
- [TableCatalog](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java),
- [SupportsNamespaces](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsNamespaces.java),
- [ViewCatalog](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/ViewCatalog.java) classes.
+ [TableCatalog](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java),
+ [SupportsNamespaces](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsNamespaces.java),
+ [ViewCatalog](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/ViewCatalog.java) classes.

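The three interfaces above surface as ordinary Spark SQL once a catalog is wired up. A minimal, hedged sketch, assuming a catalog named `polaris` configured as in the spark-shell commands below (the namespace, table, and view names are illustrative assumptions, and this requires a running Polaris server):

```shell
# Hedged sketch: one SQL statement per interface, run through spark-sql.
# The catalog name "polaris" matches the example configs below; the
# demo namespace/table/view names are assumptions, not from this doc.
bin/spark-sql -e "
CREATE NAMESPACE IF NOT EXISTS polaris.demo;                               -- SupportsNamespaces
CREATE TABLE IF NOT EXISTS polaris.demo.events (id BIGINT) USING iceberg;  -- TableCatalog
CREATE VIEW polaris.demo.recent_events AS SELECT id FROM polaris.demo.events;  -- ViewCatalog
"
```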
Right now, the plugin only supports Spark 3.5 with Scala versions 2.12 and 2.13,
- and depends on iceberg-spark-runtime 1.9.0.
+ and depends on iceberg-spark-runtime 1.9.1.

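The `--packages` coordinate used in the commands below is built from these supported versions; a small sketch of how the pieces compose (the version values are examples taken from this doc, not prescriptions):

```shell
# Compose the Polaris Spark plugin Maven coordinate from the supported
# versions: Spark 3.5 with Scala 2.12 or 2.13.
SPARK_VERSION=3.5
SCALA_VERSION=2.12
POLARIS_VERSION=1.1.0-incubating-SNAPSHOT  # example version from the command below
echo "org.apache.polaris:polaris-spark-${SPARK_VERSION}_${SCALA_VERSION}:${POLARIS_VERSION}"
# → org.apache.polaris:polaris-spark-3.5_2.12:1.1.0-incubating-SNAPSHOT
```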
# Start Spark with local Polaris service using the Polaris Spark plugin
The following command starts a Polaris server for local testing; it runs on localhost:8181 with default
@@ -50,7 +50,7 @@ Run the following command to build the Polaris Spark project and publish the sou

```shell
bin/spark-shell \
- --packages org.apache.polaris:polaris-spark-<spark_version>_<scala_version>:<polaris_version>,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
+ --packages org.apache.polaris:polaris-spark-<spark_version>_<scala_version>:<polaris_version>,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.<catalog-name>.warehouse=<catalog-name> \
@@ -73,7 +73,7 @@ The Spark command would look like the following:

```shell
bin/spark-shell \
- --packages org.apache.polaris:polaris-spark-3.5_2.12:1.1.0-incubating-SNAPSHOT,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
+ --packages org.apache.polaris:polaris-spark-3.5_2.12:1.1.0-incubating-SNAPSHOT,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.polaris.warehouse=polaris \
@@ -99,7 +99,7 @@ To start Spark using the bundle JAR, specify it with the `--jars` option as show
```shell
bin/spark-shell \
--jars <path-to-spark-client-jar> \
- --packages org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
+ --packages org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.<catalog-name>.warehouse=<catalog-name> \