Merge pull request #170 from osopardo1/update-version-0.3.2
Update version to 0.3.2
osopardo1 authored Mar 14, 2023
2 parents d773e1d + b0b5024 commit 9c15d17
Showing 5 changed files with 8 additions and 8 deletions.
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -125,7 +125,7 @@ For example:
sbt assembly

$SPARK_HOME/bin/spark-shell \
-  --jars ./target/scala-2.12/qbeast-spark-assembly-0.3.1.jar \
+  --jars ./target/scala-2.12/qbeast-spark-assembly-0.3.2.jar \
--conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
--packages io.delta:delta-core_2.12:1.2.0
```
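
After rebuilding with `sbt assembly`, a quick way to check that the 0.3.2 jar loads correctly is to write and read back a small Qbeast table from the shell. This is only a sketch: the `qbeast` format and `columnsToIndex` option follow the project README, while the output path and column name below are placeholders.

```scala
// Smoke test for the rebuilt assembly jar, run inside spark-shell
// (where `spark` is already defined). Path and column are illustrative.
val df = spark.range(100000).withColumnRenamed("id", "user_id")

df.write
  .format("qbeast")                    // data source registered by the jar
  .option("columnsToIndex", "user_id") // columns used to build the index
  .save("/tmp/qbeast-smoke-test")

// Reading the table back exercises both the write and the read path.
spark.read.format("qbeast").load("/tmp/qbeast-smoke-test").count()
```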
4 changes: 2 additions & 2 deletions README.md
@@ -93,7 +93,7 @@ export SPARK_HOME=$PWD/spark-3.1.1-bin-hadoop3.2
$SPARK_HOME/bin/spark-shell \
--conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=io.qbeast.spark.internal.sources.catalog.QbeastCatalog \
-  --packages io.qbeast:qbeast-spark_2.12:0.3.1,io.delta:delta-core_2.12:1.2.0
+  --packages io.qbeast:qbeast-spark_2.12:0.3.2,io.delta:delta-core_2.12:1.2.0
```

### 2. Indexing a dataset
@@ -180,7 +180,7 @@ Use [Python index visualizer](./utils/visualizer/README.md) for your indexed tables
| Version | Spark | Hadoop | Delta Lake |
|---------|:-----:|:------:|:----------:|
| 0.1.0 | 3.0.0 | 3.2.0 | 0.8.0 |
| 0.2.0 | 3.1.x | 3.2.0 | 1.0.0 |
- | 0.3.1 | 3.2.x | 3.3.x | 1.2.x |
+ | 0.3.x | 3.2.x | 3.3.x | 1.2.x |

Check [here](https://docs.delta.io/latest/releases.html) for **Delta Lake** and **Apache Spark** version compatibility.
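
For projects that depend on the library through sbt rather than `--packages`, a dependency block matching the 0.3.x row of the table above might look like the sketch below. The Spark patch version (3.2.2 here) is an assumption within the supported 3.2.x line; adjust to your cluster.

```scala
// build.sbt sketch pinning versions from the compatibility matrix above.
libraryDependencies ++= Seq(
  "io.qbeast"        %% "qbeast-spark" % "0.3.2",
  "io.delta"         %% "delta-core"   % "1.2.0",
  "org.apache.spark" %% "spark-sql"    % "3.2.2" % Provided // any 3.2.x should do
)
```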

2 changes: 1 addition & 1 deletion build.sbt
@@ -1,7 +1,7 @@
import Dependencies._
import xerial.sbt.Sonatype._

- val mainVersion = "0.3.1"
+ val mainVersion = "0.3.2"

lazy val qbeastCore = (project in file("core"))
.settings(
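
For context on why this single constant matters: in a typical sbt-assembly setup the version value flows into the artifact name, which is why the CONTRIBUTING.md snippet above now references `qbeast-spark-assembly-0.3.2.jar`. The settings below are a sketch of that pattern, not necessarily the exact wiring in this build.sbt.

```scala
// Sketch of how mainVersion can propagate to the fat-jar name via sbt-assembly;
// the real build may wire this differently.
ThisBuild / version := mainVersion
assembly / assemblyJarName := s"qbeast-spark-assembly-${version.value}.jar"
```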
6 changes: 3 additions & 3 deletions docs/CloudStorages.md
@@ -32,7 +32,7 @@ Amazon Web Services S3 does not work with Hadoop 2.7. For this provider you'll n…
$SPARK_HOME/bin/spark-shell \
--conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
--conf spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider \
-  --packages io.qbeast:qbeast-spark_2.12:0.3.1,\
+  --packages io.qbeast:qbeast-spark_2.12:0.3.2,\
io.delta:delta-core_2.12:1.2.0,\
com.amazonaws:aws-java-sdk:1.12.20,\
org.apache.hadoop:hadoop-common:3.2.0,\
@@ -46,7 +46,7 @@ $SPARK_HOME/bin/spark-shell \
--conf spark.hadoop.fs.s3a.access.key=${AWS_ACCESS_KEY_ID} \
--conf spark.hadoop.fs.s3a.secret.key=${AWS_SECRET_ACCESS_KEY} \
--conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
-  --packages io.qbeast:qbeast-spark_2.12:0.3.1,\
+  --packages io.qbeast:qbeast-spark_2.12:0.3.2,\
io.delta:delta-core_2.12:1.2.0,\
org.apache.hadoop:hadoop-common:3.2.0,\
org.apache.hadoop:hadoop-client:3.2.0,\
@@ -63,7 +63,7 @@ $SPARK_HOME/bin/spark-shell \
--conf spark.hadoop.fs.azure.account.key.blobqsql.blob.core.windows.net="${AZURE_BLOB_STORAGE_KEY}" \
--conf spark.hadoop.fs.AbstractFileSystem.wasb.impl=org.apache.hadoop.fs.azure.Wasb \
--conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
-  --packages io.qbeast:qbeast-spark_2.12:0.3.1,\
+  --packages io.qbeast:qbeast-spark_2.12:0.3.2,\
io.delta:delta-core_2.12:1.2.0,\
org.apache.hadoop:hadoop-azure:3.2.0
```
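
Once a shell is running against any of these providers, reading and writing Qbeast tables uses the same API as on local disk, just with a cloud URI. A sketch, with placeholder bucket and column names; the `qbeast` format and `columnsToIndex` option follow the project README.

```scala
// Placeholder bucket and paths; substitute your own storage location.
val events = spark.read.parquet("s3a://my-bucket/raw/events/")

events.write
  .format("qbeast")
  .option("columnsToIndex", "event_time,user_id") // columns you filter on most
  .save("s3a://my-bucket/qbeast/events/")
```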
2 changes: 1 addition & 1 deletion docs/Quickstart.md
@@ -17,7 +17,7 @@ Inside the project folder, launch a spark-shell with the required **dependencies**:
$SPARK_HOME/bin/spark-shell \
--conf spark.sql.extensions=io.qbeast.spark.internal.QbeastSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=io.qbeast.spark.internal.sources.catalog.QbeastCatalog \
-  --packages io.qbeast:qbeast-spark_2.12:0.3.1,io.delta:delta-core_2.12:1.2.0
+  --packages io.qbeast:qbeast-spark_2.12:0.3.2,io.delta:delta-core_2.12:1.2.0
```
As an **_extra configuration_**, you can also change two global parameters of the index:
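
The two parameters themselves are collapsed out of this hunk. For orientation only, a session-level sketch using the names from the Qbeast advanced-configuration docs (verify them against your version) would be:

```scala
import org.apache.spark.sql.SparkSession

// Parameter names and values assumed from the qbeast-spark docs for 0.3.x;
// confirm against docs/AdvancedConfiguration.md before relying on them.
val spark = SparkSession.builder()
  .config("spark.qbeast.index.defaultCubeSize", "100000")
  .config("spark.qbeast.index.cubeWeightsBufferCapacity", "100000")
  .getOrCreate()
```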

