Update version of udf-examples and pca-examples to 22.08.0-SNAPSHOT (#183)

Signed-off-by: Tim Liu <timl@nvidia.com>
NvTimLiu authored Jun 24, 2022
1 parent bd3a103 commit b44cfac
Showing 8 changed files with 10 additions and 9 deletions.
2 changes: 1 addition & 1 deletion examples/ML+DL-Examples/Spark-cuML/pca/pom.xml
@@ -21,7 +21,7 @@
<groupId>com.nvidia</groupId>
<artifactId>PCAExample</artifactId>
<packaging>jar</packaging>
-<version>22.06.0-SNAPSHOT</version>
+<version>22.08.0-SNAPSHOT</version>

<properties>
<maven.compiler.source>8</maven.compiler.source>
4 changes: 2 additions & 2 deletions examples/ML+DL-Examples/Spark-cuML/pca/spark-submit.sh
@@ -16,7 +16,7 @@
#

ML_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark-ml_2.12/22.06.0-SNAPSHOT/rapids-4-spark-ml_2.12-22.06.0-SNAPSHOT.jar
-PLUGIN_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark_2.12/22.06.0-SNAPSHOT/rapids-4-spark_2.12-22.06.0-SNAPSHOT.jar
+PLUGIN_JAR=/root/.m2/repository/com/nvidia/rapids-4-spark_2.12/22.08.0-SNAPSHOT/rapids-4-spark_2.12-22.08.0-SNAPSHOT.jar

$SPARK_HOME/bin/spark-submit \
--master spark://127.0.0.1:7077 \
@@ -38,4 +38,4 @@ $SPARK_HOME/bin/spark-submit \
--conf spark.network.timeout=1000s \
--jars $ML_JAR,$PLUGIN_JAR \
--class com.nvidia.spark.examples.pca.Main \
-/workspace/target/PCAExample-22.06.0-SNAPSHOT.jar
+/workspace/target/PCAExample-22.08.0-SNAPSHOT.jar
2 changes: 1 addition & 1 deletion examples/UDF-Examples/RAPIDS-accelerated-UDFs/pom.xml
@@ -25,7 +25,7 @@
user defined functions for use with the RAPIDS Accelerator
for Apache Spark
</description>
-<version>22.06.0-SNAPSHOT</version>
+<version>22.08.0-SNAPSHOT</version>

<properties>
<maven.compiler.source>1.8</maven.compiler.source>
1 change: 1 addition & 0 deletions examples/UDF-Examples/Spark-cuSpatial/Dockerfile
@@ -18,6 +18,7 @@
ARG CUDA_VER=11.2.2
FROM nvidia/cuda:${CUDA_VER}-devel-ubuntu18.04

+RUN apt-key adv --fetch-keys https://developer.download.nvidia.cn/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub
RUN apt-get update
RUN apt-get install -y wget ninja-build git

4 changes: 2 additions & 2 deletions examples/UDF-Examples/Spark-cuSpatial/README.md
@@ -68,7 +68,7 @@ or you can build it [in local](#build-in-local) machine after some prerequisites
3. Download spark-rapids jars
* [spark-rapids v22.06.0](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.06.0/rapids-4-spark_2.12-22.06.0.jar) or above
4. Prepare the dataset & jars. Copy the sample dataset from [cuspatial_data](../../../datasets/cuspatial_data.tar.gz) to "/data/cuspatial_data".
-Copy spark-rapids & spark-cuspatial-22.06.0-SNAPSHOT.jar to "/data/cuspatial_data/jars".
+Copy spark-rapids & spark-cuspatial-22.08.0-SNAPSHOT.jar to "/data/cuspatial_data/jars".
You can use your own path, but remember to update the paths in "gpu-run.sh" accordingly.
5. Run "gpu-run.sh"
```Bash
@@ -103,5 +103,5 @@ or you can build it [in local](#build-in-local) machine after some prerequisites
points
polygons
```
4. Import the Library "spark-cuspatial-22.06.0-SNAPSHOT.jar" to the Databricks, then install it to your cluster.
4. Import the Library "spark-cuspatial-22.08.0-SNAPSHOT.jar" to the Databricks, then install it to your cluster.
5. Import [cuspatial_sample.ipynb](notebooks/cuspatial_sample_db.ipynb) to your workspace in Databricks. Attach to your cluster, then run it.
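To make the README's "download jars / prepare data / run" steps concrete, the local preparation might look roughly like the sketch below. This is only an illustration: the /data/cuspatial_data layout and jar names follow the README and gpu-run.sh, but the archive layout and the assumption that the UDF jar was already built with `mvn package` in the Spark-cuSpatial directory are mine.

```bash
# Sketch of the README steps, run from examples/UDF-Examples/Spark-cuSpatial (assumption).
set -e

DATA_DIR=/data/cuspatial_data
mkdir -p $DATA_DIR/jars

# Step 3: download the spark-rapids plugin jar (v22.06.0 or above).
wget -P $DATA_DIR/jars \
  https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/22.06.0/rapids-4-spark_2.12-22.06.0.jar

# Step 4: stage the sample dataset and the locally built UDF jar.
# (Assumes the archive unpacks into a cuspatial_data/ directory.)
mkdir -p /data
tar -xzf ../../../datasets/cuspatial_data.tar.gz -C /data
cp target/spark-cuspatial-22.08.0-SNAPSHOT.jar $DATA_DIR/jars/

# Step 5: run the provided launcher script.
./gpu-run.sh
```

Note that gpu-run.sh resolves the jars from its JARS_PATH line, so the plugin jar placed under jars/ has to match the filename that script references (rapids-4-spark_2.12-22.06.0.jar in this commit).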
2 changes: 1 addition & 1 deletion examples/UDF-Examples/Spark-cuSpatial/gpu-run.sh
@@ -31,7 +31,7 @@ rm -rf $DATA_OUT_PATH
# the path to keep the jars of spark-rapids & spark-cuspatial
JARS=$ROOT_PATH/jars

-JARS_PATH=$JARS/rapids-4-spark_2.12-22.06.0.jar,$JARS/spark-cuspatial-22.06.0-SNAPSHOT.jar
+JARS_PATH=$JARS/rapids-4-spark_2.12-22.06.0.jar,$JARS/spark-cuspatial-22.08.0-SNAPSHOT.jar

$SPARK_HOME/bin/spark-submit --master spark://$HOSTNAME:7077 \
--name "Gpu Spatial Join UDF" \
2 changes: 1 addition & 1 deletion examples/UDF-Examples/Spark-cuSpatial/notebooks/cuspatial_sample_db.ipynb
@@ -9,7 +9,7 @@
"source": [
"from pyspark.sql import SparkSession\n",
"spark = SparkSession.builder \\\n",
" .config(\"spark.jars\", \"/data/cuspatial_data/jars/rapids-4-spark_2.12-22.06.0.jar,/data/cuspatial_data/jars/spark-cuspatial-22.06.0-SNAPSHOT.jar\") \\\n",
" .config(\"spark.jars\", \"/data/cuspatial_data/jars/rapids-4-spark_2.12-22.06.0.jar,/data/cuspatial_data/jars/spark-cuspatial-22.08.0-SNAPSHOT.jar\") \\\n",
" .config(\"spark.sql.adaptive.enabled\", \"false\") \\\n",
" .config(\"spark.executor.memory\", \"20GB\") \\\n",
" .config(\"spark.executor.cores\", \"6\") \\\n",
2 changes: 1 addition & 1 deletion examples/UDF-Examples/Spark-cuSpatial/pom.xml
@@ -24,7 +24,7 @@
<name>UDF of the cuSpatial case for the RAPIDS Accelerator</name>
<description>The RAPIDS accelerated user defined function of the cuSpatial case
for use with the RAPIDS Accelerator for Apache Spark</description>
-<version>22.06.0-SNAPSHOT</version>
+<version>22.08.0-SNAPSHOT</version>

<properties>
<maven.compiler.source>1.8</maven.compiler.source>
