diff --git a/README-CN.md b/README-CN.md
index a060429d..6d56bf5b 100644
--- a/README-CN.md
+++ b/README-CN.md
@@ -37,6 +37,7 @@ The version correspondence between Nebula Exchange and Nebula is as follows:
 | 2.1.0        | 2.0.0, 2.0.1 |
 | 2.5.0        | 2.5.0, 2.5.1 |
 | 2.5.1        | 2.5.0, 2.5.1 |
+| 2.6.0        | 2.6.0        |
 | 2.5-SNAPSHOT | nightly      |
 
 ## Usage
@@ -71,6 +72,15 @@ nebula-exchange-2.5.0.jar \
 -c application.conf
 ```
 
+Note: When using Nebula Exchange to generate SST files, Spark performs a shuffle operation. Make sure to add the `spark.sql.shuffle.partitions` configuration to the submit command:
+```
+$SPARK_HOME/bin/spark-submit --class com.vesoft.nebula.exchange.Exchange \
+--master local \
+--conf spark.sql.shuffle.partitions=200 \
+nebula-exchange-2.5.0.jar \
+-c application.conf
+```
+
 For more details about Nebula Exchange, please refer to the Exchange 2.0 [user manual](https://docs.nebula-graph.com.cn/2.0.1/nebula-exchange/about-exchange/ex-ug-what-is-exchange/).
 
 ## Contributing
diff --git a/README.md b/README.md
index 4021d25b..6a78abf5 100644
--- a/README.md
+++ b/README.md
@@ -45,6 +45,15 @@ nebula-exchange-2.5.0.jar \
 -c application.conf
 ```
 
+Note: When using Exchange to generate SST files, Spark performs a shuffle operation. Add the `spark.sql.shuffle.partitions` configuration to the submit command:
+```
+$SPARK_HOME/bin/spark-submit --class com.vesoft.nebula.exchange.Exchange \
+--master local \
+--conf spark.sql.shuffle.partitions=200 \
+nebula-exchange-2.5.0.jar \
+-c application.conf
+```
+
 For more details about Exchange, please refer to [Exchange 2.0](https://docs.nebula-graph.io/2.0.1/16.eco-tools/1.nebula-exchange/).
 
 ## Version match
@@ -58,6 +67,7 @@ The version correspondence between Nebula Exchange and Nebula is as follows:
 | 2.1.0        | 2.0.0, 2.0.1 |
 | 2.5.0        | 2.5.0, 2.5.1 |
 | 2.5.1        | 2.5.0, 2.5.1 |
+| 2.6.0        | 2.6.0        |
 | 2.5-SNAPSHOT | nightly      |
 
 ## New Features
diff --git a/nebula-exchange/src/main/scala/com/vesoft/nebula/exchange/Exchange.scala b/nebula-exchange/src/main/scala/com/vesoft/nebula/exchange/Exchange.scala
index 64946490..7c55b3c3 100644
--- a/nebula-exchange/src/main/scala/com/vesoft/nebula/exchange/Exchange.scala
+++ b/nebula-exchange/src/main/scala/com/vesoft/nebula/exchange/Exchange.scala
@@ -75,7 +75,6 @@ object Exchange {
       .builder()
       .appName(PROGRAM_NAME)
       .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
-      .config("spark.sql.shuffle.partitions", "1")
 
     for (key <- configs.sparkConfigEntry.map.keySet) {
       session.config(key, configs.sparkConfigEntry.map(key))
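The Exchange.scala hunk above removes a hardcoded `spark.sql.shuffle.partitions=1` from the SparkSession builder. Since configs set in application code take precedence over values passed via `spark-submit --conf`, that hardcoded value silently forced every shuffle into a single partition, defeating the `--conf spark.sql.shuffle.partitions=200` shown in the README. The precedence can be sketched in plain Scala (illustrative only; `ShuffleConfigSketch` and its maps are hypothetical, not part of Exchange):

```scala
// Illustrative sketch, not the actual Exchange code: models why a key
// hardcoded in the SparkSession builder defeats `spark-submit --conf`.
object ShuffleConfigSketch {

  // Resolve the effective value of a config key: in-code builder configs
  // take precedence over values supplied via `spark-submit --conf`.
  def effective(key: String,
                builderConf: Map[String, String],
                submitConf: Map[String, String]): Option[String] =
    builderConf.get(key).orElse(submitConf.get(key))

  val key        = "spark.sql.shuffle.partitions"
  val submitConf = Map(key -> "200") // the user's --conf

  // Before the fix: Exchange hardcoded the key to "1" in the builder,
  // so every shuffle collapsed to one partition.
  val beforeFix: Option[String] = effective(key, Map(key -> "1"), submitConf)

  // After the fix: the builder no longer sets the key, so --conf wins.
  val afterFix: Option[String] = effective(key, Map.empty, submitConf)

  def main(args: Array[String]): Unit = {
    println(s"before fix: $beforeFix") // Some(1)
    println(s"after fix: $afterFix")   // Some(200)
  }
}
```

With the hardcoded line gone, users control shuffle parallelism themselves, either through `--conf` at submit time or through the `spark` section of `application.conf`, which the `configs.sparkConfigEntry` loop applies to the session.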