Commit

Fix extrawhitespaces in README.md
LucaCanali authored and ianawilson committed Dec 28, 2021
1 parent 2f9c8b2 commit 2efa19f
Showing 1 changed file with 5 additions and 6 deletions.
11 changes: 5 additions & 6 deletions spark/README.md
@@ -18,21 +18,20 @@ limitations under the License.
 
 # Apache HBase™ Spark Connector
 
-## Spark, Scala and other configurable options
+## Spark, Scala and Configurable Options
 
-To generate an artifact for a different [Spark version](https://mvnrepository.com/artifact/org.apache.spark/spark-core) and/or [Scala version](https://www.scala-lang.org/download/all.html),
-[Hadoop version](https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core), or [HBase version](https://mvnrepository.com/artifact/org.apache.hbase/hbase) pass command-line options as follows (changing version numbers appropriately):
+To generate an artifact for a different [Spark version](https://mvnrepository.com/artifact/org.apache.spark/spark-core) and/or [Scala version](https://www.scala-lang.org/download/all.html),
+[Hadoop version](https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core), or [HBase version](https://mvnrepository.com/artifact/org.apache.hbase/hbase), pass command-line options as follows (changing version numbers appropriately):
 
 ```
 $ mvn -Dspark.version=3.1.2 -Dscala.version=2.12.10 -Dhadoop-three.version=3.2.0 -Dscala.binary.version=2.12 -Dhbase.version=2.4.8 clean install
 ```
 
 Note: to build the connector with Spark 2.x, compile it with `-Dscala.binary.version=2.11` and use the profile `-Dhadoop.profile=2.0`
 
-## Configuration and installation
+## Configuration and Installation
 **Client-side** (Spark) configuration:
-- The HBase configuration file `hbase-site.xml` should be made available to Spark, it
-can be copied to `$SPARK_CONF_DIR` (default is $SPARK_HOME/conf`)
+- The HBase configuration file `hbase-site.xml` should be made available to Spark; it can be copied to `$SPARK_CONF_DIR` (default is `$SPARK_HOME/conf`)
 
 **Server-side** (HBase region servers) configuration:
 - The following jars need to be in the CLASSPATH of the HBase region servers:
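For reference, the Spark 2.x build described in the note above might look roughly as follows. The exact Spark and Scala patch versions here are illustrative assumptions, not taken from the diff; substitute the versions you actually target:

```shell
# Sketch of a Spark 2.x build of the connector, per the note in the README:
# Scala 2.11 binary version plus the Hadoop 2 profile. Version numbers are
# assumptions for illustration only.
mvn -Dspark.version=2.4.8 \
    -Dscala.version=2.11.12 \
    -Dscala.binary.version=2.11 \
    -Dhadoop.profile=2.0 \
    clean install
```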
