@@ -39,14 +39,15 @@ import org.apache.spark.sql.connect.client.util.IntegrationTestUtils._
  * spark-sql
  * spark-connect-client-jvm
  * }}}
- * To build the above artifact, use e.g. `sbt package` or `mvn clean install -DskipTests`.
+ * To build the above artifact, use e.g. `build/sbt package` or
+ * `build/mvn clean install -DskipTests`.
  *
  * When debugging this test, if any changes to the client API, the client jar need to be built
  * before running the test. An example workflow with SBT for this test:
  *   1. Compatibility test has reported an unexpected client API change.
  *   1. Fix the wrong client API.
- *   1. Build the client jar: `sbt package`
- *   1. Run the test again: `sbt "testOnly
+ *   1. Build the client jar: `build/sbt package`
+ *   1. Run the test again: `build/sbt "testOnly
  *      org.apache.spark.sql.connect.client.CompatibilitySuite"`
  */
 class CompatibilitySuite extends ConnectFunSuite {
@@ -40,8 +40,9 @@ object IntegrationTestUtils {
   private[connect] def debug(error: Throwable): Unit = if (isDebug) error.printStackTrace()
 
   /**
-   * Find a jar in the Spark project artifacts. It requires a build first (e.g. sbt package, mvn
-   * clean install -DskipTests) so that this method can find the jar in the target folders.
+   * Find a jar in the Spark project artifacts. It requires a build first (e.g. build/sbt package,
+   * build/mvn clean install -DskipTests) so that this method can find the jar in the target
+   * folders.
    *
    * @return
    *   the jar
@@ -52,7 +53,7 @@ object IntegrationTestUtils {
       targetDir.exists(),
       s"Fail to locate the target folder: '${targetDir.getCanonicalPath}'. " +
         s"SPARK_HOME='${new File(sparkHome).getCanonicalPath}'. " +
-        "Make sure the spark project jars has been built (e.g. using sbt package)" +
+        "Make sure the spark project jars has been built (e.g. using build/sbt package)" +
         "and the env variable `SPARK_HOME` is set correctly.")
     val jars = recursiveListFiles(targetDir).filter { f =>
       // SBT jar
@@ -32,7 +32,8 @@ import org.apache.spark.util.Utils
 
 /**
  * An util class to start a local spark connect server in a different process for local E2E tests.
- * Pre-running the tests, the spark connect artifact needs to be built using e.g. `sbt package`.
+ * Pre-running the tests, the spark connect artifact needs to be built using e.g.
+ * `build/sbt package`.
  * It is designed to start the server once but shared by all tests. It is equivalent to use the
  * following command to start the connect server via command line:
  *
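
For orientation, the workflow these updated doc comments describe boils down to the following shell steps, run from the root of a Spark source checkout. This is a sketch assembled only from the commands quoted in the diff above; nothing beyond those commands is implied:

# Build the Spark artifacts, including the connect client jar.
# Either bundled launcher script works:
build/sbt package
# or: build/mvn clean install -DskipTests

# After fixing a reported client API change, rebuild the client jar
# and re-run the compatibility suite:
build/sbt package
build/sbt "testOnly org.apache.spark.sql.connect.client.CompatibilitySuite"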