Conversation

@LuciferYang
Contributor

What changes were proposed in this pull request?

Why are the changes needed?

Does this PR introduce any user-facing change?

How was this patch tested?

</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-connect_${scala.binary.version}</artifactId>
Contributor Author


should be <scope>test</scope>
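
Putting the suggested scope together with the snippets in this thread, the full test-scoped declaration would read roughly as follows (a sketch; the groupId, artifactId, and version properties are taken from the surrounding pom fragments, not from the PR's final diff):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-connect_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
  <scope>test</scope>
</dependency>
```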

<scope>test</scope>
<version>${project.version}</version>
</dependency>
<dependency>
Contributor Author


should be <scope>test</scope>


object SparkConnectClientTests {
lazy val settings = Seq(
excludeDependencies ++= {
Contributor Author


excludeDependencies does not work here: the sbt build can still fail to compile because spark-sql remains on the classpath, causing package conflicts like the following:

[error] /${basedir}/spark-mine/connector/connect/client/integration-tests/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:27:21: value collectResult is not a member of org.apache.spark.sql.DataFrame
[error]     val schema = df.collectResult().schema
[error]                     ^
[error] /${basedir}/spark-mine/connector/connect/client/integration-tests/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:33:21: value collectResult is not a member of org.apache.spark.sql.DataFrame
[error]     val result = df.collectResult()
[error]                     ^
[error] /${basedir}/spark-mine/connector/connect/client/integration-tests/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:43:21: value collectResult is not a member of org.apache.spark.sql.Dataset[Long]
[error]     val result = df.collectResult()
[error]                     ^
[error] /${basedir}/spark-mine/connector/connect/client/integration-tests/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:152:36: value client is not a member of org.apache.spark.sql.SparkSession.Builder
[error]     spark = SparkSession.builder().client(SparkConnectClient.builder().port(port).build()).build()
[error]                                    ^
[error] /${basedir}/spark-mine/connector/connect/client/integration-tests/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:165:12: value collectResult is not a member of org.apache.spark.sql.DataFrame
[error] possible cause: maybe a semicolon is missing before `value collectResult'?
[error]           .collectResult()
[error]            ^
[error] 5 errors found
[error] (connect-client-integration-tests / Test / compileIncremental) Compilation failed
[error] Total time: 47 s, completed 2023-1-29 12:41:23

I can't solve this for now.
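
For context, the settings being tried would look roughly like the sketch below; the module name is an assumption, not the PR's actual build definition. Note that `excludeDependencies` only prunes this project's own dependency graph, so a spark-sql already pulled in elsewhere on the classpath can still shadow the Connect client's `org.apache.spark.sql` classes, producing the errors above.

```scala
// Hypothetical sketch of the sbt settings being tried; the exact
// module name ("spark-sql_2.12") is an assumption.
object SparkConnectClientTests {
  lazy val settings = Seq(
    excludeDependencies ++= Seq(
      // Drop spark-sql so its org.apache.spark.sql classes do not
      // shadow the Spark Connect client's versions of the same packages.
      ExclusionRule("org.apache.spark", "spark-sql_2.12")
    )
  )
}
```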

@LuciferYang
Contributor Author

LuciferYang commented Jan 29, 2023

This is just an experiment; I haven't fully figured out a better approach yet.

@LuciferYang
Contributor Author

Closing this one in favor of #39807.

