Conversation

@ueshin ueshin commented Feb 24, 2023

What changes were proposed in this pull request?

Fixes `SparkConnectStreamHandler` to handle configs properly while planning.

The whole planning process is now wrapped in `session.withActive` so that the proper `SQLConf` is taken into account.
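The thread-local mechanism that `session.withActive` relies on can be sketched as follows. This is a minimal illustration, not Spark's actual implementation: `ConfDemo`, `FakeConf`, `FakeSession`, and `activeConf` are hypothetical stand-ins.

```scala
// Minimal sketch of the pattern behind `session.withActive`: the session
// installs its conf in a thread-local for the duration of the block, so
// code deep inside planning can read it without holding a session reference.
object ConfDemo {
  final case class FakeConf(allowNegativeScale: Boolean)

  private val default = FakeConf(allowNegativeScale = false)
  private val activeConf = new ThreadLocal[FakeConf] {
    override def initialValue(): FakeConf = default
  }

  // Analogous to SQLConf.get: returns whatever conf is active on this thread.
  def get: FakeConf = activeConf.get()

  final class FakeSession(conf: FakeConf) {
    // Analogous to session.withActive: installs this session's conf for the
    // duration of `body`, restoring the previous one afterwards.
    def withActive[T](body: => T): T = {
      val previous = activeConf.get()
      activeConf.set(conf)
      try body finally activeConf.set(previous)
    }
  }
}
```

Code that runs outside such a `withActive` block only ever sees the defaults, which is exactly the failure mode this PR fixes.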

Why are the changes needed?

Some components used during planning need to read configs from `SQLConf.get` while building the plan, but currently the session's `SQLConf` is not available there.

For example, `spark.sql.legacy.allowNegativeScaleOfDecimal` needs to be checked when constructing `DecimalType`, but it is not set while planning, so trying to cast to `DecimalType(1, -1)` fails even with the config set to `"true"`:

[INTERNAL_ERROR] Negative scale is not allowed: -1. Set the config "spark.sql.legacy.allowNegativeScaleOfDecimal" to "true" to allow it.
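The failure mode can be illustrated with a hedged sketch of a scale check against a thread-local conf. `DecimalDemo`, `MiniDecimalType`, and this `withActive` are illustrative stand-ins under assumed names, not Spark's actual classes:

```scala
// Sketch of why planning outside the session's active scope fails: a type
// constructor validates its scale against a thread-local flag that only a
// withActive-style block ever sets.
object DecimalDemo {
  private val allowNegativeScale = new ThreadLocal[Boolean] {
    override def initialValue(): Boolean = false // session conf not visible here
  }

  final case class MiniDecimalType(precision: Int, scale: Int) {
    if (scale < 0 && !allowNegativeScale.get()) {
      throw new IllegalStateException(
        s"Negative scale is not allowed: $scale. Set the config " +
          "\"spark.sql.legacy.allowNegativeScaleOfDecimal\" to \"true\" to allow it.")
    }
  }

  // Installs the flag for the duration of `body`, then restores it.
  def withActive[T](allow: Boolean)(body: => T): T = {
    val previous = allowNegativeScale.get()
    allowNegativeScale.set(allow)
    try body finally allowNegativeScale.set(previous)
  }
}
```

Constructing `MiniDecimalType(1, -1)` outside the `withActive` block throws the error above; inside a block where the flag is set, it succeeds.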

Does this PR introduce any user-facing change?

Yes. Session configs now take effect while planning.

How was this patch tested?

Enabled a related test.

@HyukjinKwon

Merged to master and branch-3.4.

HyukjinKwon pushed a commit that referenced this pull request Feb 25, 2023
…s properly while planning

### What changes were proposed in this pull request?

Fixes `SparkConnectStreamHandler` to handle configs properly while planning.

The whole process should be done in `session.withActive` to take the proper `SQLConf` into account.

### Why are the changes needed?

Some components used during planning need to read configs from `SQLConf.get` while building the plan, but currently the session's `SQLConf` is not available there.

For example, `spark.sql.legacy.allowNegativeScaleOfDecimal` needs to be checked when constructing `DecimalType`, but it is not set while planning, so trying to cast to `DecimalType(1, -1)` fails even with the config set to `"true"`:

```
[INTERNAL_ERROR] Negative scale is not allowed: -1. Set the config "spark.sql.legacy.allowNegativeScaleOfDecimal" to "true" to allow it.
```

### Does this PR introduce _any_ user-facing change?

Yes. Session configs now take effect while planning.

### How was this patch tested?

Enabled a related test.

Closes #40165 from ueshin/issues/SPARK-42568/withActive.

Authored-by: Takuya UESHIN <ueshin@databricks.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit b994353)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
zhengruifeng pushed a commit that referenced this pull request Mar 3, 2023
…e withActive

### What changes were proposed in this pull request?

Fix `SparkConnectAnalyzeHandler` to use `withActive`.

### Why are the changes needed?

Similar to #40165, `SQLConf.get` is necessary when transforming the proto to plans.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Existing tests.

Closes #40261 from ueshin/issues/SPARK-42615/withActive.

Authored-by: Takuya UESHIN <ueshin@databricks.com>
Signed-off-by: Ruifeng Zheng <ruifengz@apache.org>