
Conversation

@wangyum wangyum commented Dec 9, 2021

Backport #34811

### What changes were proposed in this pull request?

Fix the cast from string type to decimal type when `spark.sql.legacy.allowNegativeScaleOfDecimal` is enabled. For example:

```scala
import org.apache.spark.sql.types._
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.Row

spark.conf.set("spark.sql.legacy.allowNegativeScaleOfDecimal", true)
val data = Seq(Row("7.836725755512218E38"))
val schema = StructType(Array(StructField("a", StringType, false)))
val df = spark.createDataFrame(spark.sparkContext.parallelize(data), schema)
df.select(col("a").cast(DecimalType(37, -17))).show
```

The result is `null` since [SPARK-32706](https://issues.apache.org/jira/browse/SPARK-32706).

### Why are the changes needed?

Fix a regression bug.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

Unit test.
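For background, a negative decimal scale means the value is stored as a multiple of a positive power of ten. The following is a minimal sketch, not Spark's actual cast logic, using plain `java.math.BigDecimal` (which backs Spark's `Decimal`) to show why the example value should fit `DecimalType(37, -17)` rather than cast to `null`:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class NegativeScaleSketch {
    public static void main(String[] args) {
        // The same literal the Scala example casts from a string column.
        BigDecimal parsed = new BigDecimal("7.836725755512218E38");

        // Rescale to scale -17, i.e. round to a multiple of 10^17.
        BigDecimal rescaled = parsed.setScale(-17, RoundingMode.HALF_UP);

        System.out.println(rescaled.precision()); // 22 significant digits
        System.out.println(rescaled.scale());     // -17
        System.out.println(rescaled.toPlainString());
    }
}
```

With 22 significant digits at scale -17, the value is well within precision 37, so the cast can succeed once negative scales are handled.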

…legacy.allowNegativeScaleOfDecimal is enabled


Closes #34811 from wangyum/SPARK-37451.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>

(cherry picked from commit a1214a9)
@SparkQA

SparkQA commented Dec 9, 2021

Test build #146039 has finished for PR 34851 at commit d1ecdf5.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 9, 2021

Kubernetes integration test unable to build dist.

exiting with code: 1
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50514/

@SparkQA

SparkQA commented Dec 9, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50515/

@SparkQA

SparkQA commented Dec 9, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/50515/


@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM. Thank you, @wangyum .
Merged to branch-3.1.

dongjoon-hyun pushed a commit that referenced this pull request Dec 9, 2021
….sql.legacy.allowNegativeScaleOfDecimal is enabled

Backport #34811 (commit message body same as the pull request description above)

Closes #34851 from wangyum/SPARK-37451-branch-3.1.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
@SparkQA

SparkQA commented Dec 9, 2021

Test build #146040 has finished for PR 34851 at commit 894cf15.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@wangyum wangyum deleted the SPARK-37451-branch-3.1 branch December 10, 2021 00:15
fishcus pushed a commit to fishcus/spark that referenced this pull request Jan 12, 2022
….sql.legacy.allowNegativeScaleOfDecimal is enabled

Backport apache#34811 (commit message body same as the pull request description above)

Closes apache#34851 from wangyum/SPARK-37451-branch-3.1.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>