docs/sql-ref-ansi-compliance.md
The following subsections present behaviour changes in arithmetic operations and type conversion when ANSI mode is enabled.
### Arithmetic Operations
In Spark SQL, arithmetic operations performed on numeric types (with the exception of decimal) are not checked for overflows by default.
This means that if an operation overflows, the result is the same as what the same operation returns in a Java/Scala program (e.g., if the sum of two integers is higher than the maximum representable value, the result is a negative number).
On the other hand, Spark SQL returns null for decimal overflows.
When `spark.sql.ansi.enabled` is set to `true` and an overflow occurs in numeric or interval arithmetic operations, Spark throws an arithmetic exception at runtime.
{% highlight sql %}
-- `spark.sql.ansi.enabled=true`
SELECT 2147483647 + 1;
java.lang.ArithmeticException: integer overflow
{% endhighlight %}
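The Java/Scala-style wraparound described above can be sketched in plain Python; the `wrap_int32` helper below is hypothetical, written only for this illustration, and mimics two's-complement 32-bit integer arithmetic:

```python
def wrap_int32(x: int) -> int:
    # Reduce x modulo 2^32, then map values at or above 2^31 into the
    # signed 32-bit range [-2147483648, 2147483647], mimicking how a
    # Java/Scala `int` addition silently wraps on overflow.
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

# Same sum as the SQL example: INT_MAX + 1 wraps to a negative number.
print(wrap_int32(2147483647 + 1))  # -2147483648
```

This matches the non-ANSI Spark result for `SELECT 2147483647 + 1`, which returns `-2147483648` instead of failing.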
### Type Conversion
Spark SQL has three kinds of type conversions: explicit casting, type coercion, and store assignment casting.
When `spark.sql.ansi.enabled` is set to `true`, explicit casting by `CAST` syntax throws a number-format exception at runtime for illegal cast patterns defined in the standard, e.g. casts from a string to an integer.
On the other hand, `INSERT INTO` syntax throws an analysis exception when the ANSI mode is enabled via `spark.sql.storeAssignmentPolicy=ANSI`.
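For reference, both configurations discussed here can be set per session; a minimal sketch (the values shown are just the ones this section describes):

{% highlight sql %}
-- Enable ANSI mode for casts and arithmetic in this session
SET spark.sql.ansi.enabled=true;
-- Make INSERT INTO validate types via ANSI store assignment rules
SET spark.sql.storeAssignmentPolicy=ANSI;
{% endhighlight %}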
Currently, the ANSI mode affects explicit casting and assignment casting only.
{% highlight sql %}
-- `spark.sql.ansi.enabled=true`
SELECT CAST('a' AS INT);
java.lang.NumberFormatException: invalid input syntax for type numeric: a

-- `spark.sql.ansi.enabled=false` (This is the default behaviour)
SELECT CAST('a' AS INT);
{% endhighlight %}
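To make the two cast behaviours concrete, here is a toy Python model; the `ansi_cast_int` function is hypothetical (not a Spark API) and only mirrors the rule that an illegal string-to-integer cast fails under ANSI mode but returns null (here, `None`) under the default behaviour:

```python
def ansi_cast_int(s: str, ansi_enabled: bool):
    """Toy model of CAST(s AS INT): raise on an illegal cast when ANSI
    mode is on, return NULL (None) under the default behaviour."""
    try:
        return int(s)
    except ValueError:
        if ansi_enabled:
            raise  # analogous to Spark's NumberFormatException
        return None

print(ansi_cast_int("42", ansi_enabled=False))  # 42
print(ansi_cast_int("a", ansi_enabled=False))   # None
```

With `ansi_enabled=True`, the second call would raise instead of returning `None`, mirroring the SQL example above.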