
[SPARK-43009][SQL][3.4] Parameterized sql() with Any constants #40666

Closed

MaxGekk (Member) commented Apr 4, 2023

What changes were proposed in this pull request?

In the PR, I propose to change the API of parameterized SQL, replacing the type of argument values from `string` to `Any` in Scala/Java/Python, and to `Expression.Literal` in the protobuf API. The language APIs can accept `Any` objects from which literal expressions can be constructed.

This is a backport of #40623

Scala/Java:

```scala
  def sql(sqlText: String, args: Map[String, Any]): DataFrame
```

Values of the `args` map are wrapped by the `lit()` function, which leaves a `Column` as is and creates a literal from any other Java/Scala object (for more details, see the Scala tab at https://spark.apache.org/docs/latest/sql-ref-datatypes.html).

Python:

```python
def sql(self, sqlQuery: str, args: Optional[Dict[str, Any]] = None, **kwargs: Any) -> DataFrame:
```

Similarly to the Scala/Java `sql`, Python's `sql()` accepts Python objects as values of the `args` dictionary (see more details about acceptable Python objects at https://spark.apache.org/docs/latest/sql-ref-datatypes.html). `sql()` converts the dictionary values to `Column` literal expressions via `lit()`.

Protobuf:

```proto
message SqlCommand {
  // (Required) SQL Query.
  string sql = 1;

  // (Optional) A map of parameter names to literal expressions.
  map<string, Expression.Literal> args = 2;
}
```

For example:

```scala
scala> val sqlText = """SELECT s FROM VALUES ('Jeff /*__*/ Green'), ('E\'Twaun Moore') AS t(s) WHERE s = :player_name"""
sqlText: String = SELECT s FROM VALUES ('Jeff /*__*/ Green'), ('E\'Twaun Moore') AS t(s) WHERE s = :player_name

scala> sql(sqlText, args = Map("player_name" -> lit("E'Twaun Moore"))).show(false)
+-------------+
|s            |
+-------------+
|E'Twaun Moore|
+-------------+
```

Why are the changes needed?

The current implementation of the parameterized `sql()` requires arguments as string values that are parsed into SQL literal expressions, which causes the following issues:

  1. SQL comments are stripped while parsing, so some fragments of the input can be lost. For example, in 'Europe -- Amsterdam', the -- Amsterdam part is excluded from the input.
  2. Special characters in string values must be escaped, for instance 'E\'Twaun Moore'.
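
To make these hazards concrete, here is a minimal, Spark-free Python sketch. The `naive_splice` helper is hypothetical (it is not Spark code); it only mimics the old approach of pasting a raw string value into the SQL text, in contrast to passing the value separately as a typed argument:

```python
# Sketch only: a hypothetical helper mimicking the old string-splicing
# approach, where each argument value was pasted into the SQL text.
def naive_splice(template: str, name: str, value: str) -> str:
    # The value is inserted as a quoted SQL string literal.
    return template.replace(":" + name, "'" + value + "'")

query = "SELECT s FROM t WHERE s = :player_name"

# Issue 2: an unescaped single quote corrupts the spliced literal.
broken = naive_splice(query, "player_name", "E'Twaun Moore")
print(broken)  # SELECT s FROM t WHERE s = 'E'Twaun Moore'  (malformed quoting)

# Issue 1: "--" inside the value would be read by a SQL parser as the start
# of a line comment, silently truncating everything after it.
risky = naive_splice(query, "player_name", "Europe -- Amsterdam")
print(risky)

# With the new API, the SQL text and the typed values stay separate; the
# engine binds each named parameter to a literal (via lit()), so no escaping
# of the value is ever needed:
args = {"player_name": "E'Twaun Moore"}
```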

Does this PR introduce any user-facing change?

No, since the parameterized SQL feature (#38864) hasn't been released yet.

How was this patch tested?

By running the affected tests:

```
$ build/sbt "test:testOnly *ParametersSuite"
$ python/run-tests --parallelism=1 --testnames 'pyspark.sql.tests.connect.test_connect_basic SparkConnectBasicTests.test_sql_with_args'
$ python/run-tests --parallelism=1 --testnames 'pyspark.sql.session SparkSession.sql'
```

Authored-by: Max Gekk <max.gekk@gmail.com>
(cherry picked from commit 156a12e)

Closes apache#40623 from MaxGekk/parameterized-sql-any.
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
@HyukjinKwon

Merged to branch-3.4.

HyukjinKwon pushed a commit that referenced this pull request Apr 5, 2023
Closes #40666 from MaxGekk/parameterized-sql-any-3.4-2.

Authored-by: Max Gekk <max.gekk@gmail.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
HyukjinKwon closed this Apr 5, 2023
snmvaughan pushed a commit to snmvaughan/spark that referenced this pull request Jun 20, 2023