I'm trying to read data into PySpark using this connector, and I get a warning/error whenever I try any sort of filtering, either in the initial SELECT statement or after the data has been loaded:
WARN DataStreamClient: Query insert into external table select <variable> from <table> where "<variable>" IS NOT NULL and "<variable>" > ? has failed.
java.sql.SQLException: line 1, Parameter markers not allowed in this statement.
I've followed the example given in the README and this throws after loading the data:
df = sqlContext.sql("select * from vector_table")
df.filter(df.total_amount > 10000).count()
Any advice on how to fix this would be greatly appreciated.
Which version of the spark-vector connector are you using, and with which version of Spark?
It looks like you are using Spark 1.x syntax in the example.
There was previously a bug affecting certain filtering queries on columns that allow nulls. Does the query work if the columns in the table do not allow nulls?