What went wrong?
Currently on master, we cannot add new rows to a qbeast table through the INSERT INTO SQL command.
How to reproduce?
Code that triggered the bug, or steps to reproduce:
On a previously indexed DataFrame named df:
df.createOrReplaceTempView("t")
spark.sql("insert into table t (value) values (4)")
And the output is:
org.apache.spark.sql.AnalysisException: QbeastBaseRelation(parquet) does not allow insertion.;
'InsertIntoStatement Relation[value#1297] QbeastBaseRelation(parquet), false, false
+- LocalRelation [col1#1543]
at org.apache.spark.sql.execution.datasources.PreWriteCheck$.failAnalysis(rules.scala:512)
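Until INSERT INTO is supported, a possible workaround is to append the new rows through the DataFrame writer instead of SQL. This is only a sketch under assumptions not stated in the issue: it assumes the table was originally written with the "qbeast" format, and the output path and column name are illustrative.

```scala
// Workaround sketch (hypothetical): append rows via the DataFrameWriter
// rather than INSERT INTO, which PreWriteCheck currently rejects for
// QbeastBaseRelation. Path and schema below are illustrative only.
import spark.implicits._

// New rows matching the indexed table's schema ("value" is assumed here)
val newRows = Seq(4).toDF("value")

newRows.write
  .mode("append")          // append to the existing qbeast table
  .format("qbeast")        // assumes the table was written as qbeast
  .save("/tmp/qbeast_table")  // illustrative path of the indexed table
```

After the append, re-reading the path (and recreating the temp view if needed) should show the new row; this sidesteps the SQL insertion path that triggers the AnalysisException.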
Branch and commit id:
master ef6b9f45ac54ab156e5b3474e3014b639b2ac827
Spark version:
3.x
Hadoop version:
2.4.7
Are you running Spark inside a container? Are you launching the app on a remote K8s cluster? Or are you just running the tests in a local computer?
Local
Stack trace: