Summary
My databend workflow is very easy to use: I create a pandas DataFrame of the data, call df.to_parquet to produce an in-memory Parquet object, then use the streaming_load HTTP endpoint to upload it. This mostly just works, and with better error messages than most databases provide for native SQL inserts.
However, with a "JSON" column in my table, I can't figure out how to insert this way. If a column holds Python dictionaries, it becomes a nested 'Table' type in the Parquet file. Uploading that yields this error:
If I instead write the column as a json-encoded string, I get this error:
It would be nice if the Tuple type were automatically walked and converted into a JSON object. Or, if the value is a string containing valid JSON, just accept it as-is; that is a bit of a departure from Databend's usual type safety, but it sure would be convenient.
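The JSON-encoded-string variant mentioned above can be sketched like this (a hypothetical DataFrame; the column names are made up for illustration). Serializing each dict with json.dumps keeps the Parquet column a plain string rather than a nested struct:

```python
import json

import pandas as pd

df = pd.DataFrame({
    "id": [1, 2],
    "payload": [{"a": 1}, {"b": [2, 3]}],
})

# Writing dicts directly produces a nested (struct) Parquet column,
# which Databend rejects for a JSON-typed target column.
# Serializing to a JSON string keeps the column a plain string instead.
df["payload"] = df["payload"].map(json.dumps)
```

As the issue notes, this string form is also rejected on upload; the sketch only shows what "write the column as a json-encoded string" means concretely.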
It seems Snowflake has a similar problem, which one user solved by modifying his INSERT statement: https://stackoverflow.com/q/70984773. I am not sure how I could downcast within the INSERT statement sent to streaming_load, though I tried.