I'm uploading some dataframes from R to BigQuery. When I use the bigrquery package I have no issues, but when I upload with bigQueryR I have problems with special characters. Is there any configuration I should set?
I see in the source code that an encoding is applied at the CSV-creation step.
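A minimal workaround sketch, assuming the problem is that the CSV gets written in the session's native encoding rather than the UTF-8 that BigQuery's CSV loader expects by default. `to_utf8` is just an illustrative helper, not part of either package:

```r
# Hypothetical workaround (untested sketch): convert every character column
# to UTF-8 before upload, since BigQuery expects UTF-8 CSV by default.
to_utf8 <- function(df) {
  chr <- vapply(df, is.character, logical(1))
  df[chr] <- lapply(df[chr], enc2utf8)
  df
}

# usage: re-encode the data frame before passing it to bqr_upload_data()
# upload_data <- to_utf8(upload_data)
```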
I imagine bigrquery uses the more performant BigQuery Storage API and is in general better maintained than this library. Is there a reason you are using bigQueryR rather than bigrquery?
Hi, this is the call I'm using:
```r
bqr_upload_data(
  projectId = billing,
  datasetId = clientname,
  tableId = "Lancamentos_Financeiros",
  upload_data = lancamentosFinanceiros,
  create = "CREATE_IF_NEEDED",
  schema = NULL,
  wait = FALSE,
  autodetect = TRUE,
  nullMarker = NULL,
  maxBadRecords = NULL,
  allowJaggedRows = FALSE,
  allowQuotedNewlines = FALSE,
  fieldDelimiter = ","
)
```
Also, bqr_upload_data is significantly slower than bigrquery's bq_perform_upload. Is there something I can set to reduce upload times?
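For reference, the bigrquery load I'm comparing against looks roughly like this (an untested sketch; it reuses the project/dataset/table names from the bqr_upload_data call above, and WRITE_APPEND is an assumed write disposition):

```r
library(bigrquery)

# Rough bigrquery equivalent of the bqr_upload_data call above (sketch).
# WRITE_APPEND is an assumption; adjust the disposition as needed.
tbl <- bq_table(billing, clientname, "Lancamentos_Financeiros")
job <- bq_perform_upload(
  tbl,
  lancamentosFinanceiros,
  create_disposition = "CREATE_IF_NEEDED",
  write_disposition  = "WRITE_APPEND"
)
bq_job_wait(job)  # block until the load job completes
```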