This repository has been archived by the owner on Sep 18, 2023. It is now read-only.
Describe the bug
The issue appears to be caused by incorrect (double) escaping of special characters in partition directory names: the `#` in the partition value is escaped once to `%23` and then escaped again to `%2523` (note `p_brand=Brand%252311` in the path below), so the HDFS file cannot be opened.
java.lang.RuntimeException: Opening HDFS file '/user/sparkuser/t_tpch_parquet_100/part/p_brand=Brand%252311/part-00000-21a924cf-440b-416a-a2f6-5a7ac39d960e.c000.snappy.parquet' failed
at org.apache.arrow.dataset.jni.JniWrapper.inspectSchema(Native Method)
at org.apache.arrow.dataset.jni.NativeDatasetFactory.inspect(NativeDatasetFactory.java:63)
at com.intel.oap.spark.sql.execution.datasources.v2.arrow.ArrowUtils$.readSchema(ArrowUtils.scala:47)
at com.intel.oap.spark.sql.execution.datasources.v2.arrow.ArrowUtils$.readSchema(ArrowUtils.scala:59)
at com.intel.oap.spark.sql.execution.datasources.arrow.ArrowFileFormat.convert(ArrowFileFormat.scala:59)
at com.intel.oap.spark.sql.execution.datasources.arrow.ArrowFileFormat.inferSchema(ArrowFileFormat.scala:65)
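The `%2523` in the failing path is what you get when a percent-encoded name is percent-encoded a second time. A minimal sketch of that effect, using `java.net.URLEncoder` for illustration (Hive's own path-escaping logic is separate, but produces the same `%23` form for `#`; the class name here is hypothetical):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class DoubleEscapeDemo {
    // Percent-encode a partition value the way a URL encoder would.
    static String escape(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String value = "Brand#11";
        // First pass: '#' becomes '%23'.
        String once = escape(value);
        // Second pass: the '%' of '%23' becomes '%25', yielding '%2523',
        // which matches the broken 'p_brand=Brand%252311' directory name.
        String twice = escape(once);
        System.out.println(once);   // Brand%2311
        System.out.println(twice);  // Brand%252311
    }
}
```

Decoding the path once before handing it to the native dataset layer (or not re-encoding an already-escaped partition path) would avoid the mismatch.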
To Reproduce
Create the TPC-H partitioned tables in the Hive metastore (the `part` table is partitioned by `p_brand`, whose values contain `#`), then read them.
Expected behavior
Table creation and subsequent schema inference/reads succeed.
Additional context
N/A