Tmp mtmv multi pct tables #2

Merged

seawinde merged 84 commits into gzmf_pr_based_master from tmp_mtmv_multi_pct_tables on Oct 24, 2025

Conversation

@seawinde
Owner

Proposed changes

Issue Number: close #xxx

Further comments

If this is a relatively large or complex change, kick off the discussion at dev@doris.apache.org by explaining why you chose the solution you did and what alternatives you considered, etc...

@github-actions

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In bin/flight_record_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In bin/profile_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In build-support/clang-format.sh line 43:
    export PATH=$(brew --prefix llvm@16)/bin:$PATH
           ^--^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    export PATH=$(brew --prefix llvm@16)/bin:${PATH}


In build.sh line 236:
            BUILD_SPARK_DPP=1
            ^-------------^ SC2034 (warning): BUILD_SPARK_DPP appears unused. Verify use (or export if used externally).
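A possible fix, assuming the variable is read by a later build step outside this script:

            export BUILD_SPARK_DPP=1

If it is intentionally unused here, a directive on the preceding line silences the warning instead:

            # shellcheck disable=SC2034
            BUILD_SPARK_DPP=1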


In build.sh line 512:
FEAT+=($([[ -n "${WITH_TDE_DIR}" ]] && echo "+TDE" || echo "-TDE"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 513:
FEAT+=($([[ "${ENABLE_HDFS_STORAGE_VAULT:-OFF}" == "ON" ]] && echo "+HDFS_STORAGE_VAULT" || echo "-HDFS_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 514:
FEAT+=($([[ ${BUILD_UI} -eq 1 ]] && echo "+UI" || echo "-UI"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 515:
FEAT+=($([[ "${BUILD_AZURE}" == "ON" ]] && echo "+AZURE_BLOB,+AZURE_STORAGE_VAULT" || echo "-AZURE_BLOB,-AZURE_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 516:
FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 517:
FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).
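Since each command substitution above prints a single token, quoting it appends exactly one array element and clears SC2207; a sketch for the first line (the others follow the same pattern):

FEAT+=("$([[ -n "${WITH_TDE_DIR}" ]] && echo "+TDE" || echo "-TDE")")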


In build.sh line 519:
export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
       ^----------------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
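Declaring and assigning before exporting avoids masking the exit status of the command substitution:

DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
export DORIS_FEATURE_LIST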


In build.sh line 671:
        -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                    ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In build.sh line 791:
    if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                            ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" = "Darwin" ]] || [[ "${TARGET_SYSTEM}" = "Linux" ]]; then


In build.sh line 926:
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'x86_64' ]]; then
                                                  ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'x86_64' ]]; then


In build.sh line 930:
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'aarch64' ]]; then
                                                    ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'aarch64' ]]; then


In cloud/script/run_all_tests.sh line 167:
exit ${ret}
     ^----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
exit "${ret}"


In cloud/script/start.sh line 59:
  source "${custom_start}" 
         ^---------------^ SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
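A source directive on the preceding line tells ShellCheck the file cannot be resolved statically; a minimal sketch:

  # shellcheck source=/dev/null
  source "${custom_start}"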


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.
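In shell the comment marker is #, so the intended line is simply:

    # clear output file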


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.
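ES_7_HOST (and ES_8_HOST below) is presumably injected by the docker-compose environment; an explicit guard makes that assumption visible and silences SC2154, e.g.:

: "${ES_7_HOST:?ES_7_HOST must be provided by the compose environment}"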


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 166:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 167:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 173:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 209:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 210:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 24:
    [ -e "$file" ] || continue
    ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    [[ -e "${file}" ]] || continue


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 25:
    tar -xzvf "$file" -C "$AUX_LIB"
               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    tar -xzvf "${file}" -C "${AUX_LIB}"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 38:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
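Dropping the $() runs nc directly instead of executing its output:

while ! nc -z localhost "${HMS_PORT:-9083}"; do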


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
if [[ ${NEED_LOAD_DATA} = "0" ]]; then
      ^---------------^ SC2154 (warning): NEED_LOAD_DATA is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 49:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 54:
    echo "Script: create_paimon_table.hql executed in $EXECUTION_TIME seconds"
                                                      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Script: create_paimon_table.hql executed in ${EXECUTION_TIME} seconds"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 64:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
                                                                      ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                                                                                       ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 119:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                                    ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
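A possible rewrite, assuming the loop body runs each .hql file (the hive -f call is illustrative, not taken from the report): find -print0 handles arbitrary file names, and passing the path as a positional parameter keeps the single-quoted body free of unexpanded variables:

find /mnt/scripts/create_preinstalled_scripts -type f -name '*.hql' -print0 |
    xargs -0 -n 1 -P "${LOAD_PARALLEL}" bash -ec 'hive -f "$1"' _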


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 22:
find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
     ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
find "${CUR_DIR}"/data -type f -name "*.tar.gz" -print0 | \


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 23:
xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
                ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                          ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 33:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 34:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tpch1.db.tar.gz
                    ^-------------^ SC2154 (warning): s3BucketName is referenced but not assigned.
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2154 (warning): s3Endpoint is referenced but not assigned.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tpch1.db.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 45:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 46:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tvf_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tvf_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 58:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 70:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 82:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 94:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 106:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_tvf/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_tvf/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 118:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 144:
cd ${CUR_DIR}/auxlib
   ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
cd "${CUR_DIR}"/auxlib


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
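To avoid parsing ls | grep entirely, a glob loop also works; note this sketch picks the first non-source match rather than the most recently changed file that ls -c | head -1 selects:

shopt -s nullglob
for jar in "${DIR}"/hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar; do
    [[ ${jar} == *source* ]] && continue
    HUDI_HIVE_UBER_JAR=${jar}
    break
done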


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
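Quoting the assignment keeps the classpath wildcards literal for the JVM to expand, which resolves SC2125:

HADOOP_HIVE_JARS="${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*"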


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 74:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 81:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 83:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 90:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 27:
echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
                                                                ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                      ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Waiting for Polaris health check at http://${HOST}:${PORT}/q/health ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 28:
for i in $(seq 1 120); do
^-^ SC2034 (warning): i appears unused. Verify use (or export if used externally).
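Since the loop counter is never read, the conventional throwaway name clears SC2034:

for _ in $(seq 1 120); do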


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 29:
  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  if curl -sSf "http://${HOST}:8182/q/health" >/dev/null; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 38:
  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/catalog/v1/oauth/tokens" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 40:
  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
                                              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "grant_type=client_credentials&client_id=${USER}&client_secret=${PASS}&scope=PRINCIPAL_ROLE:ALL")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 43:
TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
                     ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
TOKEN=$(printf "%s" "${TOKEN_JSON}" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 45:
if [ -z "$TOKEN" ]; then
         ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ -z "${TOKEN}" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 46:
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
                                                                      ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: ${TOKEN_JSON}" >&2


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 50:
echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
                                       ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                            ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Creating catalog '${CATALOG}' with base '${BASE_LOCATION}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 53:
  "name": "$CATALOG",
           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  "name": "${CATALOG}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 56:
    "default-base-location": "$BASE_LOCATION",
                              ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "default-base-location": "${BASE_LOCATION}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 66:
    "allowedLocations": ["$BASE_LOCATION"]
                          ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "allowedLocations": ["${BASE_LOCATION}"]


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 74:
  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/catalogs" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 75:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 77:
  -d "$CREATE_PAYLOAD")
      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "${CREATE_PAYLOAD}")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 79:
if [ "$HTTP_CODE" = "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" = "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 81:
elif [ "$HTTP_CODE" = "409" ]; then
        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
elif [ "${HTTP_CODE}" = "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 84:
  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Create catalog failed (HTTP ${HTTP_CODE}):"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 89:
echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
                                                         ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Setting up permissions for catalog '${CATALOG}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 94:
  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/catalogs/${CATALOG}/catalog-roles/catalog_admin/grants" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 95:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 99:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 100:
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 107:
  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 108:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 112:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ] && [ "${HTTP_CODE}" != "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 113:
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
                                                                          ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 120:
  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principal-roles/data_engineer/catalog-roles/${CATALOG}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 121:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 125:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 126:
  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
                                                              ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to connect roles (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 133:
  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principals/root/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 134:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 138:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 139:
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
                                                                                  ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 131:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 165:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 350:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
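tr -d "addr:" deletes the individual characters a, d, r, and : anywhere in the input rather than the word; a sed substitution removes only the literal prefix, e.g.:

    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | sed 's/^addr://' | head -n 1)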


In docker/thirdparties/run-thirdparties-docker.sh line 359:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 364:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 383:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 385:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 387:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
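
SC2242 comes without a suggested rewrite; exit statuses must be in the range 0-255, so the straightforward fix is to exit with a small positive code, e.g.:

        exit 1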


In docker/thirdparties/run-thirdparties-docker.sh line 398:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 400:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 410:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 411:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 413:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 424:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 426:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 493:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 494:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 495:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 504:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
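
SC2024 also has no suggested rewrite here. Since sudo does not apply to the redirect, the usual pattern is to pipe the line into sudo tee -a so the append itself runs with root privileges, roughly:

        echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null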


In docker/thirdparties/run-thirdparties-docker.sh line 506:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 508:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 512:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 527:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 528:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 529:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 540:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 541:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 546:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 551:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 557:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 595:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 597:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
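
For SC2155 the usual fix is to split the declaration from the command substitution so a failing realpath is not masked by export, e.g.:

        TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
        export TPCH_DATA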


In docker/thirdparties/run-thirdparties-docker.sh line 609:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 616:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 656:
    . "${POLARIS_DIR}/polaris_settings.env"
      ^-- SC1091 (info): Not following: ./polaris_settings.env: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/run-thirdparties-docker.sh line 705:
if [[ "$NEED_LOAD_DATA" -eq 1 ]]; then
       ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${NEED_LOAD_DATA}" -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 711:
if [[ $need_prepare_hive_data -eq 1 ]]; then
      ^---------------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ ${need_prepare_hive_data} -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 832:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 833:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 834:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 836:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true


In run-be-ut.sh line 141:
    WITH_TDE_DIR        -- ${WITH_TDE_DIR}
                           ^-------------^ SC2154 (warning): WITH_TDE_DIR is referenced but not assigned.


In run-cloud-ut.sh line 190:
    -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In thirdparty/build-thirdparty.sh line 1880:
    cp -r ${TP_SOURCE_DIR}/${JINDOFS_SOURCE}/* "${TP_INSTALL_DIR}/jindofs_libs/"
          ^--------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                           ^---------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    cp -r "${TP_SOURCE_DIR}"/"${JINDOFS_SOURCE}"/* "${TP_INSTALL_DIR}/jindofs_libs/"


In tools/lzo/build.sh line 1:
# Licensed to the Apache Software Foundation (ASF) under one
^-- SC2148 (error): Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.


In tools/lzo/build.sh line 20:
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I${DORIS_THIRDPARTY}/installed/include -L${DORIS_THIRDPARTY}/installed/lib -llzo2 -std=c++17
                                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                     ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I"${DORIS_THIRDPARTY}"/installed/include -L"${DORIS_THIRDPARTY}"/installed/lib -llzo2 -std=c++17

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file (see the sketch below).
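
As a rough sketch of options 2 and 3 (the workflow snippet assumes the sh-checker step reads SHELLCHECK_OPTS from its environment; the codes shown are just examples taken from this report):

    # option 2: silence specific codes for the next line, directly in the script
    # shellcheck disable=SC2154,SC2086,SC2250
    cd $RANGER_HOME

    # option 3: exclude codes for the whole run, in the .yml action file
    env:
      SHELLCHECK_OPTS: -e SC2250 -e SC2292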



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- build-support/clang-format.sh.orig
+++ build-support/clang-format.sh
@@ -36,7 +36,7 @@
         echo "Error: Homebrew is missing. Please install it first due to we use Homebrew to manage the tools which are needed to build the project."
         exit 1
     fi
-    if ! brew list llvm@16 > /dev/null 2>&1; then
+    if ! brew list llvm@16 >/dev/null 2>&1; then
         echo "Error: Please install llvm@16 firt due to we use it to format code."
         exit 1
     fi
--- build.sh.orig
+++ build.sh
@@ -292,7 +292,7 @@
         BUILD_META_TOOL='ON'
         BUILD_FILE_CACHE_MICROBENCH_TOOL='OFF'
         BUILD_INDEX_TOOL='ON'
-	BUILD_TASK_EXECUTOR_SIMULATOR='OFF'
+        BUILD_TASK_EXECUTOR_SIMULATOR='OFF'
         BUILD_HIVE_UDF=1
         BUILD_BE_JAVA_EXTENSIONS=1
         CLEAN=0
@@ -516,7 +516,10 @@
 FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
 FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
 
-export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
+export DORIS_FEATURE_LIST=$(
+    IFS=','
+    echo "${FEAT[*]}"
+)
 echo "Feature List: ${DORIS_FEATURE_LIST}"
 
 # Clean and build generated code
@@ -789,12 +792,12 @@
     mkdir -p "${DORIS_OUTPUT}/fe/plugins/java_extensions/"
 
     if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
-      mkdir -p "${DORIS_OUTPUT}/fe/arthas"
-      rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
-      unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
-      unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
-      rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
-      rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
+        mkdir -p "${DORIS_OUTPUT}/fe/arthas"
+        rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
+        unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
+        unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
+        rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
+        rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
     fi
 fi
 
--- cloud/script/start.sh.orig
+++ cloud/script/start.sh
@@ -54,9 +54,9 @@
 fi
 # echo "$@" "daemonized=${daemonized}"}
 
-custom_start="${DORIS_HOME}/bin/custom_start.sh" 
+custom_start="${DORIS_HOME}/bin/custom_start.sh"
 if [[ -f "${custom_start}" ]]; then
-  source "${custom_start}" 
+    source "${custom_start}"
 fi
 enable_hdfs=${enable_hdfs:-1}
 process_name="${process_name:-doris_cloud}"
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_special_orc_formats/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_special_orc_formats/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -3,11 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/regression/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/regression/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/test/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/test/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -18,7 +18,6 @@
 
 set -e -x
 
-
 AUX_LIB="/mnt/scripts/auxlib"
 for file in "${AUX_LIB}"/*.tar.gz; do
     [ -e "$file" ] || continue
@@ -33,7 +32,6 @@
 # start metastore
 nohup /opt/hive/bin/hive --service metastore &
 
-
 # wait metastore start
 while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
     sleep 5s
@@ -73,7 +71,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -86,7 +83,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/paimon1 /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 ## put tvf_data
 if [[ -z "$(ls /mnt/scripts/tvf_data)" ]]; then
     echo "tvf_data does not exist"
@@ -99,7 +95,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 wait "${hadoop_put_pids[@]}"
 if [[ -z "$(hadoop fs -ls /user/doris/paimon1)" ]]; then
--- docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh
@@ -19,8 +19,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 # Extract all tar.gz files under the repo
-find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
-xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
+find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 |
+    xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
   f="$0"
   echo "Extracting hive data $f"
   dir=$(dirname "$f")
@@ -145,4 +145,3 @@
 for jar in "${jars[@]}"; do
     curl -O "https://${s3BucketName}.${s3Endpoint}/regression/docker/hive3/${jar}"
 done
-
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -56,7 +56,6 @@
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/paimon-hive-connector-3.1-1.3-SNAPSHOT.jar
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/gcs-connector-hadoop3-2.2.24-shaded.jar
 
-
 /usr/local/hadoop-run.sh &
 
 # check healthy hear
@@ -86,7 +85,7 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 if [[ ${enablePaimonHms} == "true" ]]; then
     echo "Creating Paimon HMS catalog and table"
     hadoop fs -put /tmp/paimon_data/* /user/hive/warehouse/
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/docker-compose/polaris/init-catalog.sh.orig
+++ docker/thirdparties/docker-compose/polaris/init-catalog.sh
@@ -26,29 +26,30 @@
 
 echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
 for i in $(seq 1 120); do
-  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
-    break
-  fi
-  sleep 2
+    if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
+        break
+    fi
+    sleep 2
 done
 
 echo "[polaris-init] Fetching OAuth token via client_credentials ..."
 # Try to obtain token using correct OAuth endpoint
 TOKEN_JSON=$(curl -sS \
-  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
-  -H 'Content-Type: application/x-www-form-urlencoded' \
-  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
+    -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
+    -H 'Content-Type: application/x-www-form-urlencoded' \
+    -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
 
 # Extract access_token field
 TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
 
 if [ -z "$TOKEN" ]; then
-  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
-  exit 1
+    echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
+    exit 1
 fi
 
 echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
-CREATE_PAYLOAD=$(cat <<JSON
+CREATE_PAYLOAD=$(
+    cat <<JSON
 {
   "name": "$CATALOG",
   "type": "INTERNAL",
@@ -71,19 +72,19 @@
 
 # Try create; on 409 Conflict, treat as success
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d "$CREATE_PAYLOAD")
+    -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d "$CREATE_PAYLOAD")
 
 if [ "$HTTP_CODE" = "201" ]; then
-  echo "[polaris-init] Catalog created."
+    echo "[polaris-init] Catalog created."
 elif [ "$HTTP_CODE" = "409" ]; then
-  echo "[polaris-init] Catalog already exists. Skipping."
+    echo "[polaris-init] Catalog already exists. Skipping."
 else
-  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
-  cat /tmp/resp.json || true
-  exit 1
+    echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
+    cat /tmp/resp.json || true
+    exit 1
 fi
 
 echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
@@ -91,55 +92,54 @@
 # Create a catalog admin role grants
 echo "[polaris-init] Creating catalog admin role grants ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Create a data engineer role
 echo "[polaris-init] Creating data engineer role ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole":{"name":"data_engineer"}}')
+    -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole":{"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
-  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Connect the roles
 echo "[polaris-init] Connecting roles ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"catalogRole":{"name":"catalog_admin"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"catalogRole":{"name":"catalog_admin"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Give root the data engineer role
 echo "[polaris-init] Assigning data engineer role to root ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole": {"name":"data_engineer"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole": {"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 echo "[polaris-init] Permissions setup completed."
 echo "[polaris-init] Done."
-
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -51,7 +51,7 @@
 STOP=0
 NEED_RESERVE_PORTS=0
 export NEED_LOAD_DATA=1
-export LOAD_PARALLEL=$(( $(getconf _NPROCESSORS_ONLN) / 2 ))
+export LOAD_PARALLEL=$(($(getconf _NPROCESSORS_ONLN) / 2))
 
 if ! OPTS="$(getopt \
     -n "$0" \
@@ -205,7 +205,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -394,7 +394,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -413,14 +413,14 @@
         exit -1
     fi
     # before start it, you need to download parquet file package, see "README" in "docker-compose/hive/scripts/"
-    
+
     # generate hive-3x.yaml
     export IP_HOST=${IP_HOST}
     export CONTAINER_UID=${CONTAINER_UID}
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -446,12 +446,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data*.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip \
-            && sudo unzip iceberg_data_paimon_101.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data_paimon_101.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data*.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip &&
+                sudo unzip iceberg_data_paimon_101.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data_paimon_101.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -615,9 +615,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -683,12 +683,12 @@
     echo "RUN_ICEBERG_REST"
     # iceberg-rest with multiple cloud storage backends
     ICEBERG_REST_DIR=${ROOT}/docker-compose/iceberg-rest
-    
+
     # generate iceberg-rest.yaml
     export CONTAINER_UID=${CONTAINER_UID}
     . "${ROOT}"/docker-compose/iceberg-rest/iceberg-rest_settings.env
     envsubst <"${ICEBERG_REST_DIR}/docker-compose.yaml.tpl" >"${ICEBERG_REST_DIR}/docker-compose.yaml"
-    
+
     sudo docker compose -f "${ICEBERG_REST_DIR}/docker-compose.yaml" down
     if [[ "${STOP}" -ne 1 ]]; then
         # Start all three REST catalogs (S3, OSS, COS)
@@ -716,112 +716,112 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_iceberg.log 2>&1 &
+    start_iceberg >start_iceberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_ICEBERG_REST}" -eq 1 ]]; then
-    start_iceberg_rest > start_iceberg_rest.log 2>&1 &
+    start_iceberg_rest >start_iceberg_rest.log 2>&1 &
     pids["iceberg-rest"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_POLARIS}" -eq 1 ]]; then
-    start_polaris > start_polaris.log 2>&1 &
+    start_polaris >start_polaris.log 2>&1 &
     pids["polaris"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
 echo "waiting all dockers starting done"
--- run-regression-test.sh.orig
+++ run-regression-test.sh
@@ -220,12 +220,12 @@
         SKIP_NEXT=0
         continue
     fi
-    
+
     if [[ "${arg}" == "-f" ]] || [[ "${arg}" == "--file" ]]; then
         SKIP_NEXT=1
         continue
     fi
-    
+
     NEW_ARGS+=("${arg}")
 done
 
@@ -234,7 +234,7 @@
     # Extract directory (parent path)
     # e.g., "regression-test/suites/shape_check/tpch_sf1000/shape/q1.groovy" -> "regression-test/suites/shape_check/tpch_sf1000/shape"
     FILE_DIR=$(dirname "${FILE_PATH}")
-    
+
     # Extract suite name (filename without .groovy or .sql extension)
     # e.g., "q1.groovy" -> "q1" or "q01.sql" -> "q01"
     FILE_NAME=$(basename "${FILE_PATH}")
@@ -242,9 +242,9 @@
     SUITE_NAME="${FILE_NAME%.groovy}"
     # Remove .sql extension if exists
     SUITE_NAME="${SUITE_NAME%.sql}"
-    
+
     echo "Converted -f ${FILE_PATH} to -d ${FILE_DIR} -s ${SUITE_NAME}"
-    
+
     # Add -d and -s to arguments
     NEW_ARGS+=("-d" "${FILE_DIR}" "-s" "${SUITE_NAME}")
 fi
--- thirdparty/download-thirdparty.sh.orig
+++ thirdparty/download-thirdparty.sh
@@ -345,7 +345,7 @@
             patch -p1 <"${TP_PATCH_DIR}/rocksdb-5.14.2.patch"
             if [[ "$(uname -s)" == "Darwin" ]]; then
                 patch -p1 <"${TP_PATCH_DIR}/rocksdb-mac-compile-fix.patch"
-            fi 
+            fi
             touch "${PATCHED_MARK}"
         fi
         cd -
--- tools/coffeebench-tools/bin/run-queries.sh.orig
+++ tools/coffeebench-tools/bin/run-queries.sh
@@ -73,7 +73,6 @@
     usage
 fi
 
-
 check_prerequest() {
     local CMD=$1
     local NAME=$2
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
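For reference, one way to apply the suggested command across the flagged scripts is sketched below. This is a minimal, hedged example: it assumes `shfmt` is already installed on `PATH` and that the repository root is the current working directory, and the paths are illustrative samples taken from the report above, not an exhaustive list.

```sh
# Fix a single file, exactly as the report suggests:
shfmt -w run-regression-test.sh
shfmt -w docker/thirdparties/run-thirdparties-docker.sh

# Or sweep a whole subtree of shell scripts in place
# (same find | xargs pattern used elsewhere in this repo):
find docker/thirdparties -type f -name '*.sh' -print0 | xargs -0 shfmt -w
```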


@seawinde seawinde merged commit 5a4ada9 into gzmf_pr_based_master Oct 24, 2025
19 of 28 checks passed
seawinde added a commit that referenced this pull request Nov 10, 2025
…ntains window function (apache#55066)

### What problem does this PR solve?

Support window-function rewrite when the materialized view contains a window
function.

For example, suppose the MV definition is as follows:

        CREATE MATERIALIZED VIEW mv1
        BUILD IMMEDIATE REFRESH COMPLETE ON MANUAL
        DISTRIBUTED BY RANDOM BUCKETS 2
        PROPERTIES ('replication_num' = '1') 
        AS
        select *
        from (
        select 
        o_orderkey,
        FIRST_VALUE(o_custkey) OVER (
                PARTITION BY o_orderdate 
                ORDER BY o_totalprice NULLS LAST
                RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
            ) AS first_value,
        RANK() OVER (
                PARTITION BY o_orderdate, o_orderstatus 
                ORDER BY o_totalprice NULLS LAST
                RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
            ) AS rank_value,
        LAG(l_extendedprice, 1, 0) over (partition by o_orderdate, l_shipdate order by l_quantity) AS lag_value 
        from 
        lineitem2
        left join orders2 on l_orderkey = o_orderkey and l_shipdate = o_orderdate
        ) t
        where o_orderkey > 1;

If the query is as follows, the MV can be used to answer it:

select *
            from (
            select 
            o_orderkey,
            FIRST_VALUE(o_custkey) OVER (
                    PARTITION BY o_orderdate 
                    ORDER BY o_totalprice NULLS LAST
                    RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
                ) AS first_value,
            RANK() OVER (
                    PARTITION BY o_orderdate, o_orderstatus 
                    ORDER BY o_totalprice NULLS LAST
                    RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
                ) AS rank_value,
            LAG(l_extendedprice, 1, 0) over (partition by o_orderdate, l_shipdate order by l_quantity) AS lag_value 
            from 
            lineitem2
            left join orders2 on l_orderkey = o_orderkey and l_shipdate = o_orderdate
            ) t
            where o_orderkey > 2;

The explain result is as follows:

+----------------------------------------------------------------------------------------+
| Explain String(Nereids Planner)                                                        |
+----------------------------------------------------------------------------------------+
| PLAN FRAGMENT 0                                                                        |
|   OUTPUT EXPRS:                                                                        |
|     o_orderkey[apache#4]                                                                     |
|     first_value[apache#5]                                                                    |
|     rank_value[apache#6]                                                                     |
|     lag_value[apache#7]                                                                      |
|   PARTITION: RANDOM                                                                    |
|                                                                                        |
|   HAS_COLO_PLAN_NODE: false                                                            |
|                                                                                        |
|   VRESULT SINK                                                                         |
|      MYSQL_PROTOCAL                                                                    |
|                                                                                        |
|   0:VOlapScanNode(445)                                                                 |
|      TABLE: regression_test_nereids_rules_p0_mv_window.mv1(mv1), PREAGGREGATION: ON    |
|      PREDICATES: (o_orderkey[#0] > 2)                                                  |
|      partitions=1/1 (mv1)                                                              |
|      tablets=2/2, tabletList=1755678381149,1755678381151                               |
|      cardinality=1, avgRowSize=0.0, numNodes=1                                         |
|      pushAggOp=NONE                                                                    |
|      final projections: o_orderkey[#0], first_value[#1], rank_value[#2], lag_value[apache#3] |
|      final project output tuple id: 1                                                  |
|                                                                                        |
|                                                                                        |
| ========== MATERIALIZATIONS ==========                                                 |
|                                                                                        |
| MaterializedView                                                                       |
| MaterializedViewRewriteSuccessAndChose:                                                |
|   internal.regression_test_nereids_rules_p0_mv_window.mv1 chose                        |
|                                                                                        |
| MaterializedViewRewriteSuccessButNotChose:                                             |
|                                                                                        |
| MaterializedViewRewriteFail:                                                           |
|                                                                                        |
|                                                                                        |
| ========== STATISTICS ==========                                                       |
| planed with unknown column statistics                                                  |
+----------------------------------------------------------------------------------------+
seawinde pushed a commit that referenced this pull request Nov 17, 2025
…ich belongs to an agg materialized view (apache#58038)

### What problem does this PR solve?

Issue Number: close apache#58037

Problem Summary:

```
#0  0x00007f9aca4a3f8c in __pthread_kill_implementation () from /lib64/libc.so.6
#1  0x00007f9aca454a26 in raise () from /lib64/libc.so.6
#2  0x00007f9aca43d87c in abort () from /lib64/libc.so.6
apache#3  0x0000561dc3d1ea1d in ?? ()
apache#4  0x0000561dc3d1105a in google::LogMessage::Fail() ()
apache#5  0x0000561dc3d14146 in google::LogMessage::SendToLog() ()
apache#6  0x0000561dc3d10b90 in google::LogMessage::Flush() ()
apache#7  0x0000561dc3d14989 in google::LogMessageFatal::~LogMessageFatal() ()
apache#8  0x0000561db854c996 in assert_cast<doris::vectorized::ColumnStr<unsigned int> const&, (TypeCheckOnRelease)1, doris::vectorized::IColumn const&>(doris::vectorized::IColumn const&)::{lambda(auto:1&&)#1}::operator()<doris::vectorized::IColumn const&>(doris::vectorized::IColumn const&) const
    (this=this@entry=0x7f9658ccc1f8, from=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/common/assert_cast.h:58
apache#9  0x0000561db854c7d7 in assert_cast<doris::vectorized::ColumnStr<unsigned int> const&, (TypeCheckOnRelease)1, doris::vectorized::IColumn const&> (from=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/common/assert_cast.h:73
apache#10 0x0000561db854bb0b in doris::vectorized::ColumnStr<unsigned int>::compare_at (this=0x7f957a14e2c0, n=1159288, m=6, rhs_=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/columns/column_string.h:526
apache#11 0x0000561dbe108c6b in doris::vectorized::GenericComparisonImpl<doris::vectorized::EqualsOp<int, int> >::vector_constant (a=..., b=..., c=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:112
apache#12 doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_generic_identical_types (
    this=<optimized out>, block=..., result=result@entry=10, c0=0x7f957a14e2c0, c1=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:506
apache#13 0x0000561dbdf9e97e in doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_generic (
    this=0x7f96d6fb1b90, block=..., result=10, c0=..., c1=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:517
apache#14 doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_impl (this=0x7f96d6fb1b90, 
    context=<optimized out>, block=..., arguments=..., result=10, input_rows_count=104)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:707
apache#15 0x0000561dbdcf1b8f in doris::vectorized::DefaultExecutable::execute_impl (this=<optimized out>, context=0x6, block=..., arguments=..., 
    result=1, input_rows_count=104) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.h:472
apache#16 0x0000561dbeea76ae in doris::vectorized::PreparedFunctionImpl::_execute_skipped_constant_deal (this=this@entry=0x7f99f62a65d0, 
    context=context@entry=0x7f99f6442b00, block=..., args=..., result=result@entry=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:121
apache#17 0x0000561dbeea4ce8 in doris::vectorized::PreparedFunctionImpl::execute_without_low_cardinality_columns (this=0x7f99f62a65d0, 
    context=0x7f99f6442b00, block=..., args=..., result=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:246
apache#18 doris::vectorized::PreparedFunctionImpl::default_implementation_for_nulls (this=this@entry=0x7f99f62a65d0, 
    context=context@entry=0x7f99f6442b00, block=..., args=..., result=result@entry=10, input_rows_count=104, dry_run=<optimized out>, 
    executed=0x7f9658ccc666) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:218
apache#19 0x0000561dbeea4e9c in doris::vectorized::PreparedFunctionImpl::_execute_skipped_constant_deal (this=0x7f99f62a65d0, context=0x7f99f6442b00, 
    block=..., args=..., result=10, input_rows_count=<optimized out>, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:112
apache#20 doris::vectorized::PreparedFunctionImpl::execute_without_low_cardinality_columns (this=0x7f99f62a65d0, context=0x7f99f6442b00, block=..., 
    args=..., result=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:246
apache#21 0x0000561dbeea4f66 in doris::vectorized::PreparedFunctionImpl::execute (this=0x11b078, context=0x6, block=..., args=..., result=1, 
    input_rows_count=104, dry_run=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:252
apache#22 0x0000561dbdcf1500 in doris::vectorized::IFunctionBase::execute (this=<optimized out>, context=0x7f99f6442b00, block=..., arguments=..., 
    result=10, input_rows_count=104, dry_run=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.h:195
--Type <RET> for more, q to quit, c to continue without paging--c
apache#23 0x0000561dbdceccad in doris::vectorized::VectorizedFnCall::_do_execute (this=0x7f96f0fec510, context=0x7f957f09cdf0, block=0x7f957b02a3b0, 
    result_column_id=0x7f9658ccca14, args=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vectorized_fn_call.cpp:197
apache#24 0x0000561dbdced2c6 in doris::vectorized::VectorizedFnCall::execute (this=0x11b078, context=0x6, 
    block=0x7f9aca4a3f8c <__pthread_kill_implementation+268>, result_column_id=0x7f9658ccbe10)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vectorized_fn_call.cpp:212
apache#25 0x0000561dbdd1e51b in doris::vectorized::VExprContext::execute (this=0x7f957f09cdf0, 
    block=0x7f9aca4a3f8c <__pthread_kill_implementation+268>, block@entry=0x7f957b02a3b0, result_column_id=result_column_id@entry=0x7f9658ccca14)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:55
apache#26 0x0000561dbdd1fcb5 in doris::vectorized::VExprContext::execute_conjuncts (ctxs=..., filters=filters@entry=0x0, accept_null=false, 
    block=block@entry=0x7f957b02a3b0, result_filter=result_filter@entry=0x7f9658ccccc0, can_filter_all=0x7f9658cccbc7)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:174
apache#27 0x0000561dbdd2131f in doris::vectorized::VExprContext::execute_conjuncts_and_filter_block (ctxs=..., block=0x7f957b02a3b0, 
    columns_to_filter=..., column_to_keep=6, filter=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:354
apache#28 0x0000561db8f49450 in doris::segment_v2::SegmentIterator::_execute_common_expr (this=this@entry=0x7f954e20a000, 
    sel_rowid_idx=0x7f955c0d2000, selected_size=@0x7f9658ccce6e: 104, block=block@entry=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:2338
apache#29 0x0000561db8f482f8 in doris::segment_v2::SegmentIterator::_next_batch_internal (this=0x7f954e20a000, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:2230
apache#30 0x0000561db8f45212 in doris::segment_v2::SegmentIterator::next_batch(doris::vectorized::Block*)::$_0::operator()() const (
    this=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:1953
apache#31 doris::segment_v2::SegmentIterator::next_batch (this=0x7f954e20a000, block=0x6)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:1952
apache#32 0x0000561db8ee49bc in doris::segment_v2::LazyInitSegmentIterator::next_batch (this=0x7f953bc71f80, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/lazy_init_segment_iterator.h:44
apache#33 0x0000561db8dab844 in doris::BetaRowsetReader::next_block (this=0x7f9a4e215800, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/beta_rowset_reader.cpp:377
apache#34 0x0000561dc2c9413d in doris::vectorized::VCollectIterator::Level0Iterator::_refresh (this=0x7f953ba137a0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.h:256
apache#35 doris::vectorized::VCollectIterator::Level0Iterator::refresh_current_row (this=this@entry=0x7f953ba137a0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:509
apache#36 0x0000561dc2c93bf4 in doris::vectorized::VCollectIterator::Level0Iterator::init (this=0x7f953ba137a0, get_data_by_ref=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:461
apache#37 0x0000561dc2c91002 in doris::vectorized::VCollectIterator::build_heap (this=0x7f957a52bb30, rs_readers=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:125
apache#38 0x0000561dc2c7e1f2 in doris::vectorized::BlockReader::_init_collect_iter (this=this@entry=0x7f957a52b400, read_params=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/block_reader.cpp:153
apache#39 0x0000561dc2c7f191 in doris::vectorized::BlockReader::init (this=<optimized out>, read_params=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/block_reader.cpp:226
apache#40 0x0000561dc3937869 in doris::vectorized::NewOlapScanner::open (this=0x7f9a56270210, state=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/new_olap_scanner.cpp:252
apache#41 0x0000561dbdcc5413 in doris::vectorized::ScannerScheduler::_scanner_scan (ctx=..., scan_task=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:221
apache#42 0x0000561dbdcc62bd in doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}::operator()() const::{lambda()#1}::operator()() const (this=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:154
apache#43 doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}::operator()() const (this=0x7f954e25d3c0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:153
apache#44 std::__invoke_impl<void, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&>(std::__invoke_other, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&) (__f=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:61
apache#45 std::__invoke_r<void, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&>(doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&) (__fn=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:111
apache#46 std::_Function_handler<void (), doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}>::_M_invoke(std::_Any_data const&) (__functor=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:291
apache#47 0x0000561db962137a in doris::ThreadPool::dispatch_thread (this=0x7f9a50f9d380)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/util/threadpool.cpp:602
apache#48 0x0000561db96159a1 in std::function<void ()>::operator()() const (this=0x11a791)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:560
apache#49 doris::Thread::supervise_thread (arg=0x7f969f569ce0) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/util/thread.cpp:498
apache#50 0x00007f9aca4a2215 in start_thread () from /lib64/libc.so.6
apache#51 0x00007f9aca524bdc in clone3 () from /lib64/libc.so.6
```

Assume that columns 0,1,2,3,4 are the key columns of an AGG MV. Because PreAgg
is OFF at the scan node, the block contains all key columns so that data can be
merged in the storage layer.

If we select columns 0 and 1, with columns 3 and 4 in the where clause, the slot
ids are 0,1,3,4, and the column ids in the conjuncts are the indexes into the
slot ids (which are 2 and 3).

But the plan uses the key type of the base table, which is DUP key, treating
the AGG MV as a DUP MV. These conjuncts are therefore pushed down to the scan
node that belongs to the AGG MV, where they pick the wrong columns 2 and 3
(which should be 4 and 5) in the block during execution.

So we should use the key type of the MV rather than the key type of the base table.

### Release note

None

### Check List (For Author)

- Test <!-- At least one of them must be included. -->
    - [x] Regression test
    - [ ] Unit Test
    - [ ] Manual test (add detailed scripts or steps below)
    - [ ] No need to test or manual test. Explain why:
- [ ] This is a refactor/code format and no logic has been changed.
        - [ ] Previous test can cover this change.
        - [ ] No code files have been changed.
        - [ ] Other reason <!-- Add your reason?  -->

- Behavior changed:
    - [x] No.
    - [ ] Yes. <!-- Explain the behavior change -->

- Does this need documentation?
    - [x] No.
    - [ ] Yes. <!-- Add document PR link here. eg: apache/doris-website#1214 -->

### Check List (For Reviewer who merge this PR)

- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR
should merge into -->
seawinde pushed a commit that referenced this pull request Nov 17, 2025
…57888)

We previously had a crash. The cause is that the request must not be
accessed after calling add_block(...), because add_block may enqueue a
closure that runs on another thread and frees the request.

```
==730145==ERROR: AddressSanitizer: heap-buffer-overflow on address 0x7be1efd803a0 at pc 0x556b38d9d625 bp 0x7b16bf0193f0 sp 0x7b16bf0193e8
READ of size 4 at 0x7be1efd803a0 thread T1559
    #0 0x556b38d9d624 in google::protobuf::internal::RepeatedPtrFieldBase::size() const /home/zcp/repo_center/doris_master/doris/thirdparty/installed/include/google/protobuf/repeated_ptr_field.h:185:29
    #1 0x556b408ab062 in google::protobuf::RepeatedPtrField<doris::PBlock>::size() const /home/zcp/repo_center/doris_master/doris/thirdparty/installed/include/google/protobuf/repeated_ptr_field.h:1248:32
    #2 0x556b408aaff4 in doris::PTransmitDataParams::_internal_blocks_size() const /home/zcp/repo_center/doris_master/doris/be/../gensrc/build/gen_cpp/internal_service.pb.h:32149:25
    apache#3 0x556b4089731c in doris::PTransmitDataParams::blocks_size() const /home/zcp/repo_center/doris_master/doris/be/../gensrc/build/gen_cpp/internal_service.pb.h:32152:10
    apache#4 0x556b60a83c17 in doris::vectorized::VDataStreamMgr::transmit_block(doris::PTransmitDataParams const*, google::protobuf::Closure**, long) /home/zcp/repo_center/doris_master/doris/be/src/vec/runtime/vdata_stream_mgr.cpp:150:38
    apache#5 0x556b407f7408 in doris::PInternalService::_transmit_block(google::protobuf::RpcController*, doris::PTransmitDataParams const*, doris::PTransmitDataResult*, google::protobuf::Closure*, doris::Status const&, long) /home/zcp/repo_center/doris_master/doris/be/src/service/internal_service.cpp:1673:40
    apache#6 0x556b407f52bb in doris::PInternalService::transmit_block(google::protobuf::RpcController*, doris::PTransmitDataParams const*, doris::PTransmitDataResult*, google::protobuf::Closure*) /home/zcp/repo_center/doris_master/doris/be/src/service/internal_service.cpp:1610:9
    apache#7 0x556b43fceba2 in doris::PBackendService::CallMethod(google::protobuf::MethodDescriptor const*, google::protobuf::RpcController*, google::protobuf::Message const*, google::protobuf::Message*, google::protobuf::Closure*) /home/zcp/repo_center/doris_master/doris/gensrc/build/gen_cpp/internal_service.pb.cc:49452:7
    apache#8 0x556b6736273e in brpc::policy::ProcessRpcRequest(brpc::InputMessageBase*) (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x770bd73e)
    apache#9 0x556b67357426 in brpc::ProcessInputMessage(void*) (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x770b2426)
    apache#10 0x556b67357f20 in brpc::InputMessenger::InputMessageClosure::~InputMessageClosure() (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x770b2f20)
    apache#11 0x556b673588dd in brpc::InputMessenger::OnNewMessages(brpc::Socket*) (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x770b38dd)
    apache#12 0x556b674a0adc in brpc::Socket::ProcessEvent(void*) (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x771fbadc)
    apache#13 0x556b672e0f76 in bthread::TaskGroup::task_runner(long) (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x7703bf76)
    apache#14 0x556b672cbbe0 in bthread_make_fcontext (/mnt/hdd01/selectdb-cloud-chaos/cluster0/be/lib/doris_be+0x77026be0)
```

```
        Note: The done pointer will be saved in add_block and may be called in another thread via done->Run().
        For example, when blocks_size == 1, the process is as follows:
        transmit_block (i=0)
          └─> recvr->add_block(..., done, ...)  // Pass done
               └─> SenderQueue::add_block
                    └─> _pending_closures.push(done)  // done is saved

        get_batch() [another thread]
          └─> closure_pair.first->Run()  // ⚠️ done->Run() is called
               └─> brpc releases request and response

        transmit_block (i=1)  [original thread continues]
          └─> request->blocks_size()  // ⚠️ request has already been released!

        At this point, a use-after-free issue occurs.

        TODO: We should consider refactoring this part because add_block may release the request.
        We should not access the request after calling add_block.
```
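
As a hedged illustration of the pattern described in the note above,
here is a minimal, self-contained C++ sketch; `Request`, `run_done`, and
`transmit_blocks` are made-up names for the example, not the actual
Doris or brpc interfaces. Everything needed from the request is copied
into locals before the first call that may hand the closure to another
thread:

```
// Hypothetical sketch of the safe access pattern: copy what you need from
// the request before any call that may trigger the closure that frees it.
#include <cstdio>
#include <memory>

struct Request {
    int blocks = 3;
    int blocks_size() const { return blocks; }
};

// Stand-in for the brpc closure that releases the request/response when run.
void run_done(std::unique_ptr<Request>& req) { req.reset(); }

void transmit_blocks(std::unique_ptr<Request>& req) {
    // Read blocks_size() once, up front.
    const int num_blocks = req->blocks_size();
    for (int i = 0; i < num_blocks; ++i) {
        // Pretend add_block enqueued the closure and another thread ran it
        // right after the first block was handed over.
        if (i == 0) run_done(req);
        // Looping on req->blocks_size() here instead of num_blocks would be
        // exactly the use-after-free described above.
        std::printf("sent block %d of %d\n", i, num_blocks);
    }
}

int main() {
    auto req = std::make_unique<Request>();
    transmit_blocks(req);
    return 0;
}
```

In the real code the same idea amounts to caching request fields such as
blocks_size() in locals before calling add_block; whether that or the
larger refactor mentioned in the TODO is chosen is a separate decision.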


apache#50113
seawinde pushed a commit that referenced this pull request Dec 3, 2025
Related issue: apache#57884

```
MySQL [demo]> show frontends;
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
| Name                                    | Host         | EditLogPort | HttpPort | QueryPort | RpcPort | ArrowFlightSqlPort | Role     | IsMaster | ClusterId | Join | Alive | ReplayedJournalId | LastStartTime       | LastHeartbeat       | IsHelper | ErrMsg | Version                | CurrentConnected | LiveSince           |
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
| fe_a7c0b6d8_82c2_48f0_8220_fb65dd18be69 | 10.37.75.124 | 9010        | 8030     | 9030      | 9020    | 8070               | FOLLOWER | true     | 742250121 | true | true  | 2409              | 2025-11-11 14:42:16 | 2025-11-11 14:44:06 | true     |        | doris-0.0.0-009c3b552a | Yes              | 2025-11-11 14:42:16 |
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
1 row in set (0.016 sec)

MySQL [demo]> show catalog edoris;
+-----------------+-------------------------------+
| Key             | Value                         |
+-----------------+-------------------------------+
| create_time     | 2025-11-11 11:25:33.488106853 |
| fe_arrow_hosts  | 10.37.103.28:8070             |
| fe_http_hosts   | 10.37.103.28:8030             |
| fe_thrift_hosts | 10.37.103.28:9020             |
| password        | *XXX                          |
| type            | doris                         |
| use_meta_cache  | true                          |
| user            | test                          |
+-----------------+-------------------------------+
8 rows in set (0.002 sec)

MySQL [demo]> select * from inner_table;
+----------+--------+
| log_type | reason |
+----------+--------+
|        2 | inner2 |
|        3 | inner3 |
|        4 | inner4 |
+----------+--------+
3 rows in set (0.032 sec)

MySQL [demo]> select * from edoris.external.example_tbl_duplicate;
+---------------------+----------+------------+-----------+-------+---------------------+
| log_time            | log_type | error_code | error_msg | op_id | op_time             |
+---------------------+----------+------------+-----------+-------+---------------------+
| 2024-11-01 00:00:00 |        2 |          2 | timeout   |    12 | 2024-11-01 01:00:00 |
+---------------------+----------+------------+-----------+-------+---------------------+
1 row in set (0.059 sec)

MySQL [demo]> select * from inner_table a join edoris.external.example_tbl_duplicate b on (a.log_type = b.log_type);
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
| log_type | reason | log_time            | log_type | error_code | error_msg | op_id | op_time             |
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
|        2 | inner2 | 2024-11-01 00:00:00 |        2 |          2 | timeout   |    12 | 2024-11-01 01:00:00 |
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
1 row in set (0.050 sec)

MySQL [demo]> explain select * from inner_table a join edoris.external.example_tbl_duplicate b on (a.log_type = b.log_type) where error_code=2;
+-------------------------------------------------------------------------------------------------------------------------------------------+
| Explain String(Nereids Planner)                                                                                                           |
+-------------------------------------------------------------------------------------------------------------------------------------------+
| PLAN FRAGMENT 0                                                                                                                           |
|   OUTPUT EXPRS:                                                                                                                           |
|     log_type[apache#16]                                                                                                                         |
|     reason[apache#17]                                                                                                                           |
|     log_time[apache#18]                                                                                                                         |
|     log_type[apache#19]                                                                                                                         |
|     error_code[apache#20]                                                                                                                       |
|     error_msg[apache#21]                                                                                                                        |
|     op_id[apache#22]                                                                                                                            |
|     op_time[apache#23]                                                                                                                          |
|   PARTITION: HASH_PARTITIONED: log_type[apache#6]                                                                                               |
|                                                                                                                                           |
|   HAS_COLO_PLAN_NODE: false                                                                                                               |
|                                                                                                                                           |
|   VRESULT SINK                                                                                                                            |
|      MYSQL_PROTOCOL                                                                                                                       |
|                                                                                                                                           |
|   3:VHASH JOIN(200)                                                                                                                       |
|   |  join op: INNER JOIN(BROADCAST)[]                                                                                                     |
|   |  equal join conjunct: (log_type[apache#6] = log_type[#1])                                                                                   |
|   |  cardinality=3                                                                                                                        |
|   |  vec output tuple id: 3                                                                                                               |
|   |  output tuple id: 3                                                                                                                   |
|   |  vIntermediate tuple ids: 2                                                                                                           |
|   |  hash output slot ids: 0 1 2 3 4 5 6 7                                                                                                |
|   |  runtime filters: RF000[min_max] <- log_type[#1](1/1/1048576), RF001[in_or_bloom] <- log_type[#1](1/1/1048576)                        |
|   |  final projections: log_type[apache#8], reason[apache#9], log_time[apache#10], log_type[apache#11], error_code[apache#12], error_msg[apache#13], op_id[apache#14], op_time[apache#15] |
|   |  final project output tuple id: 3                                                                                                     |
|   |  distribute expr lists: log_type[apache#6]                                                                                                  |
|   |  distribute expr lists:                                                                                                               |
|   |                                                                                                                                       |
|   |----1:VEXCHANGE                                                                                                                        |
|   |       offset: 0                                                                                                                       |
|   |       distribute expr lists: log_type[#1]                                                                                             |
|   |                                                                                                                                       |
|   2:VOlapScanNode(187)                                                                                                                    |
|      TABLE: demo.inner_table(inner_table), PREAGGREGATION: ON                                                                             |
|      partitions=1/1 (inner_table)                                                                                                         |
|      tablets=1/1, tabletList=1762832514491                                                                                                |
|      cardinality=3, avgRowSize=901.6666, numNodes=1                                                                                       |
|      pushAggOp=NONE                                                                                                                       |
|      runtime filters: RF000[min_max] -> log_type[apache#6], RF001[in_or_bloom] -> log_type[apache#6]                                                  |
|                                                                                                                                           |
| PLAN FRAGMENT 1                                                                                                                           |
|                                                                                                                                           |
|   PARTITION: HASH_PARTITIONED: log_type[#1]                                                                                               |
|                                                                                                                                           |
|   HAS_COLO_PLAN_NODE: false                                                                                                               |
|                                                                                                                                           |
|   STREAM DATA SINK                                                                                                                        |
|     EXCHANGE ID: 01                                                                                                                       |
|     UNPARTITIONED                                                                                                                         |
|                                                                                                                                           |
|   0:VOlapScanNode(188)                                                                                                                    |
|      TABLE: external.example_tbl_duplicate(example_tbl_duplicate), PREAGGREGATION: ON                                                     |
|      PREDICATES: (error_code[#2] = 2)                                                                                                     |
|      partitions=1/1 (example_tbl_duplicate)                                                                                               |
|      tablets=1/1, tabletList=1762481736238                                                                                                |
|      cardinality=1, avgRowSize=7425.0, numNodes=1                                                                                         |
|      pushAggOp=NONE                                                                                                                       |
|                                                                                                                                           |
|                                                                                                                                           |
|                                                                                                                                           |
| ========== STATISTICS ==========                                                                                                          |
| planed with unknown column statistics                                                                                                     |
+-------------------------------------------------------------------------------------------------------------------------------------------+
65 rows in set (0.040 sec)

```
seawinde pushed a commit that referenced this pull request Dec 7, 2025
…shard_ptr not initialized (apache#58751)

```
#0  0x000055f11470ff29 in std::__shared_ptr<doris::TabletMeta, (__gnu_cxx::_Lock_policy)2>::get (this=0x148) at /home/work/env/ldb_toolchain_master/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/shared_ptr_base.h:1673

#1  std::__shared_ptr_access<doris::TabletMeta, (__gnu_cxx::_Lock_policy)2, false, false>::_M_get (this=0x148) at /home/work/env/ldb_toolchain_master/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/shared_ptr_base.h:1370

#2  std::__shared_ptr_access<doris::TabletMeta, (__gnu_cxx::_Lock_policy)2, false, false>::operator-> (this=0x148) at /home/work/env/ldb_toolchain_master/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/shared_ptr_base.h:1364

apache#3  doris::BaseTablet::tablet_id (this=0x0) at /home/work/doris/be/src/olap/base_tablet.h:73

apache#4  doris::segment_v2::SegmentWriter::finalize (this=0x7fa160bde000, segment_file_size=0x7fa564dcc668, index_size=0x7fa564dcc5b8) at /home/work/doris/be/src/olap/rowset/segment_v2/segment_writer.cpp:1044

apache#5  0x000055f11442578c in doris::SegmentFlusher::_flush_segment_writer (this=0x7fa160b8a420, writer=..., flush_size=flush_size@entry=0x0) at /home/work/doris/be/src/olap/rowset/segment_creator.cpp:304

apache#6  0x000055f114426c4e in doris::SegmentFlusher::Writer::flush (this=<optimized out>) at /home/work/doris/be/src/olap/rowset/segment_creator.cpp:376

apache#7  doris::SegmentCreator::flush (this=0x7fa160b8a418) at /home/work/doris/be/src/olap/rowset/segment_creator.cpp:422

apache#8  0x000055f1143d4785 in doris::BaseBetaRowsetWriter::flush (this=<optimized out>) at /home/work/doris/be/src/olap/rowset/beta_rowset_writer.cpp:723

apache#9  0x000055f11436bf47 in doris::Merger::vmerge_rowsets (tablet=..., reader_type=<optimized out>, cur_tablet_schema=..., src_rowset_readers=..., dst_rowset_writer=0x7fa160b8a000, stats_output=0x7fa1df108298) at /home/work/doris/be/src/olap/merger.cpp:159

apache#10 0x000055f114345595 in doris::Compaction::merge_input_rowsets (this=this@entry=0x7fa1df108210) at /home/work/doris/be/src/olap/compaction.cpp:220

apache#11 0x000055f1143550d2 in doris::CloudCompactionMixin::execute_compact_impl (this=this@entry=0x7fa1df108210, permits=permits@entry=6) at /home/work/doris/be/src/olap/compaction.cpp:1491

apache#12 0x000055f114342bc1 in doris::CloudCompactionMixin::execute_compact (this=0x7fa1df108210) at /home/work/doris/be/src/olap/compaction.cpp:1620

apache#13 0x000055f11a22c6d8 in doris::CloudCumulativeCompaction::execute_compact (this=0x7fa1df108210) at /home/work/doris/be/src/cloud/cloud_cumulative_compaction.cpp:203
```
seawinde pushed a commit that referenced this pull request Dec 18, 2025
…e#59098)

### What problem does this PR solve?

Introduced by apache#58905

```
==2037076==ERROR: AddressSanitizer: heap-use-after-free on address
0x7baaae908730 at pc 0x561b769a1fd0 bp 0x7b3caf4ebdf0 sp 0x7b3caf4ebde8
22:30:08  READ of size 1 at 0x7baaae908730 thread T12303 (rs_normal
[work)
22:30:08  #0 0x561b769a1fcf in doris::(anonymous
namespace)::string_compare(char const*, long, char const*, long, long)
/root/doris/be/src/vec/common/string_ref.h:170:29
22:30:08  #1 0x561b769a1fcf in
doris::StringRef::compare(doris::StringRef const&) const
/root/doris/be/src/vec/common/string_ref.h:259:30
22:30:08  #2 0x561b76f537cd in doris::StringRef::ge(doris::StringRef
const&) const /root/doris/be/src/vec/common/string_ref.h:282:52
22:30:08  apache#3 0x561b76f537cd in
doris::StringRef::operator>=(doris::StringRef const&) const
/root/doris/be/src/vec/common/string_ref.h:292:60
22:30:08  apache#4 0x561b76f537cd in bool
doris::Compare::greater_equal<doris::StringRef>(doris::StringRef const&,
doris::StringRef const&) /root/doris/be/src/common/compare.h:42:18
22:30:08  apache#5 0x561b76f537cd in
doris::ComparisonPredicateBase<(doris::PrimitiveType)23,
(doris::PredicateType)6>::camp_field(doris::vectorized::Field const&,
doris::vectorized::Field const&) const
/root/doris/be/src/olap/comparison_predicate.h:192:20
22:30:08  apache#6 0x561b76f4baa4 in
doris::ComparisonPredicateBase<(doris::PrimitiveType)23,
(doris::PredicateType)6>::evaluate_and(doris::vectorized::ParquetPredicate::ColumnStat*)
const /root/doris/be/src/olap/comparison_predicate.h:207:26
22:30:08  apache#7 0x561b76765284 in
doris::AndBlockColumnPredicate::evaluate_and(doris::vectorized::ParquetPredicate::ColumnStat*)
const /root/doris/be/src/olap/block_column_predicate.h:251:42
22:30:08  apache#8 0x561b89acd735 in
doris::vectorized::ParquetReader::_process_column_stat_filter(tparquet::RowGroup
const&, std::vector<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> >,
std::allocator<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> > > > const&,
bool*, bool*, bool*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:1225:25
22:30:08  apache#9 0x561b89ac8dd7 in
doris::vectorized::ParquetReader::_process_min_max_bloom_filter(doris::vectorized::RowGroupReader::RowGroupIndex
const&, tparquet::RowGroup const&,
std::vector<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> >,
std::allocator<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> > > > const&,
doris::segment_v2::RowRanges*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:1108:9
22:30:08  apache#10 0x561b89ac3e73 in
doris::vectorized::ParquetReader::_next_row_group_reader()
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:718:9
22:30:08  apache#11 0x561b89ac008f in
doris::vectorized::ParquetReader::get_next_block(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:607:21
22:30:08  apache#12 0x561b8a07c6f7 in
doris::vectorized::HiveReader::get_next_block_inner(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/table/hive_reader.cpp:32:5
22:30:08  apache#13 0x561b89fee256 in
doris::vectorized::TableFormatReader::get_next_block(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/table/table_format_reader.h:81:16
22:30:08  apache#14 0x561b89f71b97 in
doris::vectorized::FileScanner::_get_block_wrapped(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/file_scanner.cpp:472:13
22:30:08  apache#15 0x561b89f7086f in
doris::vectorized::FileScanner::_get_block_impl(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/file_scanner.cpp:409:17
22:30:08  apache#16 0x561b8a19f86e in
doris::vectorized::Scanner::get_block(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/scanner.cpp:109:17
22:30:08  apache#17 0x561b8a19f0a6 in
doris::vectorized::Scanner::get_block_after_projects(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/scanner.cpp:85:16
22:30:08  apache#18 0x561b8a1ccd0f in
doris::vectorized::ScannerScheduler::_scanner_scan(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:173:5
22:30:08  apache#19 0x561b8a1d6875 in
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()::operator()() const::'lambda'()::operator()() const
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:76:17
22:30:08  apache#20 0x561b8a1d6875 in
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()::operator()() const
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:75:27
22:30:08  apache#21 0x561b8a1d6875 in bool std::__invoke_impl<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>(std::__invoke_other,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:63:14
22:30:08  apache#22 0x561b8a1d6875 in std::enable_if<is_invocable_r_v<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>, bool>::type std::__invoke_r<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>(doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:116:9
22:30:08  apache#23 0x561b8a1d6875 in std::_Function_handler<bool (),
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()>::_M_invoke(std::_Any_data const&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:292:9
22:30:08  apache#24 0x561b8a1d5f07 in std::function<bool ()>::operator()()
const
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:593:9
22:30:08  apache#25 0x561b8a1d5f07 in
doris::vectorized::ScannerSplitRunner::process_for(std::chrono::duration<long,
std::ratio<1l, 1000000000l> >)
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:407:25
22:30:08  apache#26 0x561b8a2c56d4 in
doris::vectorized::PrioritizedSplitRunner::process()
/root/doris/be/src/vec/exec/executor/time_sharing/prioritized_split_runner.cpp:103:35
22:30:08  apache#27 0x561b8a29045c in
doris::vectorized::TimeSharingTaskExecutor::_dispatch_thread()
/root/doris/be/src/vec/exec/executor/time_sharing/time_sharing_task_executor.cpp:570:77
22:30:08  apache#28 0x561b7b9fecb6 in std::function<void ()>::operator()()
const
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:593:9
22:30:08  apache#29 0x561b7b9fecb6 in doris::Thread::supervise_thread(void*)
/root/doris/be/src/util/thread.cpp:460:5
22:30:08  apache#30 0x561b76044d26 in asan_thread_start(void*)
(/mnt/ssd01/pipline/OpenSourceDoris/clusterEnv/P1/Cluster0/be/lib/doris_be+0x23962d26)
22:30:08  apache#31 0x7f4aaae68608 in start_thread
/build/glibc-SzIz7B/glibc-2.31/nptl/pthread_create.c:477:8
22:30:08  apache#32 0x7f4aaad7b132 in __clone
/build/glibc-SzIz7B/glibc-2.31/misc/../sysdeps/unix/sysv/linux/x86_64/clone.S:95
```
seawinde pushed a commit that referenced this pull request Jan 22, 2026
…pache#59920)

### What problem does this PR solve?
When `BaseBetaRowsetWriter` is destructed (which may happen when the
load is canceled) before the task submitted to the thread pool executes,
the task may encounter a coredump due to a use-after-free.
```
(gdb) bt
#0  __GI___pthread_sigmask (how=2, newmask=<optimized out>, oldmask=0x0) at ./nptl/pthread_sigmask.c:43
#1  0x00007fa1d0c1171e in PosixSignals::chained_handler(int, siginfo*, void*) [clone .part.0] () from /usr/lib/jvm/java-17-openjdk-amd64/lib/server/libjvm.so
#2  0x00007fa1d0c12206 in JVM_handle_linux_signal () from /usr/lib/jvm/java-17-openjdk-amd64/lib/server/libjvm.so
apache#3  <signal handler called>
apache#4  doris::TUniqueId::TUniqueId (this=0x7f99955f2208, other51=...) at /home/zcp/repo_center/doris_branch-4.0/doris/gensrc/build/gen_cpp/Types_types.cpp:2571
apache#5  0x00005653d14008ca in doris::AttachTask::init (rc=..., this=<optimized out>) at /home/zcp/repo_center/doris_branch-4.0/doris/be/src/runtime/thread_context.cpp:29
apache#6  doris::AttachTask::AttachTask (this=<optimized out>, rc=...) at /home/zcp/repo_center/doris_branch-4.0/doris/be/src/runtime/thread_context.cpp:34
apache#7  0x00005653d0d05087 in doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}::operator()() const (this=0x7f9cdf302500)
    at /home/zcp/repo_center/doris_branch-4.0/doris/be/src/olap/calc_delete_bitmap_executor.h:74
apache#8  std::__invoke_impl<void, doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}&>(std::__invoke_other, doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}&) (__f=...)
    at /usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:63
apache#9  std::__invoke_r<void, doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}&>(doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}&) (__fn=...) at /usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:113
apache#10 std::_Function_handler<void (), doris::CalcDeleteBitmapToken::submit_func<doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0>(doris::BaseBetaRowsetWriter::_generate_delete_bitmap(int)::$_0&&)::{lambda()#1}>::_M_invoke(std::_Any_data const&) (
    __functor=...) at /usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:292
apache#11 0x00005653d16392e5 in doris::ThreadPool::dispatch_thread (this=0x7fa120d9af00) at /home/zcp/repo_center/doris_branch-4.0/doris/be/src/util/threadpool.cpp:616
apache#12 0x00005653d162e38c in std::function<void ()>::operator()() const (this=0x7f99955f2208) at /usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:593
apache#13 doris::Thread::supervise_thread (arg=0x7fa0c0049110) at /home/zcp/repo_center/doris_branch-4.0/doris/be/src/util/thread.cpp:460
apache#14 0x00007fa1cfcacac3 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
apache#15 0x00007fa1cfd3e850 in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
```
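
One common way to avoid this class of bug is to let the submitted task
share ownership of the state it touches, so that destroying the original
owner cannot free memory the task still needs. The sketch below is a
hypothetical C++ illustration of that idea; `RowsetWriter` and the
hand-rolled task queue are stand-ins, not the Doris implementation:

```
// Hypothetical sketch: the queued task captures a shared_ptr, so the writer
// stays alive even if its original owner is destroyed (e.g. load canceled).
#include <functional>
#include <iostream>
#include <memory>
#include <thread>
#include <vector>

struct RowsetWriter {
    void generate_delete_bitmap(int segment) {
        std::cout << "delete bitmap for segment " << segment << "\n";
    }
};

int main() {
    std::vector<std::function<void()>> queue; // stand-in for a thread pool

    {
        auto writer = std::make_shared<RowsetWriter>();
        // Capture the shared_ptr by value: the task now owns a reference.
        queue.push_back([writer] { writer->generate_delete_bitmap(0); });
        // writer's original owner goes away here...
    }

    // ...but the queued task still holds a live RowsetWriter.
    std::thread worker([&queue] {
        for (auto& task : queue) task();
    });
    worker.join();
    return 0;
}
```

Weak ownership (std::weak_ptr plus a liveness check inside the task) is
another option when the task should simply be skipped if the owner is
already gone.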

### Release note

None

### Check List (For Author)

- Test <!-- At least one of them must be included. -->
    - [ ] Regression test
    - [ ] Unit Test
    - [ ] Manual test (add detailed scripts or steps below)
    - [ ] No need to test or manual test. Explain why:
    - [ ] This is a refactor/code format and no logic has been changed.
        - [ ] Previous test can cover this change.
        - [ ] No code files have been changed.
        - [ ] Other reason <!-- Add your reason?  -->

- Behavior changed:
    - [ ] No.
    - [ ] Yes. <!-- Explain the behavior change -->

- Does this need documentation?
    - [ ] No.
    - [ ] Yes. <!-- Add document PR link here. eg: apache/doris-website#1214 -->

### Check List (For Reviewer who merge this PR)

- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR
should merge into -->
seawinde pushed a commit that referenced this pull request Jan 30, 2026
…che#5027)

pick apache#56854
```
start BE in cloud mode, cloud_unique_id: ddc_cloud_unique_id_be, meta_service_endpoint: 172.20.56.123:5000
/mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/vfile_result_writer.cpp:139:71: runtime error: load of value 190, which is not a valid value for type 'bool'
    #0 0x55b5e9a1e7a0 in doris::vectorized::VFileResultWriter::_create_file_writer(std::__cxx11::basic_string, std::allocator> const&) /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/vfile_result_writer.cpp:139:71
    #1 0x55b5e9a1ac9f in doris::vectorized::VFileResultWriter::_create_next_file_writer() /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/vfile_result_writer.cpp:116:12
    #2 0x55b5e9a16445 in doris::vectorized::VFileResultWriter::open(doris::RuntimeState*, doris::RuntimeProfile*) /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/vfile_result_writer.cpp:100:12
    apache#3 0x55b5e7677e7c in doris::vectorized::AsyncResultWriter::process_block(doris::RuntimeState*, doris::RuntimeProfile*) /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/async_result_writer.cpp:106:23
    apache#4 0x55b5e767ca7a in doris::vectorized::AsyncResultWriter::start_writer(doris::RuntimeState*, doris::RuntimeProfile*)::$_0::operator()() const /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/vec/sink/writer/async_result_writer.cpp:93:5
    apache#5 0x55b5e767ca7a in void std::__invoke_impl(std::__invoke_other, doris::vectorized::AsyncResultWriter::start_writer(doris::RuntimeState*, doris::RuntimeProfile*)::$_0&) /var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:61:14
    apache#6 0x55b5e767ca7a in std::enable_if, void>::type std::__invoke_r(doris::vectorized::AsyncResultWriter::start_writer(doris::RuntimeState*, doris::RuntimeProfile*)::$_0&) /var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:111:2
    apache#7 0x55b5e767ca7a in std::_Function_handler::_M_invoke(std::_Any_data const&) /var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:291:9
    apache#8 0x55b5b50496a7 in doris::ThreadPool::dispatch_thread() /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/util/threadpool.cpp:602:24
    apache#9 0x55b5b50206de in std::function::operator()() const /var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:560:9
    apache#10 0x55b5b50206de in doris::Thread::supervise_thread(void*) /mnt/disk3/pipeline/repo/selectdb-core_branch-selectdb-doris-3.1/selectdb-core/be/src/util/thread.cpp:498:5
```
seawinde pushed a commit that referenced this pull request Feb 5, 2026
…lain verbose (apache#60308)

### What problem does this PR solve?
Problem Summary:
This PR enhances the output of EXPLAIN VERBOSE for File Scan nodes by
adding the following metrics:
`dataFileNum=xxx, deleteFileNum=xxx, deleteSplitNum=xxx`
This is especially useful for Iceberg/Paimon/Hive ACID tables.

These metrics provide more visibility into the underlying file and split
layout, helping users better tune parameters and control query
performance.

Details:
`dataFileNum`: the number of distinct data files that need to be read.
This is not equivalent to the number of splits, since a single data file
can be divided into multiple splits.

`deleteFileNum`: the number of distinct delete files that need to be
read.

`deleteSplitNum`: added because the relationship between data files and
delete files is many-to-many:
- one data file may be associated with multiple delete files
- one delete file may apply to multiple data files

Using deleteSplitNum / dataSplitNum, users can estimate the average
number of delete splits that need to be read per data split.

Example:
```
mysql> explain verbose select * from iceberg.format_v3.dv_test_1w;
+-----------------------------------------------------------------------------------------------------------------------------------------------+
| Explain String(Nereids Planner)                                                                                                               |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
| PLAN FRAGMENT 0                                                                                                                               |
|   OUTPUT EXPRS:                                                                                                                               |
|     id[#0]                                                                                                                                    |
|     grp[#1]                                                                                                                                   |
|     value[#2]                                                                                                                                 |
|     ts[apache#3]                                                                                                                                    |
|   PARTITION: RANDOM                                                                                                                           |
|                                                                                                                                               |
|   HAS_COLO_PLAN_NODE: false                                                                                                                   |
|                                                                                                                                               |
|   VRESULT SINK                                                                                                                                |
|      MYSQL_PROTOCOL                                                                                                                           |
|                                                                                                                                               |
|   0:VICEBERG_SCAN_NODE(32)                                                                                                                    |
|      table: iceberg.format_v3.dv_test_1w                                                                                                      |
|      inputSplitNum=220, totalFileSize=720774, scanRanges=220                                                                                  |
|      partition=0/0                                                                                                                            |
|      backends:                                                                                                                                |
|        1769590309070                                                                                                                          |
|          s3://warehouse/wh/format_v3/dv_test_1w/data/00004-51-fc462f9a-d42a-404d-adfc-c8d2781c8d04-0-00001.parquet start: 4 length: 2672      |
|          s3://warehouse/wh/format_v3/dv_test_1w/data/00003-50-fc462f9a-d42a-404d-adfc-c8d2781c8d04-0-00001.parquet start: 4 length: 2852      |
|          s3://warehouse/wh/format_v3/dv_test_1w/data/00000-47-fc462f9a-d42a-404d-adfc-c8d2781c8d04-0-00001.parquet start: 4 length: 2894      |
|          ... other 216 files ...                                                                                                              |
|          s3://warehouse/wh/format_v3/dv_test_1w/data/00001-48-fc462f9a-d42a-404d-adfc-c8d2781c8d04-0-00001.parquet start: 58397 length: 13894 |
|          dataFileNum=10, deleteFileNum=1 deleteSplitNum=220                                                                               |
|      cardinality=33334, numNodes=1                                                                                                            |
|      pushdown agg=NONE                                                                                                                        |
|      tuple ids: 0                                                                                                                             |
|                                                                                                                                               |
| Tuples:                                                                                                                                       |
| TupleDescriptor{id=0, tbl=dv_test_1w}                                                                                                         |
|   SlotDescriptor{id=0, col=id, colUniqueId=1, type=bigint, nullable=true, isAutoIncrement=false, subColPath=null, virtualColumn=null}         |
|   SlotDescriptor{id=1, col=grp, colUniqueId=2, type=int, nullable=true, isAutoIncrement=false, subColPath=null, virtualColumn=null}           |
|   SlotDescriptor{id=2, col=value, colUniqueId=3, type=int, nullable=true, isAutoIncrement=false, subColPath=null, virtualColumn=null}         |
|   SlotDescriptor{id=3, col=ts, colUniqueId=4, type=datetimev2(6), nullable=true, isAutoIncrement=false, subColPath=null, virtualColumn=null}  |
|                                                                                                                                               |
|                                                                                                                                               |
|                                                                                                                                               |
|                                                                                                                                               |
| ========== STATISTICS ==========                                                                                                              |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
```