Conversation

@hello-stephen (Owner) commented Oct 31, 2025

What problem does this PR solve?

Issue Number: close #xxx

Related PR: #xxx

Problem Summary:

Release note

None

Check List (For Author)

  • Test

    • Regression test
    • Unit Test
    • Manual test (add detailed scripts or steps below)
    • No need to test or manual test. Explain why:
      • This is a refactor/code format and no logic has been changed.
      • Previous test can cover this change.
      • No code files have been changed.
      • Other reason
  • Behavior changed:

    • No.
    • Yes.
  • Does this need documentation?

    • No.
    • Yes.

Check List (For Reviewer who merge this PR)

  • Confirm the release note
  • Confirm test cases
  • Confirm document
  • Add branch pick label

Description by Korbit AI

What change is being made?

Fix a grammatical error in a comment in be/src/common/config.h: change "If you want a to modify the value of config" to "If you want to modify the value of config".

Why are these changes being made?

To provide a clear and correct documentation comment, with no behavioral change. This improves the readability and accuracy of the codebase's comments.

Is this description stale? Ask me to generate a new description by commenting /korbit-generate-pr-description

@hello-stephen (Owner, Author) commented

run buildall

@korbit-ai (bot) left a comment

I've completed my review and didn't find any issues... but I did find this penguin.

 __
( o>
///\
\V_/_


@github-actions (bot) commented

sh-checker report

To get the full details, please check the job output.

shellcheck errors

'shellcheck' returned error 1 after finding the following syntax issues:

----------

In build.sh line 220:
            BUILD_SPARK_DPP=1
            ^-------------^ SC2034 (warning): BUILD_SPARK_DPP appears unused. Verify use (or export if used externally).
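
A hedged sketch for the SC2034 warning: if BUILD_SPARK_DPP is consumed by a child process (for example, passed to CMake through the environment — an assumption to verify against build.sh), exporting it makes that intent explicit; otherwise the dead assignment can simply be removed.

            export BUILD_SPARK_DPP=1    # only if a subprocess actually reads it (assumption)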


In build.sh line 515:
FEAT+=($([[ -n "${WITH_TDE_DIR}" ]] && echo "+TDE" || echo "-TDE"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 516:
FEAT+=($([[ "${ENABLE_HDFS_STORAGE_VAULT:-OFF}" == "ON" ]] && echo "+HDFS_STORAGE_VAULT" || echo "-HDFS_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 517:
FEAT+=($([[ ${BUILD_UI} -eq 1 ]] && echo "+UI" || echo "-UI"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 518:
FEAT+=($([[ "${BUILD_AZURE}" == "ON" ]] && echo "+AZURE_BLOB,+AZURE_STORAGE_VAULT" || echo "-AZURE_BLOB,-AZURE_STORAGE_VAULT"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 519:
FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).


In build.sh line 520:
FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
       ^-- SC2207 (warning): Prefer mapfile or read -a to split command output (or quote to avoid splitting).
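
For the run of SC2207 warnings above, each command substitution emits exactly one whitespace-free feature token (per the echo arguments), so quoting the substitution inside the array append avoids the splitting hazard without needing mapfile. A minimal sketch for the first case, applicable to all six:

    FEAT+=("$([[ -n "${WITH_TDE_DIR}" ]] && echo "+TDE" || echo "-TDE")")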


In build.sh line 522:
export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
       ^----------------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
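
For SC2155, assigning first and exporting separately keeps the command substitution's exit status observable; a minimal sketch:

    DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
    export DORIS_FEATURE_LIST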


In build.sh line 666:
        -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                    ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In build.sh line 795:
    if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
       ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                                            ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" = "Darwin" ]] || [[ "${TARGET_SYSTEM}" = "Linux" ]]; then


In build.sh line 921:
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'x86_64' ]]; then
                                                  ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'x86_64' ]]; then


In build.sh line 925:
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "$TARGET_ARCH" == 'aarch64' ]]; then
                                                    ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    elif [[ "${TARGET_SYSTEM}" == 'Linux' ]] && [[ "${TARGET_ARCH}" == 'aarch64' ]]; then


In cloud/script/start.sh line 59:
  source "${custom_start}" 
         ^---------------^ SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
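
SC1090 is satisfied by a ShellCheck directive on the preceding line; since ${custom_start} is only known at runtime, the usual choice is to tell ShellCheck not to try resolving it:

    # shellcheck source=/dev/null
    source "${custom_start}"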


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.
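
The line was evidently meant as a comment; shell comments use # rather than //:

    # clear output file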


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.
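
SC2154 here (and for ES_8_HOST below) most likely means the variable is injected by the docker-compose environment rather than assigned in the script — an assumption worth confirming. If so, either add a disable directive or fail fast with a parameter-expansion guard near the top of the script:

    : "${ES_7_HOST:?must be set by the compose environment}"
    : "${ES_8_HOST:?must be set by the compose environment}"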


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 166:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 167:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 173:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 209:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 210:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 24:
    [ -e "$file" ] || continue
    ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    [[ -e "${file}" ]] || continue


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 25:
    tar -xzvf "$file" -C "$AUX_LIB"
               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    tar -xzvf "${file}" -C "${AUX_LIB}"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 38:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
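
For SC2091, nc should be invoked directly instead of executing its (empty) output:

    while ! nc -z localhost "${HMS_PORT:-9083}"; do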


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
if [[ ${NEED_LOAD_DATA} = "0" ]]; then
      ^---------------^ SC2154 (warning): NEED_LOAD_DATA is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 49:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 54:
    echo "Script: create_paimon_table.hql executed in $EXECUTION_TIME seconds"
                                                      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Script: create_paimon_table.hql executed in ${EXECUTION_TIME} seconds"


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 64:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
                                                                      ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                                                                                       ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 119:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                                    ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
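
For SC2011, find with -print0 keeps non-alphanumeric filenames intact; the SC2016 note is probably intentional if the single-quoted body is meant to be evaluated by the inner bash -ec rather than the outer shell (an assumption). A sketch of the pipeline head, with the quoted body left unchanged:

    find /mnt/scripts/create_preinstalled_scripts -type f -name '*.hql' -print0 |
        xargs -0 -n 1 -P "${LOAD_PARALLEL}" -I {} bash -ec '…same body…'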


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 22:
find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
     ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
find "${CUR_DIR}"/data -type f -name "*.tar.gz" -print0 | \


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 23:
xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
                ^--------------^ SC2154 (warning): LOAD_PARALLEL is referenced but not assigned.
                                          ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 33:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 34:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tpch1.db.tar.gz
                    ^-------------^ SC2154 (warning): s3BucketName is referenced but not assigned.
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2154 (warning): s3Endpoint is referenced but not assigned.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tpch1.db.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 45:
    cd ${CUR_DIR}/
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 46:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/tvf_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/tvf_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 58:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 70:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 82:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 94:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 106:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/test_tvf/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/test_tvf/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 118:
    curl -O https://${s3BucketName}.${s3Endpoint}/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz
                    ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                    ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    curl -O https://"${s3BucketName}"."${s3Endpoint}"/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz


In docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh line 144:
cd ${CUR_DIR}/auxlib
   ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
cd "${CUR_DIR}"/auxlib


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
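
For SC2125, the /* globs are presumably meant to be expanded later by java -cp rather than by the shell (an assumption about intent); quoting the right-hand side keeps them literal and silences the warning:

    HADOOP_HIVE_JARS="${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*"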


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 74:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 81:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 83:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 90:
if [[ ${enablePaimonHms} == "true" ]]; then
      ^----------------^ SC2154 (warning): enablePaimonHms is referenced but not assigned.


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 27:
echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
                                                                ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                      ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Waiting for Polaris health check at http://${HOST}:${PORT}/q/health ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 28:
for i in $(seq 1 120); do
^-^ SC2034 (warning): i appears unused. Verify use (or export if used externally).
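
For SC2034 on a loop variable used purely as a repetition counter, the conventional placeholder name _ (which ShellCheck exempts from the unused-variable check) is the simplest fix:

    for _ in $(seq 1 120); do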


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 29:
  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  if curl -sSf "http://${HOST}:8182/q/health" >/dev/null; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 38:
  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/catalog/v1/oauth/tokens" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 40:
  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
                                              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "grant_type=client_credentials&client_id=${USER}&client_secret=${PASS}&scope=PRINCIPAL_ROLE:ALL")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 43:
TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
                     ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
TOKEN=$(printf "%s" "${TOKEN_JSON}" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 45:
if [ -z "$TOKEN" ]; then
         ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ -z "${TOKEN}" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 46:
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
                                                                      ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: ${TOKEN_JSON}" >&2


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 50:
echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
                                       ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                            ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Creating catalog '${CATALOG}' with base '${BASE_LOCATION}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 53:
  "name": "$CATALOG",
           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  "name": "${CATALOG}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 56:
    "default-base-location": "$BASE_LOCATION",
                              ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "default-base-location": "${BASE_LOCATION}",


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 66:
    "allowedLocations": ["$BASE_LOCATION"]
                          ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    "allowedLocations": ["${BASE_LOCATION}"]


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 74:
  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/catalogs" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 75:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 77:
  -d "$CREATE_PAYLOAD")
      ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -d "${CREATE_PAYLOAD}")


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 79:
if [ "$HTTP_CODE" = "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" = "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 81:
elif [ "$HTTP_CODE" = "409" ]; then
        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
elif [ "${HTTP_CODE}" = "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 84:
  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Create catalog failed (HTTP ${HTTP_CODE}):"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 89:
echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
                                                         ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
echo "[polaris-init] Setting up permissions for catalog '${CATALOG}' ..."


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 94:
  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/catalogs/${CATALOG}/catalog-roles/catalog_admin/grants" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 95:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 99:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 100:
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 107:
  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X POST "http://${HOST}:${PORT}/api/management/v1/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 108:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 112:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ] && [ "${HTTP_CODE}" != "409" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 113:
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
                                                                          ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 120:
  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                           ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principal-roles/data_engineer/catalog-roles/${CATALOG}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 121:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 125:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 126:
  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
                                                              ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to connect roles (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 133:
  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                       ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -X PUT "http://${HOST}:${PORT}/api/management/v1/principals/root/principal-roles" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 134:
  -H "Authorization: Bearer $TOKEN" \
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  -H "Authorization: Bearer ${TOKEN}" \


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 138:
if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
      ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [ "${HTTP_CODE}" != "200" ] && [ "${HTTP_CODE}" != "201" ]; then


In docker/thirdparties/docker-compose/polaris/init-catalog.sh line 139:
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
                                                                                  ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP ${HTTP_CODE})"


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.
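
SC1128 means a blank line or comment precedes the shebang; the fix is to make the interpreter line the very first line of the file and move any header comments below it:

    #!/bin/bash
    # (license/header comments follow the shebang)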


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 131:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 165:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 350:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
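
tr -d "addr:" deletes every occurrence of the characters a, d, r, and : from the stream, not the literal prefix addr:. Assuming the intent is to strip that prefix from older ifconfig output, sed is a safer sketch:

    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | sed 's/^addr://' | head -n 1)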


In docker/thirdparties/run-thirdparties-docker.sh line 359:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 364:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 383:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 385:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 387:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 398:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 400:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 410:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 411:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 413:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 424:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 426:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 493:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 494:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 495:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 504:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
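
Editor's sketch of the suggested tee form (appends exactly as >> did; >/dev/null mutes tee's copy to stdout):

        echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null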


In docker/thirdparties/run-thirdparties-docker.sh line 506:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 508:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 512:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 527:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 528:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 529:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 540:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 541:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 546:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 551:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 557:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 595:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.
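
Editor's sketch, following the hint:

        mv ./*.tbl ../lakesoul/test_files/tpch/data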


In docker/thirdparties/run-thirdparties-docker.sh line 597:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
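
Editor's sketch of the standard SC2155 fix, splitting the steps so a failure of realpath is no longer masked by the export:

        TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
        export TPCH_DATA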


In docker/thirdparties/run-thirdparties-docker.sh line 609:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 616:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 656:
    . "${POLARIS_DIR}/polaris_settings.env"
      ^-- SC1091 (info): Not following: ./polaris_settings.env: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/run-thirdparties-docker.sh line 705:
if [[ "$NEED_LOAD_DATA" -eq 1 ]]; then
       ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${NEED_LOAD_DATA}" -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 711:
if [[ $need_prepare_hive_data -eq 1 ]]; then
      ^---------------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ ${need_prepare_hive_data} -eq 1 ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 832:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 833:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 834:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 836:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true


In run-be-ut.sh line 141:
    WITH_TDE_DIR        -- ${WITH_TDE_DIR}
                           ^-------------^ SC2154 (warning): WITH_TDE_DIR is referenced but not assigned.
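
Editor's sketch: if WITH_TDE_DIR is legitimately optional (an assumption; it may come from the caller's environment), an explicit empty default both documents that and silences SC2154:

    WITH_TDE_DIR        -- ${WITH_TDE_DIR:-}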


In run-cloud-ut.sh line 195:
    -DENABLE_HDFS_STORAGE_VAULT=${ENABLE_HDFS_STORAGE_VAULT:-ON} \
                                ^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    -DENABLE_HDFS_STORAGE_VAULT="${ENABLE_HDFS_STORAGE_VAULT:-ON}" \


In thirdparty/build-thirdparty.sh line 1343:
    -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
                       ^--------------^ SC2154 (warning): CMAKE_CXX_FLAGS is referenced but not assigned.
                       ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    -DCMAKE_CXX_FLAGS="${CMAKE_CXX_FLAGS} -Wno-elaborated-enum-base" \


In thirdparty/build-thirdparty.sh line 1928:
    cp -r ${TP_SOURCE_DIR}/${JINDOFS_SOURCE}/* "${TP_INSTALL_DIR}/jindofs_libs/"
          ^--------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                           ^---------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    cp -r "${TP_SOURCE_DIR}"/"${JINDOFS_SOURCE}"/* "${TP_INSTALL_DIR}/jindofs_libs/"


In thirdparty/download-thirdparty.sh line 610:
    cd $TP_SOURCE_DIR/$CCTZ_SOURCE
       ^------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
       ^------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                      ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    cd "${TP_SOURCE_DIR}"/"${CCTZ_SOURCE}"


In thirdparty/download-thirdparty.sh line 611:
    if [[ ! -f "$PATCHED_MARK" ]] ; then
                ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ ! -f "${PATCHED_MARK}" ]] ; then


In thirdparty/download-thirdparty.sh line 613:
        touch "$PATCHED_MARK"
               ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        touch "${PATCHED_MARK}"


In tools/lzo/build.sh line 1:
# Licensed to the Apache Software Foundation (ASF) under one
^-- SC2148 (error): Tips depend on target shell and yours is unknown. Add a shebang or a 'shell' directive.
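
Editor's sketch: give the file a shebang on line 1 (or, if it is only ever sourced, a directive comment such as # shellcheck shell=bash):

#!/usr/bin/env bash
# Licensed to the Apache Software Foundation (ASF) under one
# ... (rest of the file, unchanged)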


In tools/lzo/build.sh line 20:
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I${DORIS_THIRDPARTY}/installed/include -L${DORIS_THIRDPARTY}/installed/lib -llzo2 -std=c++17
                                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                     ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
g++ -o lzo_writer lzo_writer.cpp -I. -Isrc -I"${DORIS_THIRDPARTY}"/installed/include -L"${DORIS_THIRDPARTY}"/installed/lib -llzo2 -std=c++17

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file (see the sketch below).
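
A sketch of options 2 and 3, using the SC2086 finding on run-thirdparties-docker.sh line 131 from above (the SHELLCHECK_OPTS value is illustrative and assumes the sh-checker action reads that setting from the workflow file):

# Option 2: suppress a single finding in place
# shellcheck disable=SC2086
echo ${COMPONENTS}

# Option 3: in the .yml action file, e.g.
#   SHELLCHECK_OPTS: '-e SC2250 -e SC2248'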



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- build.sh.orig
+++ build.sh
@@ -519,7 +519,10 @@
 FEAT+=($([[ ${BUILD_HIVE_UDF} -eq 1 ]] && echo "+HIVE_UDF" || echo "-HIVE_UDF"))
 FEAT+=($([[ ${BUILD_BE_JAVA_EXTENSIONS} -eq 1 ]] && echo "+BE_JAVA_EXTENSIONS" || echo "-BE_JAVA_EXTENSIONS"))
 
-export DORIS_FEATURE_LIST=$(IFS=','; echo "${FEAT[*]}")
+export DORIS_FEATURE_LIST=$(
+    IFS=','
+    echo "${FEAT[*]}"
+)
 echo "Feature List: ${DORIS_FEATURE_LIST}"
 
 # Clean and build generated code
@@ -793,12 +796,12 @@
     mkdir -p "${DORIS_OUTPUT}/fe/plugins/java_extensions/"
 
     if [ "${TARGET_SYSTEM}" = "Darwin" ] || [ "${TARGET_SYSTEM}" = "Linux" ]; then
-      mkdir -p "${DORIS_OUTPUT}/fe/arthas"
-      rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
-      unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
-      unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
-      rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
-      rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
+        mkdir -p "${DORIS_OUTPUT}/fe/arthas"
+        rm -rf "${DORIS_OUTPUT}/fe/arthas/*"
+        unzip -o "${DORIS_OUTPUT}/fe/lib/arthas-packaging-*.jar" arthas-bin.zip -d "${DORIS_OUTPUT}/fe/arthas/"
+        unzip -o "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip" -d "${DORIS_OUTPUT}/fe/arthas/"
+        rm "${DORIS_OUTPUT}/fe/arthas/math-game.jar"
+        rm "${DORIS_OUTPUT}/fe/arthas/arthas-bin.zip"
     fi
 fi
 
--- cloud/script/start.sh.orig
+++ cloud/script/start.sh
@@ -54,9 +54,9 @@
 fi
 # echo "$@" "daemonized=${daemonized}"}
 
-custom_start="${DORIS_HOME}/bin/custom_start.sh" 
+custom_start="${DORIS_HOME}/bin/custom_start.sh"
 if [[ -f "${custom_start}" ]]; then
-  source "${custom_start}" 
+    source "${custom_start}"
 fi
 enable_hdfs=${enable_hdfs:-1}
 process_name="${process_name:-doris_cloud}"
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/account_fund/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/hive01/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/sale_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/string_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/student/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/default/test_hive_doris/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/default/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/default/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_text_complex_type_delimiter3/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/par_fields_in_file_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_bigint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_boolean/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_char/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_date/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_decimal/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_double/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_float/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_int/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_smallint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_string/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_timestamp/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_tinyint/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_alter_column_to_varchar/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_nested_types/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_predicate_table/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_1/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_location_2/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_chinese_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_same_db_table_name/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_hive_special_char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_mixed_par_locations_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_multi_langs_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_truncate_char_or_varchar_columns_text/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -9,4 +9,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_columns/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/text_partitioned_one_column/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/timestamp_with_time_zone/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_orc/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_origin/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/type_change_parquet/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/multi_catalog/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/multi_catalog/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/bigint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/char_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/date_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/decimal_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/double_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/float_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/int_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/smallint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/string_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/tinyint_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/partition_type/varchar_partition/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/partition_type/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/partition_type/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -3,11 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/regression/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/regression/
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -3,10 +3,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/stats/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/statistics/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/statistics/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/test/hive_test/run.sh
@@ -3,7 +3,6 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 
-
 hadoop fs -mkdir -p /user/doris/suites/test/
 hadoop fs -put "${CUR_DIR}"/data/* /user/doris/suites/test/
 
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -18,7 +18,6 @@
 
 set -e -x
 
-
 AUX_LIB="/mnt/scripts/auxlib"
 for file in "${AUX_LIB}"/*.tar.gz; do
     [ -e "$file" ] || continue
@@ -33,7 +32,6 @@
 # start metastore
 nohup /opt/hive/bin/hive --service metastore &
 
-
 # wait metastore start
 while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
     sleep 5s
@@ -73,7 +71,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -86,7 +83,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/paimon1 /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 ## put tvf_data
 if [[ -z "$(ls /mnt/scripts/tvf_data)" ]]; then
     echo "tvf_data does not exist"
@@ -99,7 +95,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 wait "${hadoop_put_pids[@]}"
 if [[ -z "$(hadoop fs -ls /user/doris/paimon1)" ]]; then
--- docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/prepare-hive-data.sh
@@ -19,8 +19,8 @@
 
 CUR_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
 # Extract all tar.gz files under the repo
-find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 | \
-xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
+find ${CUR_DIR}/data -type f -name "*.tar.gz" -print0 |
+    xargs -0 -n1 -P"${LOAD_PARALLEL}" bash -c '
   f="$0"
   echo "Extracting hive data $f"
   dir=$(dirname "$f")
@@ -145,4 +145,3 @@
 for jar in "${jars[@]}"; do
     curl -O "https://${s3BucketName}.${s3Endpoint}/regression/docker/hive3/${jar}"
 done
-
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -56,7 +56,6 @@
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/paimon-hive-connector-3.1-1.3-SNAPSHOT.jar
 curl -O https://s3BucketName.s3Endpoint/regression/docker/hive3/gcs-connector-hadoop3-2.2.24-shaded.jar
 
-
 /usr/local/hadoop-run.sh &
 
 # check healthy hear
@@ -86,7 +85,7 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 if [[ ${enablePaimonHms} == "true" ]]; then
     echo "Creating Paimon HMS catalog and table"
     hadoop fs -put /tmp/paimon_data/* /user/hive/warehouse/
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/docker-compose/polaris/init-catalog.sh.orig
+++ docker/thirdparties/docker-compose/polaris/init-catalog.sh
@@ -26,29 +26,30 @@
 
 echo "[polaris-init] Waiting for Polaris health check at http://$HOST:$PORT/q/health ..."
 for i in $(seq 1 120); do
-  if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
-    break
-  fi
-  sleep 2
+    if curl -sSf "http://$HOST:8182/q/health" >/dev/null; then
+        break
+    fi
+    sleep 2
 done
 
 echo "[polaris-init] Fetching OAuth token via client_credentials ..."
 # Try to obtain token using correct OAuth endpoint
 TOKEN_JSON=$(curl -sS \
-  -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
-  -H 'Content-Type: application/x-www-form-urlencoded' \
-  -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
+    -X POST "http://$HOST:$PORT/api/catalog/v1/oauth/tokens" \
+    -H 'Content-Type: application/x-www-form-urlencoded' \
+    -d "grant_type=client_credentials&client_id=$USER&client_secret=$PASS&scope=PRINCIPAL_ROLE:ALL")
 
 # Extract access_token field
 TOKEN=$(printf "%s" "$TOKEN_JSON" | sed -n 's/.*"access_token"\s*:\s*"\([^"]*\)".*/\1/p')
 
 if [ -z "$TOKEN" ]; then
-  echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
-  exit 1
+    echo "[polaris-init] ERROR: Failed to obtain OAuth token. Response: $TOKEN_JSON" >&2
+    exit 1
 fi
 
 echo "[polaris-init] Creating catalog '$CATALOG' with base '$BASE_LOCATION' ..."
-CREATE_PAYLOAD=$(cat <<JSON
+CREATE_PAYLOAD=$(
+    cat <<JSON
 {
   "name": "$CATALOG",
   "type": "INTERNAL",
@@ -71,19 +72,19 @@
 
 # Try create; on 409 Conflict, treat as success
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d "$CREATE_PAYLOAD")
+    -X POST "http://$HOST:$PORT/api/management/v1/catalogs" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d "$CREATE_PAYLOAD")
 
 if [ "$HTTP_CODE" = "201" ]; then
-  echo "[polaris-init] Catalog created."
+    echo "[polaris-init] Catalog created."
 elif [ "$HTTP_CODE" = "409" ]; then
-  echo "[polaris-init] Catalog already exists. Skipping."
+    echo "[polaris-init] Catalog already exists. Skipping."
 else
-  echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
-  cat /tmp/resp.json || true
-  exit 1
+    echo "[polaris-init] Create catalog failed (HTTP $HTTP_CODE):"
+    cat /tmp/resp.json || true
+    exit 1
 fi
 
 echo "[polaris-init] Setting up permissions for catalog '$CATALOG' ..."
@@ -91,55 +92,54 @@
 # Create a catalog admin role grants
 echo "[polaris-init] Creating catalog admin role grants ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/catalogs/$CATALOG/catalog-roles/catalog_admin/grants" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"grant":{"type":"catalog", "privilege":"CATALOG_MANAGE_CONTENT"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create catalog admin grants (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Create a data engineer role
 echo "[polaris-init] Creating data engineer role ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole":{"name":"data_engineer"}}')
+    -X POST "http://$HOST:$PORT/api/management/v1/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole":{"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ] && [ "$HTTP_CODE" != "409" ]; then
-  echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to create data engineer role (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Connect the roles
 echo "[polaris-init] Connecting roles ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"catalogRole":{"name":"catalog_admin"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principal-roles/data_engineer/catalog-roles/$CATALOG" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"catalogRole":{"name":"catalog_admin"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to connect roles (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 # Give root the data engineer role
 echo "[polaris-init] Assigning data engineer role to root ..."
 HTTP_CODE=$(curl -sS -o /tmp/resp.json -w "%{http_code}" \
-  -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
-  -H "Authorization: Bearer $TOKEN" \
-  -H "Content-Type: application/json" \
-  -d '{"principalRole": {"name":"data_engineer"}}')
+    -X PUT "http://$HOST:$PORT/api/management/v1/principals/root/principal-roles" \
+    -H "Authorization: Bearer $TOKEN" \
+    -H "Content-Type: application/json" \
+    -d '{"principalRole": {"name":"data_engineer"}}')
 
 if [ "$HTTP_CODE" != "200" ] && [ "$HTTP_CODE" != "201" ]; then
-  echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
-  cat /tmp/resp.json || true
+    echo "[polaris-init] Warning: Failed to assign data engineer role to root (HTTP $HTTP_CODE)"
+    cat /tmp/resp.json || true
 fi
 
 echo "[polaris-init] Permissions setup completed."
 echo "[polaris-init] Done."
-
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -51,7 +51,7 @@
 STOP=0
 NEED_RESERVE_PORTS=0
 export NEED_LOAD_DATA=1
-export LOAD_PARALLEL=$(( $(getconf _NPROCESSORS_ONLN) / 2 ))
+export LOAD_PARALLEL=$(($(getconf _NPROCESSORS_ONLN) / 2))
 
 if ! OPTS="$(getopt \
     -n "$0" \
@@ -205,7 +205,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -394,7 +394,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -413,14 +413,14 @@
         exit -1
     fi
     # before start it, you need to download parquet file package, see "README" in "docker-compose/hive/scripts/"
-    
+
     # generate hive-3x.yaml
     export IP_HOST=${IP_HOST}
     export CONTAINER_UID=${CONTAINER_UID}
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -446,12 +446,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data*.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip \
-            && sudo unzip iceberg_data_paimon_101.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data_paimon_101.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data*.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data_paimon_101.zip &&
+                sudo unzip iceberg_data_paimon_101.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data_paimon_101.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -615,9 +615,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -683,12 +683,12 @@
     echo "RUN_ICEBERG_REST"
     # iceberg-rest with multiple cloud storage backends
     ICEBERG_REST_DIR=${ROOT}/docker-compose/iceberg-rest
-    
+
     # generate iceberg-rest.yaml
     export CONTAINER_UID=${CONTAINER_UID}
     . "${ROOT}"/docker-compose/iceberg-rest/iceberg-rest_settings.env
     envsubst <"${ICEBERG_REST_DIR}/docker-compose.yaml.tpl" >"${ICEBERG_REST_DIR}/docker-compose.yaml"
-    
+
     sudo docker compose -f "${ICEBERG_REST_DIR}/docker-compose.yaml" down
     if [[ "${STOP}" -ne 1 ]]; then
         # Start all three REST catalogs (S3, OSS, COS)
@@ -716,112 +716,112 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_iceberg.log 2>&1 &
+    start_iceberg >start_iceberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_ICEBERG_REST}" -eq 1 ]]; then
-    start_iceberg_rest > start_iceberg_rest.log 2>&1 &
+    start_iceberg_rest >start_iceberg_rest.log 2>&1 &
     pids["iceberg-rest"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_POLARIS}" -eq 1 ]]; then
-    start_polaris > start_polaris.log 2>&1 &
+    start_polaris >start_polaris.log 2>&1 &
     pids["polaris"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
 echo "waiting all dockers starting done"
--- run-be-ut.sh.orig
+++ run-be-ut.sh
@@ -107,7 +107,7 @@
         --gen_out)
             GEN_OUT='--gen_out'
             shift
-            ;;	    
+            ;;
         -f | --filter)
             FILTER="--gtest_filter=$2"
             shift 2
--- run-regression-test.sh.orig
+++ run-regression-test.sh
--- thirdparty/build-thirdparty.sh.orig
+++ thirdparty/build-thirdparty.sh
@@ -520,7 +520,7 @@
 
     rm -rf CMakeCache.txt CMakeFiles/
     "${CMAKE_CMD}" ../ -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-      -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_POSITION_INDEPENDENT_CODE=On
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_POSITION_INDEPENDENT_CODE=On
     # -DCMAKE_CXX_FLAGS="$warning_uninitialized"
 
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
@@ -1272,7 +1272,7 @@
     rm -rf CMakeCache.txt CMakeFiles/
 
     "${CMAKE_CMD}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-     -G "${GENERATOR}" -DBUILD_SHARED_LIBS=FALSE -DFMT_TEST=OFF -DFMT_DOC=OFF -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" ..
+        -G "${GENERATOR}" -DBUILD_SHARED_LIBS=FALSE -DFMT_TEST=OFF -DFMT_DOC=OFF -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" ..
     "${BUILD_SYSTEM}" -j"${PARALLEL}"
     "${BUILD_SYSTEM}" install
 }
@@ -1340,8 +1340,8 @@
 
     # -Wno-elaborated-enum-base to make C++20 on MacOS happy
     "${CMAKE_CMD}" -G "${GENERATOR}" \
-    -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
-    -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DBUILD_TESTING=OFF ..
+        -DCMAKE_CXX_FLAGS="$CMAKE_CXX_FLAGS -Wno-elaborated-enum-base" \
+        -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DBUILD_TESTING=OFF ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}" install
 }
 
@@ -1775,7 +1775,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
 }
@@ -1847,7 +1847,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     MACHINE_TYPE="$(uname -m)"
     if [[ "${MACHINE_TYPE}" == "aarch64" || "${MACHINE_TYPE}" == 'arm64' ]]; then
         CFLAGS="--target=aarch64-linux-gnu -march=armv8-a+crc" NEON64_CFLAGS=" "
@@ -1876,7 +1876,7 @@
         AZURE_MANIFEST_DIR="."
 
         "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-        -DCMAKE_CXX_FLAGS="-Wno-maybe-uninitialized" -DDISABLE_RUST_IN_BUILD=ON -DVCPKG_MANIFEST_MODE=ON -DVCPKG_OVERLAY_PORTS="${azure_dir}/${AZURE_PORTS}" -DVCPKG_MANIFEST_DIR="${azure_dir}/${AZURE_MANIFEST_DIR}" -DWARNINGS_AS_ERRORS=FALSE -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+            -DCMAKE_CXX_FLAGS="-Wno-maybe-uninitialized" -DDISABLE_RUST_IN_BUILD=ON -DVCPKG_MANIFEST_MODE=ON -DVCPKG_OVERLAY_PORTS="${azure_dir}/${AZURE_PORTS}" -DVCPKG_MANIFEST_DIR="${azure_dir}/${AZURE_MANIFEST_DIR}" -DWARNINGS_AS_ERRORS=FALSE -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
         "${BUILD_SYSTEM}" -j "${PARALLEL}"
         "${BUILD_SYSTEM}" install
     fi
@@ -1892,7 +1892,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DDRAGONBOX_INSTALL_TO_CHARS=ON ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DDRAGONBOX_INSTALL_TO_CHARS=ON ..
 
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
@@ -1938,7 +1938,7 @@
     cd "${BUILD_DIR}"
 
     "${CMAKE_CMD}" -G "${GENERATOR}" -DCMAKE_POLICY_VERSION_MINIMUM=3.5 \
-    -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
+        -DCMAKE_INSTALL_PREFIX="${TP_INSTALL_DIR}" -DCMAKE_BUILD_TYPE=Release ..
     "${BUILD_SYSTEM}" -j "${PARALLEL}"
     "${BUILD_SYSTEM}" install
 
--- thirdparty/download-thirdparty.sh.orig
+++ thirdparty/download-thirdparty.sh
@@ -606,9 +606,9 @@
     echo "Finished patching ${GRPC_SOURCE}"
 fi
 
-if [[ " ${TP_ARCHIVES[*]} " =~ " CCTZ " ]] ; then
+if [[ " ${TP_ARCHIVES[*]} " =~ " CCTZ " ]]; then
     cd $TP_SOURCE_DIR/$CCTZ_SOURCE
-    if [[ ! -f "$PATCHED_MARK" ]] ; then
+    if [[ ! -f "$PATCHED_MARK" ]]; then
         patch -p1 <"${TP_PATCH_DIR}/cctz-civil-cache.patch"
         touch "$PATCHED_MARK"
     fi
--- thirdparty/vars.sh.orig
+++ thirdparty/vars.sh
@@ -558,7 +558,6 @@
 FAISS_SOURCE="faiss-1.10.0"
 FAISS_MD5SUM="f31edf2492808b27cc963d0ab316a205"
 
-
 # all thirdparties which need to be downloaded is set in array TP_ARCHIVES
 export TP_ARCHIVES=(
     'LIBEVENT'
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


hello-stephen pushed a commit that referenced this pull request Nov 27, 2025
…ich belongs to an agg materialized view (apache#58038)

### What problem does this PR solve?

Issue Number: close apache#58037

Problem Summary:

```
#0  0x00007f9aca4a3f8c in __pthread_kill_implementation () from /lib64/libc.so.6
#1  0x00007f9aca454a26 in raise () from /lib64/libc.so.6
#2  0x00007f9aca43d87c in abort () from /lib64/libc.so.6
#3  0x0000561dc3d1ea1d in ?? ()
#4  0x0000561dc3d1105a in google::LogMessage::Fail() ()
#5  0x0000561dc3d14146 in google::LogMessage::SendToLog() ()
#6  0x0000561dc3d10b90 in google::LogMessage::Flush() ()
#7  0x0000561dc3d14989 in google::LogMessageFatal::~LogMessageFatal() ()
#8  0x0000561db854c996 in assert_cast<doris::vectorized::ColumnStr<unsigned int> const&, (TypeCheckOnRelease)1, doris::vectorized::IColumn const&>(doris::vectorized::IColumn const&)::{lambda(auto:1&&)#1}::operator()<doris::vectorized::IColumn const&>(doris::vectorized::IColumn const&) const
    (this=this@entry=0x7f9658ccc1f8, from=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/common/assert_cast.h:58
#9  0x0000561db854c7d7 in assert_cast<doris::vectorized::ColumnStr<unsigned int> const&, (TypeCheckOnRelease)1, doris::vectorized::IColumn const&> (from=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/common/assert_cast.h:73
#10 0x0000561db854bb0b in doris::vectorized::ColumnStr<unsigned int>::compare_at (this=0x7f957a14e2c0, n=1159288, m=6, rhs_=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/columns/column_string.h:526
#11 0x0000561dbe108c6b in doris::vectorized::GenericComparisonImpl<doris::vectorized::EqualsOp<int, int> >::vector_constant (a=..., b=..., c=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:112
#12 doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_generic_identical_types (
    this=<optimized out>, block=..., result=result@entry=10, c0=0x7f957a14e2c0, c1=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:506
#13 0x0000561dbdf9e97e in doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_generic (
    this=0x7f96d6fb1b90, block=..., result=10, c0=..., c1=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:517
#14 doris::vectorized::FunctionComparison<doris::vectorized::EqualsOp, doris::vectorized::NameEquals>::execute_impl (this=0x7f96d6fb1b90, 
    context=<optimized out>, block=..., arguments=..., result=10, input_rows_count=104)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/functions_comparison.h:707
#15 0x0000561dbdcf1b8f in doris::vectorized::DefaultExecutable::execute_impl (this=<optimized out>, context=0x6, block=..., arguments=..., 
    result=1, input_rows_count=104) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.h:472
#16 0x0000561dbeea76ae in doris::vectorized::PreparedFunctionImpl::_execute_skipped_constant_deal (this=this@entry=0x7f99f62a65d0, 
    context=context@entry=0x7f99f6442b00, block=..., args=..., result=result@entry=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:121
#17 0x0000561dbeea4ce8 in doris::vectorized::PreparedFunctionImpl::execute_without_low_cardinality_columns (this=0x7f99f62a65d0, 
    context=0x7f99f6442b00, block=..., args=..., result=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:246
#18 doris::vectorized::PreparedFunctionImpl::default_implementation_for_nulls (this=this@entry=0x7f99f62a65d0, 
    context=context@entry=0x7f99f6442b00, block=..., args=..., result=result@entry=10, input_rows_count=104, dry_run=<optimized out>, 
    executed=0x7f9658ccc666) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:218
#19 0x0000561dbeea4e9c in doris::vectorized::PreparedFunctionImpl::_execute_skipped_constant_deal (this=0x7f99f62a65d0, context=0x7f99f6442b00, 
    block=..., args=..., result=10, input_rows_count=<optimized out>, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:112
#20 doris::vectorized::PreparedFunctionImpl::execute_without_low_cardinality_columns (this=0x7f99f62a65d0, context=0x7f99f6442b00, block=..., 
    args=..., result=10, input_rows_count=104, dry_run=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:246
#21 0x0000561dbeea4f66 in doris::vectorized::PreparedFunctionImpl::execute (this=0x11b078, context=0x6, block=..., args=..., result=1, 
    input_rows_count=104, dry_run=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.cpp:252
#22 0x0000561dbdcf1500 in doris::vectorized::IFunctionBase::execute (this=<optimized out>, context=0x7f99f6442b00, block=..., arguments=..., 
    result=10, input_rows_count=104, dry_run=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/functions/function.h:195
--Type <RET> for more, q to quit, c to continue without paging--c
#23 0x0000561dbdceccad in doris::vectorized::VectorizedFnCall::_do_execute (this=0x7f96f0fec510, context=0x7f957f09cdf0, block=0x7f957b02a3b0, 
    result_column_id=0x7f9658ccca14, args=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vectorized_fn_call.cpp:197
#24 0x0000561dbdced2c6 in doris::vectorized::VectorizedFnCall::execute (this=0x11b078, context=0x6, 
    block=0x7f9aca4a3f8c <__pthread_kill_implementation+268>, result_column_id=0x7f9658ccbe10)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vectorized_fn_call.cpp:212
#25 0x0000561dbdd1e51b in doris::vectorized::VExprContext::execute (this=0x7f957f09cdf0, 
    block=0x7f9aca4a3f8c <__pthread_kill_implementation+268>, block@entry=0x7f957b02a3b0, result_column_id=result_column_id@entry=0x7f9658ccca14)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:55
#26 0x0000561dbdd1fcb5 in doris::vectorized::VExprContext::execute_conjuncts (ctxs=..., filters=filters@entry=0x0, accept_null=false, 
    block=block@entry=0x7f957b02a3b0, result_filter=result_filter@entry=0x7f9658ccccc0, can_filter_all=0x7f9658cccbc7)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:174
#27 0x0000561dbdd2131f in doris::vectorized::VExprContext::execute_conjuncts_and_filter_block (ctxs=..., block=0x7f957b02a3b0, 
    columns_to_filter=..., column_to_keep=6, filter=...) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exprs/vexpr_context.cpp:354
#28 0x0000561db8f49450 in doris::segment_v2::SegmentIterator::_execute_common_expr (this=this@entry=0x7f954e20a000, 
    sel_rowid_idx=0x7f955c0d2000, selected_size=@0x7f9658ccce6e: 104, block=block@entry=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:2338
#29 0x0000561db8f482f8 in doris::segment_v2::SegmentIterator::_next_batch_internal (this=0x7f954e20a000, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:2230
#30 0x0000561db8f45212 in doris::segment_v2::SegmentIterator::next_batch(doris::vectorized::Block*)::$_0::operator()() const (
    this=<optimized out>) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:1953
#31 doris::segment_v2::SegmentIterator::next_batch (this=0x7f954e20a000, block=0x6)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/segment_iterator.cpp:1952
#32 0x0000561db8ee49bc in doris::segment_v2::LazyInitSegmentIterator::next_batch (this=0x7f953bc71f80, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/segment_v2/lazy_init_segment_iterator.h:44
#33 0x0000561db8dab844 in doris::BetaRowsetReader::next_block (this=0x7f9a4e215800, block=0x7f957b02a3b0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/olap/rowset/beta_rowset_reader.cpp:377
#34 0x0000561dc2c9413d in doris::vectorized::VCollectIterator::Level0Iterator::_refresh (this=0x7f953ba137a0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.h:256
#35 doris::vectorized::VCollectIterator::Level0Iterator::refresh_current_row (this=this@entry=0x7f953ba137a0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:509
#36 0x0000561dc2c93bf4 in doris::vectorized::VCollectIterator::Level0Iterator::init (this=0x7f953ba137a0, get_data_by_ref=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:461
#37 0x0000561dc2c91002 in doris::vectorized::VCollectIterator::build_heap (this=0x7f957a52bb30, rs_readers=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/vcollect_iterator.cpp:125
#38 0x0000561dc2c7e1f2 in doris::vectorized::BlockReader::_init_collect_iter (this=this@entry=0x7f957a52b400, read_params=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/block_reader.cpp:153
#39 0x0000561dc2c7f191 in doris::vectorized::BlockReader::init (this=<optimized out>, read_params=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/olap/block_reader.cpp:226
#40 0x0000561dc3937869 in doris::vectorized::NewOlapScanner::open (this=0x7f9a56270210, state=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/new_olap_scanner.cpp:252
#41 0x0000561dbdcc5413 in doris::vectorized::ScannerScheduler::_scanner_scan (ctx=..., scan_task=...)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:221
#42 0x0000561dbdcc62bd in doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}::operator()() const::{lambda()#1}::operator()() const (this=<optimized out>)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:154
#43 doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}::operator()() const (this=0x7f954e25d3c0)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:153
#44 std::__invoke_impl<void, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&>(std::__invoke_other, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&) (__f=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:61
#45 std::__invoke_r<void, doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&>(doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}&) (__fn=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/invoke.h:111
#46 std::_Function_handler<void (), doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>, std::shared_ptr<doris::vectorized::ScanTask>)::$_1::operator()() const::{lambda()#1}>::_M_invoke(std::_Any_data const&) (__functor=...)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:291
#47 0x0000561db962137a in doris::ThreadPool::dispatch_thread (this=0x7f9a50f9d380)
    at /data/home/lambxu/work/git/doris-3.1/doris/be/src/util/threadpool.cpp:602
#48 0x0000561db96159a1 in std::function<void ()>::operator()() const (this=0x11a791)
    at /data/home/lambxu/installs/ldb_toolchain_bak/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:560
#49 doris::Thread::supervise_thread (arg=0x7f969f569ce0) at /data/home/lambxu/work/git/doris-3.1/doris/be/src/util/thread.cpp:498
#50 0x00007f9aca4a2215 in start_thread () from /lib64/libc.so.6
#51 0x00007f9aca524bdc in clone3 () from /lib64/libc.so.6
```

Assume that columns 0,1,2,3,4 are the key columns of an AGG mv. Because
PreAgg is OFF at the scan node, the block will contain all key columns so
that the data can be merged in the storage layer.

If we select columns 0,1 and filter on columns 3,4 in the where clause,
then the slot ids are 0,1,3,4, and the column ids in the conjuncts are the
indexes of those slot ids (i.e. 2 and 3).

But the plan uses the key type of the base table, which is DUP key, so it
treats the AGG mv as a DUP mv and pushes these conjuncts down to the scan
node that belongs to the AGG mv. There the conjuncts pick the wrong
columns 2 and 3 (which should be 4 and 5) from the block when they
execute.

So we should use the key type of the mv, not the key type of the base table.
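
As a hedged illustration of the failing shape (the table, mv, and column
names below are hypothetical, not taken from the actual regression test):

```sql
-- Hypothetical sketch only; names are illustrative. The base table uses
-- DUPLICATE KEY, while the sync materialized view aggregates, so the mv's
-- key type is AGG.
CREATE TABLE base_tbl (
    k0 INT, k1 INT, k2 INT, k3 INT, k4 VARCHAR(16),
    v  BIGINT
) DUPLICATE KEY (k0, k1, k2, k3, k4)
DISTRIBUTED BY HASH(k0) BUCKETS 1
PROPERTIES ("replication_num" = "1");

CREATE MATERIALIZED VIEW agg_mv AS
SELECT k0, k1, k2, k3, k4, SUM(v) FROM base_tbl
GROUP BY k0, k1, k2, k3, k4;

-- Select two key columns and filter on two others. The query only names
-- slots 0,1,3,4, so the conjuncts on k3/k4 carry column ids 2 and 3. With
-- PreAgg OFF the scan block holds all mv key columns, where k3/k4 actually
-- sit at later offsets, so evaluating the conjuncts at ids 2 and 3 reads
-- the wrong columns (e.g. comparing an int against a string column, which
-- trips the assert_cast in compare_at above).
SELECT k0, k1 FROM base_tbl WHERE k3 = 1 AND k4 = 'x';
```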

### Release note

None

### Check List (For Author)

- Test <!-- At least one of them must be included. -->
    - [x] Regression test
    - [ ] Unit Test
    - [ ] Manual test (add detailed scripts or steps below)
    - [ ] No need to test or manual test. Explain why:
        - [ ] This is a refactor/code format and no logic has been changed.
        - [ ] Previous test can cover this change.
        - [ ] No code files have been changed.
        - [ ] Other reason <!-- Add your reason?  -->

- Behavior changed:
    - [x] No.
    - [ ] Yes. <!-- Explain the behavior change -->

- Does this need documentation?
    - [x] No.
    - [ ] Yes. <!-- Add document PR link here. eg: apache/doris-website#1214 -->

### Check List (For Reviewer who merge this PR)

- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR should merge into -->
hello-stephen pushed a commit that referenced this pull request Jan 7, 2026
Related issue: apache#57884
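
The session below exercises a `doris`-type external catalog. As a hedged
sketch (the property keys mirror the `show catalog edoris` output in the
log; the host, user, and password values are illustrative), such a catalog
could be created roughly like this:

```sql
-- Hypothetical re-creation of the `edoris` catalog shown below; the
-- property keys are modeled on the `show catalog edoris` output, and the
-- concrete values are placeholders.
CREATE CATALOG edoris PROPERTIES (
    "type" = "doris",
    "fe_thrift_hosts" = "10.37.103.28:9020",
    "fe_http_hosts" = "10.37.103.28:8030",
    "fe_arrow_hosts" = "10.37.103.28:8070",
    "user" = "test",
    "password" = "***",
    "use_meta_cache" = "true"
);
```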

```
MySQL [demo]> show frontends;
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
| Name                                    | Host         | EditLogPort | HttpPort | QueryPort | RpcPort | ArrowFlightSqlPort | Role     | IsMaster | ClusterId | Join | Alive | ReplayedJournalId | LastStartTime       | LastHeartbeat       | IsHelper | ErrMsg | Version                | CurrentConnected | LiveSince           |
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
| fe_a7c0b6d8_82c2_48f0_8220_fb65dd18be69 | 10.37.75.124 | 9010        | 8030     | 9030      | 9020    | 8070               | FOLLOWER | true     | 742250121 | true | true  | 2409              | 2025-11-11 14:42:16 | 2025-11-11 14:44:06 | true     |        | doris-0.0.0-009c3b552a | Yes              | 2025-11-11 14:42:16 |
+-----------------------------------------+--------------+-------------+----------+-----------+---------+--------------------+----------+----------+-----------+------+-------+-------------------+---------------------+---------------------+----------+--------+------------------------+------------------+---------------------+
1 row in set (0.016 sec)

MySQL [demo]> show catalog edoris;
+-----------------+-------------------------------+
| Key             | Value                         |
+-----------------+-------------------------------+
| create_time     | 2025-11-11 11:25:33.488106853 |
| fe_arrow_hosts  | 10.37.103.28:8070             |
| fe_http_hosts   | 10.37.103.28:8030             |
| fe_thrift_hosts | 10.37.103.28:9020             |
| password        | *XXX                          |
| type            | doris                         |
| use_meta_cache  | true                          |
| user            | test                          |
+-----------------+-------------------------------+
8 rows in set (0.002 sec)

MySQL [demo]> select * from inner_table;
+----------+--------+
| log_type | reason |
+----------+--------+
|        2 | inner2 |
|        3 | inner3 |
|        4 | inner4 |
+----------+--------+
3 rows in set (0.032 sec)

MySQL [demo]> select * from edoris.external.example_tbl_duplicate;
+---------------------+----------+------------+-----------+-------+---------------------+
| log_time            | log_type | error_code | error_msg | op_id | op_time             |
+---------------------+----------+------------+-----------+-------+---------------------+
| 2024-11-01 00:00:00 |        2 |          2 | timeout   |    12 | 2024-11-01 01:00:00 |
+---------------------+----------+------------+-----------+-------+---------------------+
1 row in set (0.059 sec)

MySQL [demo]> select * from inner_table a join edoris.external.example_tbl_duplicate b on (a.log_type = b.log_type);
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
| log_type | reason | log_time            | log_type | error_code | error_msg | op_id | op_time             |
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
|        2 | inner2 | 2024-11-01 00:00:00 |        2 |          2 | timeout   |    12 | 2024-11-01 01:00:00 |
+----------+--------+---------------------+----------+------------+-----------+-------+---------------------+
1 row in set (0.050 sec)

MySQL [demo]> explain select * from inner_table a join edoris.external.example_tbl_duplicate b on (a.log_type = b.log_type) where error_code=2;
+-------------------------------------------------------------------------------------------------------------------------------------------+
| Explain String(Nereids Planner)                                                                                                           |
+-------------------------------------------------------------------------------------------------------------------------------------------+
| PLAN FRAGMENT 0                                                                                                                           |
|   OUTPUT EXPRS:                                                                                                                           |
|     log_type[#16]                                                                                                                         |
|     reason[#17]                                                                                                                           |
|     log_time[#18]                                                                                                                         |
|     log_type[#19]                                                                                                                         |
|     error_code[#20]                                                                                                                       |
|     error_msg[#21]                                                                                                                        |
|     op_id[#22]                                                                                                                            |
|     op_time[#23]                                                                                                                          |
|   PARTITION: HASH_PARTITIONED: log_type[#6]                                                                                               |
|                                                                                                                                           |
|   HAS_COLO_PLAN_NODE: false                                                                                                               |
|                                                                                                                                           |
|   VRESULT SINK                                                                                                                            |
|      MYSQL_PROTOCOL                                                                                                                       |
|                                                                                                                                           |
|   3:VHASH JOIN(200)                                                                                                                       |
|   |  join op: INNER JOIN(BROADCAST)[]                                                                                                     |
|   |  equal join conjunct: (log_type[#6] = log_type[#1])                                                                                   |
|   |  cardinality=3                                                                                                                        |
|   |  vec output tuple id: 3                                                                                                               |
|   |  output tuple id: 3                                                                                                                   |
|   |  vIntermediate tuple ids: 2                                                                                                           |
|   |  hash output slot ids: 0 1 2 3 4 5 6 7                                                                                                |
|   |  runtime filters: RF000[min_max] <- log_type[#1](1/1/1048576), RF001[in_or_bloom] <- log_type[#1](1/1/1048576)                        |
|   |  final projections: log_type[#8], reason[#9], log_time[#10], log_type[#11], error_code[#12], error_msg[#13], op_id[#14], op_time[#15] |
|   |  final project output tuple id: 3                                                                                                     |
|   |  distribute expr lists: log_type[#6]                                                                                                  |
|   |  distribute expr lists:                                                                                                               |
|   |                                                                                                                                       |
|   |----1:VEXCHANGE                                                                                                                        |
|   |       offset: 0                                                                                                                       |
|   |       distribute expr lists: log_type[#1]                                                                                             |
|   |                                                                                                                                       |
|   2:VOlapScanNode(187)                                                                                                                    |
|      TABLE: demo.inner_table(inner_table), PREAGGREGATION: ON                                                                             |
|      partitions=1/1 (inner_table)                                                                                                         |
|      tablets=1/1, tabletList=1762832514491                                                                                                |
|      cardinality=3, avgRowSize=901.6666, numNodes=1                                                                                       |
|      pushAggOp=NONE                                                                                                                       |
|      runtime filters: RF000[min_max] -> log_type[#6], RF001[in_or_bloom] -> log_type[#6]                                                  |
|                                                                                                                                           |
| PLAN FRAGMENT 1                                                                                                                           |
|                                                                                                                                           |
|   PARTITION: HASH_PARTITIONED: log_type[#1]                                                                                               |
|                                                                                                                                           |
|   HAS_COLO_PLAN_NODE: false                                                                                                               |
|                                                                                                                                           |
|   STREAM DATA SINK                                                                                                                        |
|     EXCHANGE ID: 01                                                                                                                       |
|     UNPARTITIONED                                                                                                                         |
|                                                                                                                                           |
|   0:VOlapScanNode(188)                                                                                                                    |
|      TABLE: external.example_tbl_duplicate(example_tbl_duplicate), PREAGGREGATION: ON                                                     |
|      PREDICATES: (error_code[#2] = 2)                                                                                                     |
|      partitions=1/1 (example_tbl_duplicate)                                                                                               |
|      tablets=1/1, tabletList=1762481736238                                                                                                |
|      cardinality=1, avgRowSize=7425.0, numNodes=1                                                                                         |
|      pushAggOp=NONE                                                                                                                       |
|                                                                                                                                           |
|                                                                                                                                           |
|                                                                                                                                           |
| ========== STATISTICS ==========                                                                                                          |
| planed with unknown column statistics                                                                                                     |
+-------------------------------------------------------------------------------------------------------------------------------------------+
65 rows in set (0.040 sec)

```
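For reference, a catalog with the properties shown above could be created
along the following lines. This is a sketch assuming the standard
`CREATE CATALOG ... PROPERTIES` syntax applies to the `doris` catalog type;
the property keys mirror the `show catalog edoris` output, and the hosts and
credentials are placeholders.

```sql
-- Sketch only: keys mirror the `show catalog edoris` output above;
-- host addresses and credentials are placeholders.
CREATE CATALOG edoris PROPERTIES (
    "type"            = "doris",
    "fe_thrift_hosts" = "10.37.103.28:9020",
    "fe_http_hosts"   = "10.37.103.28:8030",
    "fe_arrow_hosts"  = "10.37.103.28:8070",
    "user"            = "test",
    "password"        = "***",
    "use_meta_cache"  = "true"
);
```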
hello-stephen pushed a commit that referenced this pull request Jan 7, 2026
…e#59098)

### What problem does this PR solve?

Introduced by apache#58905

==2037076==ERROR: AddressSanitizer: heap-use-after-free on address
0x7baaae908730 at pc 0x561b769a1fd0 bp 0x7b3caf4ebdf0 sp 0x7b3caf4ebde8
22:30:08  READ of size 1 at 0x7baaae908730 thread T12303 (rs_normal
[work)
22:30:08  #0 0x561b769a1fcf in doris::(anonymous
namespace)::string_compare(char const*, long, char const*, long, long)
/root/doris/be/src/vec/common/string_ref.h:170:29
22:30:08  #1 0x561b769a1fcf in
doris::StringRef::compare(doris::StringRef const&) const
/root/doris/be/src/vec/common/string_ref.h:259:30
22:30:08  #2 0x561b76f537cd in doris::StringRef::ge(doris::StringRef
const&) const /root/doris/be/src/vec/common/string_ref.h:282:52
22:30:08  #3 0x561b76f537cd in
doris::StringRef::operator>=(doris::StringRef const&) const
/root/doris/be/src/vec/common/string_ref.h:292:60
22:30:08  #4 0x561b76f537cd in bool
doris::Compare::greater_equal<doris::StringRef>(doris::StringRef const&,
doris::StringRef const&) /root/doris/be/src/common/compare.h:42:18
22:30:08  #5 0x561b76f537cd in
doris::ComparisonPredicateBase<(doris::PrimitiveType)23,
(doris::PredicateType)6>::camp_field(doris::vectorized::Field const&,
doris::vectorized::Field const&) const
/root/doris/be/src/olap/comparison_predicate.h:192:20
22:30:08  #6 0x561b76f4baa4 in
doris::ComparisonPredicateBase<(doris::PrimitiveType)23,
(doris::PredicateType)6>::evaluate_and(doris::vectorized::ParquetPredicate::ColumnStat*)
const /root/doris/be/src/olap/comparison_predicate.h:207:26
22:30:08  #7 0x561b76765284 in
doris::AndBlockColumnPredicate::evaluate_and(doris::vectorized::ParquetPredicate::ColumnStat*)
const /root/doris/be/src/olap/block_column_predicate.h:251:42
22:30:08  #8 0x561b89acd735 in
doris::vectorized::ParquetReader::_process_column_stat_filter(tparquet::RowGroup
const&, std::vector<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> >,
std::allocator<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> > > > const&,
bool*, bool*, bool*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:1225:25
22:30:08  #9 0x561b89ac8dd7 in
doris::vectorized::ParquetReader::_process_min_max_bloom_filter(doris::vectorized::RowGroupReader::RowGroupIndex
const&, tparquet::RowGroup const&,
std::vector<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> >,
std::allocator<std::unique_ptr<doris::MutilColumnBlockPredicate,
std::default_delete<doris::MutilColumnBlockPredicate> > > > const&,
doris::segment_v2::RowRanges*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:1108:9
22:30:08  #10 0x561b89ac3e73 in
doris::vectorized::ParquetReader::_next_row_group_reader()
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:718:9
22:30:08  #11 0x561b89ac008f in
doris::vectorized::ParquetReader::get_next_block(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/parquet/vparquet_reader.cpp:607:21
22:30:08  #12 0x561b8a07c6f7 in
doris::vectorized::HiveReader::get_next_block_inner(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/table/hive_reader.cpp:32:5
22:30:08  #13 0x561b89fee256 in
doris::vectorized::TableFormatReader::get_next_block(doris::vectorized::Block*,
unsigned long*, bool*)
/root/doris/be/src/vec/exec/format/table/table_format_reader.h:81:16
22:30:08  #14 0x561b89f71b97 in
doris::vectorized::FileScanner::_get_block_wrapped(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/file_scanner.cpp:472:13
22:30:08  #15 0x561b89f7086f in
doris::vectorized::FileScanner::_get_block_impl(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/file_scanner.cpp:409:17
22:30:08  #16 0x561b8a19f86e in
doris::vectorized::Scanner::get_block(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/scanner.cpp:109:17
22:30:08  #17 0x561b8a19f0a6 in
doris::vectorized::Scanner::get_block_after_projects(doris::RuntimeState*,
doris::vectorized::Block*, bool*)
/root/doris/be/src/vec/exec/scan/scanner.cpp:85:16
22:30:08  #18 0x561b8a1ccd0f in
doris::vectorized::ScannerScheduler::_scanner_scan(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:173:5
22:30:08  #19 0x561b8a1d6875 in
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()::operator()() const::'lambda'()::operator()() const
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:76:17
22:30:08  #20 0x561b8a1d6875 in
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()::operator()() const
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:75:27
22:30:08  #21 0x561b8a1d6875 in bool std::__invoke_impl<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>(std::__invoke_other,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:63:14
22:30:08  #22 0x561b8a1d6875 in std::enable_if<is_invocable_r_v<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>, bool>::type std::__invoke_r<bool,
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&>(doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/invoke.h:116:9
22:30:08  #23 0x561b8a1d6875 in std::_Function_handler<bool (),
doris::vectorized::ScannerScheduler::submit(std::shared_ptr<doris::vectorized::ScannerContext>,
std::shared_ptr<doris::vectorized::ScanTask>)::$_0::operator()()
const::'lambda'()>::_M_invoke(std::_Any_data const&)
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:292:9
22:30:08  #24 0x561b8a1d5f07 in std::function<bool ()>::operator()()
const
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:593:9
22:30:08  #25 0x561b8a1d5f07 in
doris::vectorized::ScannerSplitRunner::process_for(std::chrono::duration<long,
std::ratio<1l, 1000000000l> >)
/root/doris/be/src/vec/exec/scan/scanner_scheduler.cpp:407:25
22:30:08  #26 0x561b8a2c56d4 in
doris::vectorized::PrioritizedSplitRunner::process()
/root/doris/be/src/vec/exec/executor/time_sharing/prioritized_split_runner.cpp:103:35
22:30:08  #27 0x561b8a29045c in
doris::vectorized::TimeSharingTaskExecutor::_dispatch_thread()
/root/doris/be/src/vec/exec/executor/time_sharing/time_sharing_task_executor.cpp:570:77
22:30:08  #28 0x561b7b9fecb6 in std::function<void ()>::operator()()
const
/usr/local/ldb-toolchain-v0.26/bin/../lib/gcc/x86_64-pc-linux-gnu/15/include/g++-v15/bits/std_function.h:593:9
22:30:08  #29 0x561b7b9fecb6 in doris::Thread::supervise_thread(void*)
/root/doris/be/src/util/thread.cpp:460:5
22:30:08  #30 0x561b76044d26 in asan_thread_start(void*)
(/mnt/ssd01/pipline/OpenSourceDoris/clusterEnv/P1/Cluster0/be/lib/doris_be+0x23962d26)
22:30:08  #31 0x7f4aaae68608 in start_thread
/build/glibc-SzIz7B/glibc-2.31/nptl/pthread_create.c:477:8
22:30:08  #32 0x7f4aaad7b132 in __clone
/build/glibc-SzIz7B/glibc-2.31/misc/../sysdeps/unix/sysv/linux/x86_64/clone.S:95