Create release branch for version 0.13.1 #14

Triggered via push July 24, 2023 02:18
Status Success
Total duration 3h 40m 5s
bot.yml (on: push)
Jobs: validate-source (45s); Matrix: test-spark; Matrix: validate-bundles

Annotations

6 errors and 114 warnings
test-spark (scala-2.12, spark3.1, hudi-spark-datasource/hudi-spark3.1.x)
Error closing metadata file system view.
test-spark (scala-2.12, spark3.1, hudi-spark-datasource/hudi-spark3.1.x)
Error closing metadata file system view.
test-spark (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Error closing metadata file system view.
test-spark (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Error closing metadata file system view.
test-spark (scala-2.12, spark3.2, hudi-spark-datasource/hudi-spark3.2.x)
Error closing metadata file system view.
test-spark (scala-2.12, spark3.2, hudi-spark-datasource/hudi-spark3.2.x)
Error closing metadata file system view.
The following jobs each emitted this deprecation warning:
"The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/"
test-flink (flink1.14)
test-flink (flink1.15)
test-flink (flink1.16)
test-flink (flink1.13)
test-spark (scala-2.12, spark2.4, hudi-spark-datasource/hudi-spark2)
test-spark (scala-2.11, spark2.4, hudi-spark-datasource/hudi-spark2)
validate-source (actions/setup-java@v2 only)
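These warnings come from action versions that still run on the deprecated Node 12 runtime. A minimal sketch of the usual remediation, assuming the workflow's steps look like the ones the warnings name, is to bump the pinned action versions in bot.yml (the `with:` inputs below are illustrative assumptions, not taken from this workflow):

```yaml
# Hypothetical excerpt from .github/workflows/bot.yml:
# upgrading the Node 12-based v2 actions to the Node 16-based v3
# releases silences this deprecation warning.
steps:
  - uses: actions/checkout@v3      # was: actions/checkout@v2
  - uses: actions/setup-java@v3    # was: actions/setup-java@v2
    with:
      distribution: 'temurin'      # assumption: any supported JDK distribution
      java-version: '8'            # assumption: build JDK version
```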
validate-bundles (flink1.13, spark3.1, spark3.1.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh validating utilities bundle
validate.sh running deltastreamer
validate.sh done with deltastreamer
(the validate.sh sequence above was emitted twice by this job)
validate-bundles (flink1.14, spark3.2, spark3.2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate.sh validating utilities slim bundle
validate.sh running deltastreamer
(the validate.sh sequence above was emitted twice by this job)
validate-bundles (flink1.15, spark3.3, spark3.3.1)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate.sh validating utilities slim bundle
validate.sh running deltastreamer
(the validate.sh sequence above was emitted three times by this job)
validate-bundles (flink1.16, spark3.3, spark3.3.2)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/setup-java@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
validate.sh validating spark & hadoop-mr bundle
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate.sh Query and validate the results using Spark SQL
validate.sh Query and validate the results using HiveQL
validate.sh spark & hadoop-mr bundles validation was successful.
validate.sh done validating spark & hadoop-mr bundle
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate.sh validating utilities slim bundle
validate.sh running deltastreamer
(the validate.sh sequence above was emitted three times by this job)
The following jobs also emitted the node12 deprecation warning (actions/checkout@v2, actions/setup-java@v2):
test-spark (scala-2.12, spark3.1, hudi-spark-datasource/hudi-spark3.1.x)
test-spark (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark (scala-2.12, spark3.2, hudi-spark-datasource/hudi-spark3.2.x)