
[MINOR][Testing][DNM] Release 0.15.0 test bundle validation #24

Re-run triggered: May 27, 2024 00:25
Status: Failure
Total duration: 4m 21s
Matrix: validate-release-candidate-bundles

Annotations

26 errors and 110 warnings
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
Unexpected exception. This is a bug. Please consider filing an issue.
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh Flink bundle validation failed.
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
Process completed with exit code 1.
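
Note: the Flink bundle check that failed here lives inside validate.sh and its actual commands are not captured in this log. Purely as an illustration of the kind of round-trip such a check performs, a minimal Flink Table API sketch could look like the following (table name, schema, and path are hypothetical, and the Hudi Flink bundle is assumed to be on the classpath):

    import org.apache.flink.table.api.{EnvironmentSettings, TableEnvironment}

    // Illustrative sketch only; not the actual validate.sh logic.
    val settings = EnvironmentSettings.newInstance().inStreamingMode().build()
    val tEnv = TableEnvironment.create(settings)

    // Create a Hudi-backed table through the Flink SQL connector (path is a placeholder).
    tEnv.executeSql(
      """CREATE TABLE hudi_rc_check (
        |  id INT PRIMARY KEY NOT ENFORCED,
        |  name STRING,
        |  ts TIMESTAMP(3)
        |) WITH (
        |  'connector' = 'hudi',
        |  'path' = 'file:///tmp/hudi_rc_check',
        |  'table.type' = 'COPY_ON_WRITE'
        |)""".stripMargin)

    // Write a row and read it back; a mismatch (or an exception like the one above) fails the check.
    tEnv.executeSql("INSERT INTO hudi_rc_check VALUES (1, 'a', TIMESTAMP '2024-05-27 00:00:00')").await()
    tEnv.executeSql("SELECT * FROM hudi_rc_check").print()
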
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
Unexpected exception. This is a bug. Please consider filing an issue.
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
Unexpected exception. This is a bug. Please consider filing an issue.
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
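
If that leak check has to be relaxed rather than fixed (for example while validating a third-party bundle), the warning above points at the 'classloader.check-leaked-classloader' option. A minimal sketch of setting it on a locally created environment, assuming a Scala job drives the check, might be:

    import org.apache.flink.configuration.Configuration
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

    // Disable the leaked-classloader check, as suggested by the warning above (assumed local setup).
    val conf = new Configuration()
    conf.setString("classloader.check-leaked-classloader", "false")
    val env = StreamExecutionEnvironment.getExecutionEnvironment(conf)

The same key can equally be set in flink-conf.yaml for a standalone cluster.
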
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
The job was canceled because "scala-2_12_flink1_15_spar" failed.
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/setup-java@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh Query and validate the results using HiveQL
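
For reference, the two steps above ("Writing sample data via Spark DataSource and run Hive Sync" and the Spark SQL validation) correspond to the kind of write/read round-trip sketched below. This is an illustrative approximation, not the actual validate.sh code: the table name, sample rows, base path, and metastore URI are all assumptions.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // Local SparkSession with the Hudi Spark bundle assumed to be on the classpath.
    val spark = SparkSession.builder()
      .appName("hudi-rc-bundle-check")
      .master("local[2]")
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data standing in for the records validate.sh writes.
    val df = Seq((1, "a", 1000L, "2024-05-27"), (2, "b", 2000L, "2024-05-27"))
      .toDF("id", "name", "ts", "dt")

    df.write.format("hudi")
      .option("hoodie.table.name", "bundle_check")
      .option("hoodie.datasource.write.recordkey.field", "id")
      .option("hoodie.datasource.write.precombine.field", "ts")
      .option("hoodie.datasource.write.partitionpath.field", "dt")
      // Hive sync against the metastore set up in the earlier step (URI is a placeholder).
      .option("hoodie.datasource.hive_sync.enable", "true")
      .option("hoodie.datasource.hive_sync.mode", "hms")
      .option("hoodie.datasource.hive_sync.metastore.uris", "thrift://localhost:9083")
      .mode(SaveMode.Overwrite)
      .save("file:///tmp/bundle_check")

    // Spark SQL validation: read back through the Hudi DataSource and check the row count.
    val readBack = spark.read.format("hudi").load("file:///tmp/bundle_check")
    readBack.createOrReplaceTempView("bundle_check_rt")
    val count = spark.sql("SELECT COUNT(*) FROM bundle_check_rt").first().getLong(0)
    assert(count == df.count(), s"row count mismatch: $count")

The HiveQL step then runs an equivalent query against the synced Hive table (typically through Beeline) to confirm the Hive-side view of the data matches what Spark wrote.
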
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.2, spark3.2.3)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.4, spark3.4.0)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark2.4, spark2.4.8)
validate.sh validating utilities bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.1, spark3.1.3)
validate.sh validating utilities bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.0, spark3.0.2)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh validating utilities bundle
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh running deltastreamer
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.11, flink1.14, spark, spark2.4.8)
validate.sh done with deltastreamer
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3, spark3.5.0)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.3, spark3.3.2)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.18, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.0)
validate.sh done with deltastreamer
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh Query and validate the results using Spark SQL
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh Query and validate the results using HiveQL
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
Use default java runtime under /opt/java/openjdk
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh done validating spark & hadoop-mr bundle
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.3, spark3.3.1)
validate.sh validating utilities slim bundle