[CI] Spark integration tests are failing ("org.apache.arrow.flatbuf.Message not found") #40549
Closed
Description
Failures from last night:
- test-conda-python-3.10-spark-v3.5.0
- test-conda-python-3.11-spark-master
- test-conda-python-3.8-spark-v3.5.0
```
Error: ] /spark/sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowConverters.scala:26: object flatbuf is not a member of package org.apache.arrow
Error: ] /spark/sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowConverters.scala:456: Class org.apache.arrow.flatbuf.Message not found - continuing with a stub.
Error: [ERROR] two errors found
Error: Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.8.0:compile (scala-compile-first) on project spark-sql_2.12: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.8.0:compile failed: org.apache.commons.exec.ExecuteException: Process exited with an error: 255 (Exit value: 255) -> [Help 1]
Error:
Error: To see the full stack trace of the errors, re-run Maven with the -e switch.
Error: Re-run Maven using the -X switch to enable full debug logging.
Error:
Error: For more information about the errors and possible solutions, please read the following articles:
Error: [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
Error:
Error: After correcting the problems, you can resume the build with the command
Error: mvn <args> -rf :spark-sql_2.12
```
Error: `docker-compose --file /home/runner/work/crossbow/crossbow/arrow/docker-compose.yml run --rm -e SETUPTOOLS_SCM_PRETEND_VERSION=16.0.0.dev283 conda-python-spark` exited with a non-zero exit code 1, see the process log above.