Conversation

@kszucs (Member) commented Apr 13, 2018

The build itself fails too:

[info] Compiling 103 Scala sources and 6 Java sources to /apache-arrow/spark/streaming/target/scala-2.11/classes...
[error] Compile failed at Apr 13, 2018 9:50:50 AM [1:05.284s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 17.349 s]
[INFO] Spark Project Tags ................................. SUCCESS [ 20.587 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 12.047 s]
[INFO] Spark Project Local DB ............................. SUCCESS [  8.086 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 18.759 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.423 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 19.453 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 17.220 s]
[INFO] Spark Project Core ................................. SUCCESS [12:40 min]
[INFO] Spark Project ML Local Library ..................... SUCCESS [ 32.734 s]
[INFO] Spark Project GraphX ............................... SUCCESS [01:02 min]
[INFO] Spark Project Streaming ............................ FAILURE [01:09 min]
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:29 min
[INFO] Finished at: 2018-04-13T09:50:50Z
[INFO] Final Memory: 59M/741M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-streaming_2.11

Should I create a JIRA ticket?

@xhochy (Member) commented Apr 21, 2018

cc @BryanCutler

@BryanCutler (Member) commented:

Seems ok to me. I don't normally run the Spark integration tests through Docker Compose; what exactly is the advantage of doing that?

It looks like the first part of the issue stated in the JIRA was in building Arrow C++; could you clarify why the Spark build failed? I can't tell from the output above, and I'm not sure why this would cause the Spark Java build to fail.

@kszucs (Member, Author) commented Apr 26, 2018

Reproducibility, I guess, but honestly I was just trying to execute dev/run_docker_compose.sh spark_integration and it failed with the error in the JIRA ticket. This PR just resolves that permission error (which is indeed C++ related).

Once I was able to run the command above, the streaming test failed. I didn't investigate further. I think the execution of these integration tests (including the Dask and HDFS ones) should be automated.

@BryanCutler (Member) left a review comment:

I reproduced the permission problem when running from docker-compose, and the patch here fixed it.

After that, I was able to run the integration tests fully (using my Spark branch https://github.com/BryanCutler/spark/tree/arrow-upgrade-090, since there were API changes).

@BryanCutler (Member) commented:

Would you mind changing the arguments in the docker-compose command so that all of them are passed through, not just the first? That is, change "${1}" to "${@}". This allows running the script as follows, mapping the local Maven repo into the image so the artifacts don't have to be re-downloaded if they are already cached locally:

run_docker_compose.sh -v $HOME/.m2/:/root/.m2 spark_integration
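The suggested change is the standard Bash distinction between "${1}" (first positional parameter only) and "${@}" (all parameters, individually quoted). A minimal sketch of the effect, using a stand-in echo instead of a real docker-compose call (this is illustrative only, not the actual contents of dev/run_docker_compose.sh):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: why "${@}" matters in a docker-compose wrapper.

fake_docker_compose() {
  # Stand-in for `docker-compose run`; just echoes what it received.
  echo "docker-compose run --rm $*"
}

run_first_only() {
  # Forwards only the first argument; extra flags are silently dropped.
  fake_docker_compose "${1}"
}

run_all_args() {
  # "${@}" forwards every argument intact, including volume mounts.
  fake_docker_compose "${@}"
}

run_first_only -v "$HOME/.m2/:/root/.m2" spark_integration
run_all_args   -v "$HOME/.m2/:/root/.m2" spark_integration
```

With "${1}", the invocation above would hand docker-compose only the bare `-v` flag; with "${@}", both the volume mapping and the `spark_integration` service name come through.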

@kszucs (Member, Author) commented:

Sure!

@kszucs (Member, Author) commented Apr 27, 2018

@BryanCutler Should we create a ticket about running nightly integration tests?

@BryanCutler (Member) left a review comment:

LGTM. AFAIK, CI doesn't run this script, so the CI failure is something else.

@BryanCutler (Member) commented:

> Should we create a ticket about running nightly integration tests?

Yes, please do; it would be great to get that going.

@BryanCutler (Member) commented:

Merged to master, thanks @kszucs!
