ARROW-2452: [TEST] Spark integration test fails with permission error #1890
Conversation
cc @BryanCutler
Seems ok to me. I don't normally run the Spark integration tests through Docker Compose; what exactly is the advantage of doing that? It looks like the first part of the issue stated in the JIRA was in building Arrow C++, so could you clarify why the Spark build failed? I can't tell from the output above, and I'm not sure why this would cause the Spark Java build to fail.
Reproducibility, I guess, but honestly I was just trying to execute it. After I was able to run the command above, the streaming test failed. I didn't investigate further. I think the execution of these integration tests (including Dask and HDFS) should be automated.
BryanCutler left a comment
I reproduced the permission problem when running from docker-compose, and the patch here fixed it.
After that, I was able to run the integration tests fully (using my Spark branch https://github.com/BryanCutler/spark/tree/arrow-upgrade-090, since there were API changes).
dev/run_docker_compose.sh
Outdated
Would you mind changing the arguments in the docker-compose command to pass all of them, not just the first? That is, change "${1}" to "${@}". This will allow running it like the following, mapping the local Maven repo into the image to avoid re-downloading artifacts that have already been downloaded locally:
run_docker_compose.sh -v $HOME/.m2/:/root/.m2 spark_integration
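To illustrate why this matters (a minimal standalone sketch, not the actual script): a wrapper that forwards only `"${1}"` drops every argument after the first, so the `-v` mount above would never reach docker-compose, while `"${@}"` preserves all arguments with their quoting intact.

```shell
#!/usr/bin/env bash
# Sketch of the difference between forwarding "${1}" and "${@}".
# The function names here are illustrative, not from the real script.

# Forwards only the first argument; everything else is silently dropped.
forward_first() { printf '%s\n' "${1}"; }

# Forwards every argument, one per line, with quoting preserved.
forward_all()   { printf '%s\n' "${@}"; }

# prints only: -v
forward_first -v "$HOME/.m2/:/root/.m2" spark_integration

# prints all three arguments, one per line
forward_all -v "$HOME/.m2/:/root/.m2" spark_integration
```

The same principle applies when the forwarded command is `docker-compose run`: only `"${@}"` lets extra flags like the volume mount travel through the wrapper.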
Sure!
@BryanCutler Should we create a ticket about running nightly integration tests?
BryanCutler left a comment
LGTM. AFAIK, CI doesn't run this script, so the failure is something else.
Yes, please do; it would be great to get that going.
merged to master, thanks @kszucs
The build itself fails too:
Should I create a JIRA ticket?