[SPARK-36796][BUILD][CORE][SQL] Pass all sql/core and dependent modules UTs with JDK 17 except one case in postgreSQL/text.sql
#34153
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
Test build #143745 has finished for PR 34153 at commit
Kubernetes integration test unable to build dist. exiting with code: 1
Test build #143746 has finished for PR 34153 at commit
Changed the title: sql/core and dependent modules UTs with JDK 17 → sql/core and dependent modules UTs with JDK 17 and Maven
Kubernetes integration test unable to build dist. exiting with code: 1
Test build #143747 has finished for PR 34153 at commit
"--add-opens java.base/java.net=ALL-UNNAMED",
"--add-opens java.base/java.io=ALL-UNNAMED",
"--add-opens java.base/java.util.concurrent=ALL-UNNAMED",
"--add-exports java.base/jdk.internal.util.random=ALL-UNNAMED")
Thank you, @LuciferYang !
BTW, I didn't realize that we needed `--add-exports`. Could you explain a little more about this?
`--add-exports` is not needed for the current PR; I have already removed it.
However, it is needed in the mllib-local and mllib modules: some test code, such as `mock[java.util.Random]`, requires this `--add-exports`.
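As a quick way to see why that flag matters, one can ask the module system whether `java.base` exports `jdk.internal.util.random` to an unnamed module; without `--add-exports=java.base/jdk.internal.util.random=ALL-UNNAMED` it does not, which is why Mockito's generated subclass of `java.util.Random` can fail on JDK 17. A minimal probe (the check uses the standard `java.lang.Module` API; the connection to Mockito is as described in the comment above):

```java
// Prints whether java.base exports jdk.internal.util.random to this
// (unnamed) module. Without the --add-exports flag this is "false",
// which is the condition that breaks mock[java.util.Random] in tests.
public class ExportCheck {
    public static void main(String[] args) {
        Module javaBase = ModuleLayer.boot().findModule("java.base").orElseThrow();
        boolean exported =
            javaBase.isExported("jdk.internal.util.random", ExportCheck.class.getModule());
        System.out.println("jdk.internal.util.random exported: " + exported);
    }
}
```

Running it with `--add-exports=java.base/jdk.internal.util.random=ALL-UNNAMED` flips the result to `true`.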
dongjoon-hyun
left a comment
Instead of adding a jdk-17 profile, you can always add the options like the following. Last time, I learned that `--add-opens=` does the trick, on the JIRA where you mentioned your idea.
-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.nio=ALL-UNNAMED ...
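The reason a single option set can serve both JDKs is that `-XX:+IgnoreUnrecognizedVMOptions` makes a Java 8 JVM silently skip the module flags it does not understand, while a Java 17 JVM applies them. An illustrative invocation (the option list here is a trimmed subset, not the PR's full set):

```shell
# On JDK 8 the --add-opens flags are unrecognized but silently ignored
# thanks to -XX:+IgnoreUnrecognizedVMOptions; on JDK 17 they take effect.
java -XX:+IgnoreUnrecognizedVMOptions \
     --add-opens=java.base/java.nio=ALL-UNNAMED \
     --add-opens=java.base/java.lang=ALL-UNNAMED \
     -version
```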
object JavaModuleUtils {
  private val javaModuleOptions = Set("--add-opens java.base/java.nio=ALL-UNNAMED",
Let's use `=` instead of a space, like `--add-opens=java.base/java.nio=ALL-UNNAMED`.
ok
Kubernetes integration test unable to build dist. exiting with code: 1
Test build #143748 has finished for PR 34153 at commit
Kubernetes integration test unable to build dist. exiting with code: 1
Kubernetes integration test starting
Kubernetes integration test starting
Changed the title: sql/core and dependent modules UTs with JDK 17 and Maven → sql/core and dependent modules UTs use Maven with JDK 17
Changed the title: sql/core and dependent modules UTs use Maven with JDK 17 → sql/core and dependent modules UTs with JDK 17
Changed the title: sql/core and dependent modules UTs with JDK 17 → sql/core and dependent modules UTs with JDK 17 except one case in postgreSQL/text.sql
I created a new JIRA, SPARK-37013, and #34153 to try to fix the different string format behavior between Java 8 and Java 17; the relevant code has been removed from this PR. At present, this PR only adds the default
If this PR is merged, we can continue to verify other modules.
Kubernetes integration test starting
Kubernetes integration test starting
Kubernetes integration test status failure
Kubernetes integration test status failure
Test build #144383 has finished for PR 34153 at commit
Test build #144384 has finished for PR 34153 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #144396 has finished for PR 34153 at commit
Merged to master
Thank you so much, @LuciferYang and @srowen and all!
thanks all ~
…s` to drivers

### What changes were proposed in this pull request?

This PR aims to make `StandaloneRestServer` add `JavaModuleOptions` to drivers by default.

### Why are the changes needed?

Since Apache Spark 3.3.0 (SPARK-36796, #34153), `SparkContext` adds `JavaModuleOptions` by default. We had better add `JavaModuleOptions` when `StandaloneRestServer` receives submissions via REST API, too. Otherwise, it fails like the following if the users don't set it manually.

**SUBMISSION**
```bash
$ SPARK_MASTER_OPTS="-Dspark.master.rest.enabled=true" sbin/start-master.sh
$ curl -s -k -XPOST http://yourserver:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
    "appResource": "",
    "sparkProperties": {
      "spark.master": "local[2]",
      "spark.app.name": "Test 1",
      "spark.submit.deployMode": "cluster",
      "spark.jars": "/Users/dongjoon/APACHE/spark-release/spark-3.5.0-bin-hadoop3/examples/jars/spark-examples_2.12-3.5.0.jar"
    },
    "clientSparkVersion": "",
    "mainClass": "org.apache.spark.examples.SparkPi",
    "environmentVariables": {},
    "action": "CreateSubmissionRequest",
    "appArgs": []
  }'
```

**DRIVER `stderr` LOG**
```
Exception in thread "main" java.lang.reflect.InvocationTargetException
...
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module 0x6d7a93c9) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module 0x6d7a93c9
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Pass the CIs with the newly added test case.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #42975 from dongjoon-hyun/SPARK-45197.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Hi guys, is there a Spark version today that fully supports Java 17? I'm trying to run a project locally that uses Spark and Java 17 (Scala 2.12.15, Spark 3.3.4), and it is failing because it tries to access internal Java APIs. It works when adding:
(This isn't the place for general questions, but:)
Hi @srowen, thanks for replying, will look into that. Have a good day.
What changes were proposed in this pull request?
In order to pass the UTs related to sql/core and dependent modules, this PR mainly makes the following changes:
- Add `extraJavaTestArgs` to `pom.xml`. It includes all `--add-opens` configurations required for UTs with Java 17, and it also includes `-XX:+IgnoreUnrecognizedVMOptions` to stay compatible with Java 8.
- Add `JavaModuleOptions` to place constants and methods related to `--add-opens`.
- The `--add-opens` options and `-XX:+IgnoreUnrecognizedVMOptions` are added by default when `SparkSubmitCommandBuilder` builds the submit command; this helps `local-cluster`-related UTs pass and helps users reduce the cost of using Java 17.
- The `--add-opens` options and `-XX:+IgnoreUnrecognizedVMOptions` are supplemented to `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` when initializing SparkContext; this also helps UTs pass and helps users reduce the cost of using Java 17.

Why are the changes needed?
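The default-option handling described above can be sketched roughly as follows (class and method names here are illustrative assumptions modeled on the PR, not Spark's actual code): a fixed set of `--add-opens` options plus `-XX:+IgnoreUnrecognizedVMOptions` is joined into one string and prepended to the launched JVM's command line.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the JavaModuleOptions idea: a constant list of
// module options prepended to every launched JVM. Names are assumptions.
public class JavaModuleOptionsSketch {
    private static final String[] DEFAULT_MODULE_OPTIONS = {
        "-XX:+IgnoreUnrecognizedVMOptions",
        "--add-opens=java.base/java.lang=ALL-UNNAMED",
        "--add-opens=java.base/java.nio=ALL-UNNAMED",
        "--add-opens=java.base/java.util.concurrent=ALL-UNNAMED"
    };

    /** Returns the default options as a single space-separated string. */
    public static String defaultModuleOptions() {
        return String.join(" ", DEFAULT_MODULE_OPTIONS);
    }

    public static void main(String[] args) {
        // A submit-command builder would insert the options right after
        // the java binary, before the classpath and main class.
        List<String> cmd = new ArrayList<>();
        cmd.add("java");
        cmd.add(defaultModuleOptions());
        cmd.add("org.apache.spark.deploy.SparkSubmit");
        System.out.println(String.join(" ", cmd));
    }
}
```

Because the options are prepended centrally, users do not have to repeat them in `spark.driver.extraJavaOptions` or `spark.executor.extraJavaOptions` themselves.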
Pass Spark UTs with JDK 17
Does this PR introduce any user-facing change?
No
How was this patch tested?
`mvn clean install -pl sql/core -am` with Java 17; only `select format_string('%0$s', 'Hello')` in `postgreSQL/text.sql` failed due to the different behavior of Java 8 and Java 17.
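The failing case comes down to `java.util.Formatter`'s treatment of the out-of-spec argument index `0`: Java 8 was lenient with `%0$s` and resolved it to the first argument, while Java 17 rejects it with an `IllegalFormatException`. A small probe, safe to run on either JDK since it catches the exception:

```java
import java.util.IllegalFormatException;

// Demonstrates the JDK behavior difference behind the postgreSQL/text.sql
// failure: argument index 0 in a format string is lenient on Java 8 and
// rejected on Java 17.
public class FormatIndexProbe {
    public static void main(String[] args) {
        String result;
        try {
            result = String.format("%0$s", "Hello");   // "Hello" on Java 8
        } catch (IllegalFormatException e) {
            // Java 17 throws here (IllegalFormatArgumentIndexException)
            result = "rejected: " + e.getClass().getSimpleName();
        }
        System.out.println(result);
    }
}
```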