[SPARK-47645][BUILD][CORE][SQL][YARN] Make Spark build with -release instead of -target #45716

LuciferYang wants to merge 14 commits into apache:master

Conversation
.github/workflows/build_and_test.yml
Outdated
```diff
  # export JAVA_VERSION=${{ matrix.java }}
  # It uses Maven's 'install' intentionally, see https://github.com/apache/spark/pull/26414.
- ./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Djava.version=${JAVA_VERSION/-ea} install
+ ./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud install
```
This is mainly used to verify Maven's cross-compilation scenario: using Java 21 to compile with `-release 17`.
```diff
  val timeZoneOffset = TimeZone.getDefault match {
-   case zoneInfo: ZoneInfo => zoneInfo.getOffsetsByWall(localMillis, null)
+   case zoneInfo: TimeZone if zoneInfo.getClass.getName == zoneInfoClassName =>
    case timeZone: TimeZone => timeZone.getOffset(localMillis - timeZone.getRawOffset)
```
To fix the following compilation errors:

```
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:27: object util is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: not found: type ZoneInfo
Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: value getOffsetsByWall is not a member of java.util.TimeZone
```

Maybe we can just use `val timeZoneOffset = TimeZone.getDefault.getOffset(localMillis)`, but I'm not sure which case verifies the compatibility issue with 2.4.
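For illustration, one public-API alternative would be to resolve the wall-clock offset through `java.time` zone rules. This is only a hedged sketch of the idea, not necessarily what the PR settled on; `wallOffsetMillis` is a hypothetical helper name:

```scala
import java.time.{LocalDateTime, ZoneId, ZoneOffset}

// Hypothetical sketch: compute the offset in effect for a wall-clock
// millisecond value using the public java.time API, as a possible
// substitute for the internal sun.util.calendar.ZoneInfo.getOffsetsByWall.
def wallOffsetMillis(localMillis: Long, zone: ZoneId): Int = {
  // Interpret localMillis as a local (wall-clock) date-time since the epoch.
  val wall = LocalDateTime.ofEpochSecond(
    Math.floorDiv(localMillis, 1000L),
    (Math.floorMod(localMillis, 1000L) * 1000000L).toInt,
    ZoneOffset.UTC)
  // Ask the zone rules which offset applies at that local date-time.
  zone.getRules.getOffset(wall).getTotalSeconds * 1000
}
```

Note that `ZoneRules.getOffset(LocalDateTime)` picks a "best" offset for times that fall into a DST gap or overlap, which may differ subtly from the old `ZoneInfo` behavior.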
Let me investigate whether there are other public APIs that can be used as a substitute here.
```diff
  val decoder = Charset.forName(enc).newDecoder()

- StreamDecoder.forDecoder(byteChannel, decoder, decodingBufferSize)
+ Channels.newReader(byteChannel, decoder, decodingBufferSize)
```
To fix the following compilation errors:

```
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:27: object nio is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:61: not found: type StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:67: not found: value StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/CreateJacksonParser.scala:27: Unused import
```
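The public-API replacement can be sketched as follows. `Channels.newReader` wraps a `ReadableByteChannel` with an explicit `CharsetDecoder`, which is what the internal `sun.nio.cs.StreamDecoder.forDecoder` provided; the byte source and buffer size below are only illustrative:

```scala
import java.io.ByteArrayInputStream
import java.nio.channels.Channels
import java.nio.charset.Charset

// Minimal sketch: decode bytes from a channel through a CharsetDecoder
// using only public java.nio APIs.
val bytes = "hello".getBytes("UTF-8")
val byteChannel = Channels.newChannel(new ByteArrayInputStream(bytes))
val decoder = Charset.forName("UTF-8").newDecoder()
// Channels.newReader(ReadableByteChannel, CharsetDecoder, minBufferCapacity)
val reader = Channels.newReader(byteChannel, decoder, 8192)
val buf = new Array[Char](5)
val n = reader.read(buf)  // reads "hello"
```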
```diff
  val decoder = Charset.forName(enc).newDecoder()

- StreamDecoder.forDecoder(byteChannel, decoder, decodingBufferSize)
+ Channels.newReader(byteChannel, decoder, decodingBufferSize)
```
To fix the following compilation errors:

```
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:27: object nio is not a member of package sun
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:78: not found: type StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:84: not found: value StreamDecoder
Error: ] /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/xml/CreateXmlParser.scala:27: Unused import
```
```diff
  package org.apache.spark.deploy.yarn

- import java.io.{FileSystem => _, _}
+ import java.io.{File, FileFilter, FileNotFoundException, FileOutputStream, InterruptedIOException, IOException, OutputStreamWriter}
```
To fix the following compilation error under `-release 17`:

```
Error: ] /home/runner/work/spark/spark/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:20: object FileSystem is not a member of package java.io
```
```scala
val mh = lookup.unreflectConstructor(constructor)
val action = mh.invoke("sun.io.serialization.extendedDebugInfo")
  .asInstanceOf[PrivilegedAction[Boolean]]
!AccessController.doPrivileged(action).booleanValue()
```
`MethodHandle` is used here to keep the logic consistent with the original. Since `sun.security.action` is not an exported package, we could also simply use `!JBoolean.getBoolean("sun.io.serialization.extendedDebugInfo")`.
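The simpler alternative can be sketched like this. `java.lang.Boolean.getBoolean` reads a system property and interprets it as a boolean, avoiding both `sun.security.action.GetBooleanAction` and the `MethodHandle` machinery; the variable name and the property-setting line are only for demonstration:

```scala
// Set the property here purely so the sketch is self-contained.
System.setProperty("sun.io.serialization.extendedDebugInfo", "true")

// Boolean.getBoolean returns true iff the named system property exists
// and equals "true" (case-insensitive); negate it as the original code did.
val enableDebugging = !java.lang.Boolean.getBoolean("sun.io.serialization.extendedDebugInfo")
```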
```diff
  <scalatest-maven-plugin.version>2.2.0</scalatest-maven-plugin.version>
- <!-- don't upgrade scala-maven-plugin to version 4.7.2 or higher, see SPARK-45144 for details -->
- <scala-maven-plugin.version>4.7.1</scala-maven-plugin.version>
+ <scala-maven-plugin.version>4.8.1</scala-maven-plugin.version>
```
To fix `[ERROR] -release cannot be less than -target`, it seems necessary to upgrade to 4.8.1.
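For reference, the resulting compiler configuration would look roughly like the fragment below. This is a hedged sketch, not the verbatim `pom.xml` change; the plugin coordinates are real, but the surrounding configuration and the hard-coded `17` are illustrative:

```xml
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.8.1</version>
  <configuration>
    <args>
      <!-- -release replaces the deprecated -source/-target pair -->
      <arg>-release</arg>
      <arg>17</arg>
    </args>
  </configuration>
</plugin>
```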
Could you make the CI happy, @LuciferYang?

Re-triggered the failed task; let's see if it can pass.
dongjoon-hyun left a comment:

+1, LGTM.
cc @srowen, @JoshRosen, @HyukjinKwon, too.

Merged to master. Thank you, @LuciferYang and all.

Thanks @dongjoon-hyun ~

This PR seems to break the local environment for Maven in IntelliJ. A couple of us are getting this response when trying to run anything using Maven, even after rebuilding.

@mihailom-db Could you provide a specific build command? The Maven compilation in the GA test has passed.

This seems to have solved the problem. Thanks.
### What changes were proposed in this pull request?

This PR aims to remove the `ignore.symbol.file` Javac option.

### Why are the changes needed?

Since Apache Spark 4.0.0, we are building with `-release`, which means `Javac` always uses `ct.sym`, a stripped-down version of the standard library (like rt.jar in older JDKs) containing only the necessary class stubs and signature data for the documented APIs of a specific Java release. In other words, the `ignore.symbol.file` option is ignored now and we can remove it safely.

- #45716

### Does this PR introduce _any_ user-facing change?

No behavior change.

### How was this patch tested?

Pass the CIs.

**BEFORE**
```
$ git grep ignore.symbol.file | wc -l
4
```

**AFTER**
```
$ git grep ignore.symbol.file | wc -l
0
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #52802 from dongjoon-hyun/SPARK-54100.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
What changes were proposed in this pull request?
This PR makes the following changes to allow Spark to build with `-release` instead of `-target`:

- Use `MethodHandle` instead of direct calls to `sun.security.action.GetBooleanAction` and `sun.util.calendar.ZoneInfo`, because they are not `exports` APIs.
- Use `Channels.newReader` instead of `StreamDecoder.forDecoder`, because `StreamDecoder.forDecoder` is also not an `exports` API.
- Replace the wildcard import `java.io.{FileSystem => _, _}` in `yarn/Client.scala` with explicit imports to fix the compilation error.
- Replace `-target` with `-release` in `pom.xml` and `SparkBuild.scala`, and remove the `-source` option, because using `-release` is sufficient.
- Upgrade `scala-maven-plugin` from 4.7.1 to 4.8.1 to fix the error `[ERROR] -release cannot be less than -target` when executing `build/mvn clean install -DskipTests -Djava.version=21`.

Why are the changes needed?

After Scala 2.13.9, the compile option `-target` has been deprecated and it is recommended to use `-release` instead: "-release more useful, deprecate -target, align with Scala 3" (scala/scala#9982).

Does this PR introduce any user-facing change?
No
How was this patch tested?
Pass GitHub Actions
Was this patch authored or co-authored using generative AI tooling?
No