Support Java 17 `sealed` in Java sources (#10348)
Conversation
Force-pushed f2a167c to 8751aeb
Force-pushed 8751aeb to eeac512
The updated commit just "registers" Java classes and matches for completion after typer, where (per lrytz) either there is a `permits` clause or we look at classes from the same source file. The simple implementation could be improved or complexified further. Or simplified.
Force-pushed eeac512 to 35c0384
Thanks, great to see that we can make it work! Why do you think we should delay until after typer? Forcing is what the typer does; I don't see a particular risk of running into cycles here, and we don't need to infer anything to complete these class symbols. For finding symbols defined in a particular compilation unit, I also think we need new global state; there doesn't seem to be anything we can reuse. A question is where to put it, but I guess it doesn't matter too much. Here's a draft, what do you think (last commit)? 2.13.x...lrytz:scala:pr10348
Yes, that is what I was thinking of (a `perRunCaches` Map). I thought keeping the logic in one place was simpler: "children we know about" and "other offspring". Oh, I see what you mean; we've already seen all the classes for this unit, so nothing is gained by waiting. The other piece I started looking at was emitting the `PermittedSubclasses` attribute for Scala `sealed`; I'll try to add a commit for that, imitating some of the structure for `innerClasses`. A final clean-up is to review the tests for duplication and completeness. Also, this is a patmat feature, so I like that putting the unwieldy sources map there makes it look that way.
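As background for emitting the attribute: a minimal, hypothetical sketch (class names are made up, not from this PR) of how javac records sealedness in the classfile. The `PermittedSubclasses` attribute written by javac is what core reflection reads back via `Class.isSealed` and `Class.getPermittedSubclasses` (Java 17+):

```java
// SealedDemo.java -- illustrative sketch, requires Java 17+.
// javac writes a PermittedSubclasses attribute for the sealed interface;
// reflection surfaces it via isSealed() and getPermittedSubclasses().
public class SealedDemo {
    sealed interface Shape permits Circle, Square {}
    record Circle(double r) implements Shape {}     // records are implicitly final
    record Square(double side) implements Shape {}

    public static void main(String[] args) {
        System.out.println(Shape.class.isSealed());
        for (Class<?> c : Shape.class.getPermittedSubclasses()) {
            System.out.println(c.getSimpleName());
        }
    }
}
```

Emitting the same attribute for Scala `sealed` would make Scala hierarchies look sealed to javac and to Java reflection in the same way.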
I added your commits and will push after cleaning up my detritus. Also, I haven't touched the Java parser to look more like your earlier edit, which I remember preferring.
Force-pushed 35c0384 to a646610
```scala
val pluginsTp = pluginsTypeSig(res, typer, cdef, WildcardType)
cdef.getAndRemoveAttachment[PermittedSubclasses].foreach { permitted =>
  clazz.updateAttachment[PermittedSubclassSymbols] {
    PermittedSubclassSymbols(permitted.permits.map(typer.typed(_, Mode.NOmode).symbol))
```
What happens to errors on a bad permits clause here?
lrytz left a comment:
We forgot about inner classes...

```java
// X.java
package p;
public sealed interface X {
  public default int x() { return 1; }
}
final class Y implements X { }
final class O {
  final static class Z implements X { }
}
```

```scala
// T.scala
package p
class T {
  def n(a: X) = a match { case _: Y => 0 }
}
```
Should warn but doesn't in mixed compilation. Namer eagerly creates symbols for top-level classes, but symbols for inner classes are only created when the outer class completes. So if there's a sealed Java type without a permits clause, we probably have to recursively force all classes defined in the same compilation unit.
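For reference, Java itself allows omitting the permits clause when every direct subtype lives in the same compilation unit; javac then infers the permitted subclasses, including nested ones, which is the case the Scala side has to reproduce by forcing the whole unit. A small illustrative sketch (names are made up, not from this PR; requires Java 17+):

```java
// PermitsInference.java -- illustrative sketch, requires Java 17+.
// No explicit permits clause: javac infers the permitted subclasses
// from the declarations in this same compilation unit.
public class PermitsInference {
    sealed interface X {}                    // permits inferred: Y, Z
    static final class Y implements X {}
    static final class Z implements X {}     // a nested case is picked up too

    public static void main(String[] args) {
        for (Class<?> c : X.class.getPermittedSubclasses()) {
            System.out.println(c.getSimpleName());
        }
    }
}
```

The inferred permitted subclasses end up in the classfile's `PermittedSubclasses` attribute just as if they had been listed explicitly.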
```
X.java:9: error: anonymous classes must not extend sealed classes
    X bar = new X() { };
            ^
```
😅 OK, at least that
@retronym brought runtime reflection to the table:

```scala
scala> import scala.reflect.runtime.{universe => ru}
import scala.reflect.runtime.{universe=>ru}

scala> ru.typeTag[p.X].tpe.typeSymbol.asClass.knownDirectSubclasses
val res0: Set[reflect.runtime.universe.Symbol] = Set()
```
Also macros / compiler plugins might want to see children, not only the exhaustivity checker. But I think it's fine to defer that.
sealed in Java sources [ci: last-only]
Force-pushed 2bf8131 to 62c5acf
Retitled from "sealed in Java sources [ci: last-only]" to "sealed in Java sources"
@lrytz this will be green after squashing. I guess "zucchini", as a green squash. I'll take another look at the tweak to populating the sealed class hierarchy. It needs tests with …
Required for checking permitted subclasses and for exhaustiveness of patterns. Co-authored-by: Lukas Rytz <lukas.rytz@gmail.com>
Force-pushed 9c3e87d to a0f546d
### What changes were proposed in this pull request?

This PR aims to upgrade Scala to 2.13.11 - https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So these two issues should not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes: https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA; all tests passed
  - Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
  - Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
### What changes were proposed in this pull request?

This PR aims to re-upgrade Scala to 2.13.11; after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist.

- https://www.scala-lang.org/news/2.13.11

Additionally, this PR adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?

This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are 2 known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings related to `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; there is just https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So these two issues should not affect Spark itself, but that doesn't mean they won't affect code written by end users.

The full release notes: https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?

Yes, this is a Scala version change.

### How was this patch tested?

- Existing tests

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Support `sealed` in Java sources. The related PR supports separate compilation, where the `PermittedSubclasses` attribute specifies sealedness; there seems to be no ticket for that, alas. This PR is Java interop gravy.

Fixes scala/bug#12171
Fixes scala/bug#12159