Remove invalidation from Global.scala#3884
Conversation
The PR/commit message is a bit short on rationale for why it's safe to remove this functionality. It was originally added to support an experiment with resident compilation. The experiment was performed in sbt and dropped in sbt/sbt@6def08e. Since then we concluded not to work on resident compilation, so it's safe to drop the unused code supporting it. Once all tests pass we should include this explanation in the commit message.
That's too bad. I had versions of partest and the REPL that used it, for test isolation and the :cp command, respectively. Both tools are somewhat "resident" by nature. That's from a while ago, but I thought it would eventually be revived. I suppose that if it's workable, it can be reworked; and there was also the promise of a new classpath mechanism for the compiler anyway.
The motivation for this PR was to make it easier to bring in an alternative classpath implementation; @mpociecha is looking into that. We can also bring back a minimal version of the functionality later, one that works with the new classpath implementation. @mpociecha, since the tests passed, can you edit the commit message to include the background information, including the commits mentioned in this PR? We should document why we think this code is safe to remove.
The invalidation was introduced in these commits: scala@167309a, scala@ace051f, scala@e156d4a. It's safe to remove this functionality: it was originally added to support an experiment with resident compilation. The experiment was performed in sbt and dropped in sbt/sbt@6def08e. Since then the Scala team has concluded not to work on resident compilation, so it's safe to delete the unused code.
LGTM. Thanks!
Remove invalidation from Global.scala
I'll reiterate my comment from SI-6502. Is there a recommended way to append to the compiler's classpath and get the entries reloaded (in case the jar has been updated)? For commands like :cp, throwing out the interpreter and replaying commands is not an ideal solution, especially not in Spark's case. And this still doesn't work in the Scala REPL as of 2.11.2, not just on the 2.10.x branch.
@mpociecha @gkossakowski @rcsenkbeil we just added a dependency on this functionality in ENSIME; are you really sure it's not used anywhere? Will this break our codebase when 2.11.3 comes out? (It certainly looks like it will, unless you're just moving things around.) We're using it to add the compiled classes of a project onto the presentation compiler classpath, so that only the user's active source files need to be loaded, resulting in dramatic performance improvements. ensime/ensime-server@3ef9136#diff-deb75fcaf9fcae72050f24a45f6f32bcR192 What is the alternative? BTW, if you know this code well, perhaps you can also advise on how we could do this for the 2.9 presentation compiler. @edani, were you aware that this was being removed?
OK, people use invalidation, so maybe it would be better to revert these changes? As long as invalidation doesn't work for the flat classpath, I can add checks to see which classpath implementation is in use and apply invalidation only in the case of the old one. In the case of the flat classpath I could throw an UnsupportedOperationException or something like that.
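The guard described above could be sketched as follows. This is a minimal illustration, not the actual compiler API: the `ClasspathImpl` trait, its case objects, and the method name are hypothetical stand-ins for however the compiler distinguishes the two classpath representations.

```scala
// Hypothetical sketch: keep invalidation for the old (recursive) classpath
// representation, and reject it for the flat one. All names here are
// illustrative, not the real scala.tools.nsc API.
sealed trait ClasspathImpl
case object RecursiveClasspath extends ClasspathImpl
case object FlatClasspath extends ClasspathImpl

def invalidateClassPathEntries(impl: ClasspathImpl, paths: List[String]): Unit =
  impl match {
    case RecursiveClasspath =>
      // delegate to the existing invalidation logic (elided here)
      paths.foreach(p => println(s"invalidating $p"))
    case FlatClasspath =>
      throw new UnsupportedOperationException(
        "Classpath invalidation is not supported with the flat classpath")
  }
```

The point of the sketch is only that callers on the old classpath keep working unchanged, while flat-classpath callers fail fast with a clear message instead of silently getting stale entries.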
@mpociecha thanks, putting it back in would be good, especially for any bugfix releases of 2.11. Can you please add ensime-server to your list of integration test projects to avoid this kind of problem in the future?
We've just opened up our community build to contributions to enable a broader set of integration testing. Could you open either a pull request or an issue against that repository so we can include ensime and avoid accidental breakage (at least in minor releases)?
@retronym thanks, we'll do that. @gkossakowski, since this didn't work out in zinc, can you please advise whether there is a better way for us to do this in the ENSIME presentation compiler? The problem we are trying to solve is minimising the number of open files in the presentation compiler by leveraging existing class files for the rest of the project (e.g. the output of continuous compilation from a zinc server).
I wasn't aware that this was being removed either. I'm also curious how other Scala IDEs handle changes that come from rebuilding a project; maybe they just instantiate a new presentation compiler?
@fommil, it wasn't used in any project we knew about at the time this PR was merged. Regarding leveraging class files, would restarting the presentation compiler instance be an option? Given that Scala compilation is rather slow, restarting the presentation compiler shouldn't add significantly to the total turnaround time.
It looks like there is very little penalty in restarting the presentation compiler, so we'll go with that in ENSIME.
Excellent! Thanks for letting us know!
….10) Fixes SI-6502, reenables loading jars into the running REPL (regression in 2.10). This PR allows adding a jar to the compile and runtime classpaths without resetting the REPL state (crucial for Spark, SPARK-3257). This follows the lead taken by @som-snytt in PR scala#3986, which differentiates two jar-loading behaviors (muddled by :cp):
- adding jars and replaying REPL expressions (using :replay)
- adding jars without resetting the REPL (deprecated :cp, introduced :require)

This PR implements :require (left unimplemented in scala#3986). It is a simplification of a similar approach taken by @gkossakowski in scala#3884. In this attempt, we first check that a jar is only added if it contains only new classes/traits/objects; otherwise we emit an error. This differs from the old invalidation approach, which also tracked deleted classpath entries.
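The "only new classes" check described in the commit message could be approximated like this. It's a sketch under assumptions: `onlyNewClasses` and `existingClasses` are illustrative names (the real implementation would consult the compiler's symbol table rather than a plain `Set`), and only top-level `.class` entries are considered.

```scala
// Sketch: decide whether a jar may be added to the running REPL's classpath
// by checking that none of its classes collide with classes already known.
// `existingClasses` stands in for the compiler's view of the classpath;
// the real :require implementation is more involved than this.
import java.util.jar.JarFile
import scala.jdk.CollectionConverters._

def onlyNewClasses(jarPath: String, existingClasses: Set[String]): Boolean = {
  val jar = new JarFile(jarPath)
  try {
    val classNames = jar.entries.asScala
      .map(_.getName)
      .filter(_.endsWith(".class"))
      .map(_.stripSuffix(".class").replace('/', '.'))
    // The jar is acceptable only if every class name is new.
    classNames.forall(name => !existingClasses.contains(name))
  } finally jar.close()
}
```

Rejecting jars that redefine existing classes sidesteps the hard part of the old invalidation code: there is no need to flush stale symbols, because nothing already loaded can change.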

@gkossakowski stated that this code is unnecessary and probably still exists only by accident.
It has to be taken into account when making changes, so it's better to remove this code if it's really unnecessary.