Do not persist winforms data to disk #46391
Conversation
{
    _workQueue = new AsyncBatchingWorkQueue<DesignerAttributeData>(
-       TimeSpan.FromSeconds(1),
+       TimeSpan.FromMilliseconds(500),
what is the rationale for this change?
i actually reduce it even more in the next PR. the point of these delays is to allow us to hear about a large set of results from oop at once, and then report them at once to VS in one go. i.e. we don't want to hear about 100 files, and then send 100 calls to update VS (each of which might hit the UI thread). Instead, it's better to hear about 100 files, then report it in one go to VS on the UI thread once.
So the goal is simply to give enough time to hear about a lot, but not so long that it feels slow to the user for things to update. 250ms seems to work well. It allows for hearing about tons of files, not saturating the UI thread, and providing low latency between hearing about changes and updating the UI with them. Note: my next PR adds a comment on this.
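The coalescing idea described above can be sketched as follows. This is a hypothetical, simplified stand-in for Roslyn's `AsyncBatchingWorkQueue` (the name `SimpleBatchingQueue` and its shape are illustrative, not the real API): items added within the delay window accumulate and are handed to the callback as a single batch, so 100 results cost one trip to the consumer instead of 100.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// A minimal sketch of the batching idea -- NOT Roslyn's actual AsyncBatchingWorkQueue.
public sealed class SimpleBatchingQueue<T>
{
    private readonly TimeSpan _delay;
    private readonly Func<IReadOnlyList<T>, Task> _processBatchAsync;
    private readonly object _gate = new object();
    private List<T> _pending = new List<T>();
    private bool _flushScheduled;

    public SimpleBatchingQueue(TimeSpan delay, Func<IReadOnlyList<T>, Task> processBatchAsync)
    {
        _delay = delay;
        _processBatchAsync = processBatchAsync;
    }

    public void Add(T item)
    {
        lock (_gate)
        {
            _pending.Add(item);
            if (_flushScheduled)
                return; // a flush is already pending; this item rides along with it
            _flushScheduled = true;
        }

        // Wait out the delay window, then hand everything that accumulated to the
        // callback in one batch (e.g. one hit on the UI thread instead of 100).
        Task.Delay(_delay).ContinueWith(async t =>
        {
            List<T> batch;
            lock (_gate)
            {
                batch = _pending;
                _pending = new List<T>();
                _flushScheduled = false;
            }
            await _processBatchAsync(batch);
        });
    }
}
```

With a 500ms (or 250ms) delay, a burst of per-file notifications from OOP collapses into one batched report, which is the tradeoff the comment above describes: long enough to coalesce, short enough to feel responsive.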
"250ms seems to work well."
The PR is changing it to 500ms though?
next pr reduces it further :)
jasonmalinowski left a comment
Looks OK and I'll never complain about the removal of a bunch of storage stuff. A bit worried about the use of ConcurrentDictionary in a place or two where, if various members here can be running in parallel, strange things might happen.
-public override Task RemoveProjectAsync(ProjectId projectId, CancellationToken cancellationToken)
+public override async Task RemoveProjectAsync(ProjectId projectId, CancellationToken cancellationToken)
Does this also get called if we unload the entire solution? I'm guessing we'll see a project removed notification for each project now but would be good to confirm under a debugger. I know the live unit testing code was running into some 'surprises' here and I wouldn't want us to fall into the same trap.
@mavasani for information here. In the case of solution-unload, how do incremental analyzers hear about the 'removal'? Thanks!
Current implementation invokes RemoveDocumentAsync for each document in the solution: http://sourceroslyn.io/#Microsoft.CodeAnalysis.Features/SolutionCrawler/WorkCoordinator.cs,407. It seems it can certainly do batching here and just invoke RemoveProjectAsync for all projects in the solution.
oh interesting. so there's no real point in responding to RemoveProject if i'm also responding to RemoveDocument, correct?
We invoke RemoveProject for enqueued project items for which solution.GetProject(id) returns null, i.e. project has been removed and we cannot find the documents for it.
var changedData =
    latestData.Where(d => !_documentToLastReportedInformation.TryGetValue(d.document.Id, out var existingInfo) || existingInfo.category != d.data.Category)
        .ToImmutableArray();
If the user had a file that we classified, and then edits the code such that ComputeLatestDataAsync no longer returns anything for that file, do we need to report the now lack of data? And clear out the dictionary in that case?
we do report the lack of data. specifically, we report changes from whatever we last reported. We don't actually clear out the dictionary because we want to store in the dictionary the version of the last report, that way we can ignore future events (that don't change the state). If we remove things from the dictionary, that is akin to saying "we've never reported anything", and then we will just keep notifying VS of "it's not designable, it's not designable, it's not designable".
Ah got it; all documents being queried still get added to the dictionary no matter what. Makes sense.
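The "report only what changed, but remember everything" behavior settled in this thread can be sketched like so. `DesignerCategoryReporter` and `FilterToChanged` are hypothetical names, not Roslyn's; the key point is that every computed result is written into the dictionary, even unchanged or "not designable" ones, so recomputing the same answer later produces no new notification to VS.

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the dedup-dictionary idea discussed above.
public sealed class DesignerCategoryReporter
{
    // documentId -> the category we last reported (null = "not designable").
    private readonly ConcurrentDictionary<string, string?> _lastReported =
        new ConcurrentDictionary<string, string?>();

    public IReadOnlyList<(string documentId, string? category)> FilterToChanged(
        IEnumerable<(string documentId, string? category)> latest)
    {
        var latestList = latest.ToList();

        // Only results that differ from the last report are worth notifying about.
        var changed = latestList
            .Where(d => !_lastReported.TryGetValue(d.documentId, out var existing)
                        || existing != d.category)
            .ToList();

        // Record everything we computed -- changed or not, designable or not --
        // so that computing the same answer again is suppressed rather than re-sent.
        foreach (var d in latestList)
            _lastReported[d.documentId] = d.category;

        return changed;
    }
}
```

Removing an entry instead of storing null would mean "we've never reported anything", which is exactly the repeated "it's not designable" re-notification the author describes.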
tasks.Add(ComputeDesignerAttributeDataAsync(
    storage, projectVersion, designerCategoryType, document, cancellationToken));
// If we don't have a path for this document, we cant proceed with it.
Suggested change:
-// If we don't have a path for this document, we cant proceed with it.
+// If we don't have a path for this document, we can't proceed with it.
if (_documentToLastReportedInformation.TryGetValue(document.Id, out var existingInfo) &&
    existingInfo.projectVersion == projectVersion)
What are the threading guarantees here? Can RemoveDocumentAsync earlier in the file be running at the same time as this? Is there some risk that we'd remove the document, then process it here, and add it back?
it cannot be running at the same time. all the incremental-analyzer entrypoints are serialized. the only concurrency is from me spitting out tons of tasks (that i when-all) in one of those incremental-analyzer entrypoints.
@mavasani to confirm.
all the incremental-analyzer entrypoints are serialized
Yes, that is correct
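The concurrency model confirmed above can be sketched with illustrative (non-Roslyn) names: the analyzer entrypoints are serialized with respect to each other, but a single entrypoint fans out per-document work in parallel and when-alls it, which is why the shared state written by those inner tasks is a ConcurrentDictionary.

```csharp
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

// Hypothetical sketch; AnalyzeProjectSketch is not a Roslyn type.
public static class AnalyzeProjectSketch
{
    // Models one serialized entrypoint: no other entrypoint runs concurrently
    // with this, but the per-document tasks it spawns do run in parallel with
    // each other, so the map they all write to must be thread-safe.
    public static async Task<ConcurrentDictionary<int, string>> AnalyzeAsync(int[] documentIds)
    {
        var results = new ConcurrentDictionary<int, string>();

        var tasks = documentIds.Select(async id =>
        {
            await Task.Yield(); // stands in for per-document analysis work
            results[id] = "category-for-" + id;
        });

        await Task.WhenAll(tasks);
        return results;
    }
}
```

If the entrypoints were not serialized, a plain Dictionary here would be unsafe even without the fan-out; given the serialization guarantee, the only races to guard against are between the sibling tasks inside one call.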
// We either haven't computed the designer info, or our data was out of date. We need
// So recompute here. Figure out what the current category is, and if that's different
// from what we previously stored.
Comment seems stale -- should be deleted now?
it's still accurate. we have to compute this and compare against whatever we last reported. the difference now is that we keep track of what we reported in memory instead of on disk.
Well, it's still correct, but previously that function was doing the storage comparison. Now this function is blissfully ignorant of how it's being used -- does the comment move elsewhere then?
i'll revise in next pr.
revised in next pr.
So a core benefit of IncrementalAnalyzer is that it makes its requests to us serially. I do think i could just make this a Dictionary. however, then i have to be very careful about my own reads/writes in the tasks i spin off to process all docs in a project in parallel.
@CyrusNajmabadi Sounds good.
private readonly IPersistentStorageService _storageService;
/// <summary>
/// Keep track of the last information we reported. We will avoid notifying the host if we recompute and these
Should this rather say "the client", not "the host"?
// easy it would be to get out of that state (just edit the file).
await PersistLatestInfosAsync(project.Solution, projectVersion, latestInfos, cancellationToken).ConfigureAwait(false);
// Now, keep track of what we've reported to the host so we won't report unchanged files in the future.
Client?
Revert "Revert "Merge pull request #46391 from CyrusNajmabadi/formPerf""
There is a possible race condition in this code (originally documented) that could lead us to persist data to disk that does not match what the host thinks the state of things is. i.e. we compute something is a form, then remote that information over to VS. The processing of that data on the VS side happens at some point in the future, but we still persist what we think VS knows at that point.
If this gets out of sync (i.e. because a message is lost), the user might be in a broken state until they can do something that forces reprocessing.