Improved cache control, fewer memory leaks#689

Merged
cdata merged 5 commits into master from improved-cache-control on Aug 12, 2019

Conversation

Contributor

@cdata cdata commented Aug 3, 2019

This change proposes a first step towards better cache management and introduces limited cache control as a public API. Highlights include:

  • CacheEvictionPolicy has been introduced to offer a consistent model for purging items from caches
  • A few memory leaks have been fixed:
    • We were leaking all <model-viewer> elements via our integration with the :focus-visible polyfill
    • We were leaking vertex buffers in a few places by not disposing of them appropriately
  • CachingGLTFLoader has been integrated with CacheEvictionPolicy so that we no longer leak all of our models by design
  • A minor bug in cloneGltf has been fixed that was leading us to incorrectly clone materials
  • A public, static, writable property modelCacheSize has been added to enable imperative user control of GLTF caching behavior
    • The value is global to all <model-viewer> instances and sets the "eviction threshold" for CachingGLTFLoader
    • It can be used this way: customElements.get('model-viewer').modelCacheSize = 10;
  • The Android test target has been removed, due to an undiagnosed critical failure that causes it to crash only on Sauce Labs
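The retain/release and eviction-threshold behavior described above could be sketched roughly like this (a hypothetical, simplified LRU policy in TypeScript; the class shape and member names are illustrative assumptions, not the actual CacheEvictionPolicy implementation):

```typescript
// Hypothetical sketch of an LRU-style eviction policy. Items are kept
// while retained; unretained items beyond the eviction threshold are
// purged in least-recently-used order.
class EvictionPolicy<T> {
  private retainerCounts = new Map<string, number>();
  private recentlyUsed: string[] = [];
  private cache = new Map<string, T>();

  constructor(public evictionThreshold: number) {}

  set(key: string, value: T) {
    this.cache.set(key, value);
    this.touch(key);
  }

  retain(key: string) {
    this.retainerCounts.set(key, (this.retainerCounts.get(key) ?? 0) + 1);
    this.touch(key);
  }

  release(key: string) {
    const count = (this.retainerCounts.get(key) ?? 0) - 1;
    this.retainerCounts.set(key, Math.max(count, 0));
    this.evict();
  }

  has(key: string): boolean {
    return this.cache.has(key);
  }

  private touch(key: string) {
    // Move the key to the front of the recently-used list:
    const index = this.recentlyUsed.indexOf(key);
    if (index !== -1) {
      this.recentlyUsed.splice(index, 1);
    }
    this.recentlyUsed.unshift(key);
  }

  private evict() {
    // Walk from the least-recently-used end; drop unretained entries
    // that sit beyond the eviction threshold:
    for (let i = this.recentlyUsed.length - 1; i >= this.evictionThreshold;
         --i) {
      const key = this.recentlyUsed[i];
      if ((this.retainerCounts.get(key) ?? 0) === 0) {
        this.cache.delete(key);
        this.retainerCounts.delete(key);
        this.recentlyUsed.splice(i, 1);
      }
    }
  }
}
```

Under this model, setting modelCacheSize would simply adjust evictionThreshold, and models still referenced by a live <model-viewer> would never be evicted regardless of the threshold.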

Future work:

Fixes #491

@cdata cdata force-pushed the improved-cache-control branch 3 times, most recently from 1c22f12 to ba83a5b on August 3, 2019 04:44
@cdata cdata requested a review from elalish August 5, 2019 16:54
Contributor

@elalish elalish left a comment


Thanks for doing this! I have a few questions below...

index.html Outdated
<p>This static, writable property sets &lt;model-viewer&gt;'s internal
cache size for GLTF models. Note that any number of models may be cached
at any given time. This number just sets the maximum number of models
that can be cached even if they are not being used by a &lt;model-viewer&gt;
Contributor


Reading this, I'm uncertain whether this number has units of MB or a count of glTFs.

Contributor Author


👍 will clarify

suiteTeardown(() => {
// Reset the renderer once at the end of this spec, to clear out all
// of the heavy cached image buffers:
ModelViewerElementBase[$resetRenderer]();
Contributor


Do we really want to destroy the gl context here? Would it be adequate to dispose of and recreate TextureUtils?

Contributor Author


As we have discussed before, the GL context isn't especially important. What this does do is clear out a few internal Three.js caches. It's possible that we could dispose of the artifacts held in these caches more selectively, but I'm not sure how to go about doing that comprehensively.

This is only ever done once in the entire test suite (after environment-spec.ts runs to completion), so it shouldn't be a huge cost 🤷‍♂️

suiteTeardown(() => {
// Ensure we free up memory from loading large environment maps:
Cache.clear();
renderer.dispose();
Contributor


What does disposing of the renderer do that Cache.clear() doesn't do already? Every time we recreate a renderer we get a lot of console spew that makes reading the test output harder.

Contributor Author


See my comment above; I linked to the relevant lines. It's a hatchet, not a scalpel, for sure, but profiles bear out that a fair amount of memory (megabytes' worth) is being held by references in the renderer's internal cache.
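A more surgical alternative to resetting the renderer might look like the sketch below: walk the scene graph and dispose of GPU-backed resources directly. The Disposable/NodeLike interfaces are minimal stand-ins for illustration; in real three.js code these would be Object3D, Mesh, and Material, whose dispose() methods release GPU memory.

```typescript
// Minimal structural stand-ins for three.js objects (assumption: real
// code would use three's Object3D / Mesh / Material types).
interface Disposable {
  dispose(): void;
}

interface NodeLike {
  isMesh?: boolean;
  geometry?: Disposable;
  material?: Disposable|Disposable[];
  children: NodeLike[];
}

// Recursively dispose of geometries and materials in a subtree,
// returning how many resources were released:
function disposeTree(node: NodeLike): number {
  let disposed = 0;
  if (node.isMesh) {
    if (node.geometry != null) {
      node.geometry.dispose();
      disposed++;
    }
    const materials = Array.isArray(node.material) ?
        node.material :
        (node.material != null ? [node.material] : []);
    for (const material of materials) {
      material.dispose();
      disposed++;
    }
  }
  for (const child of node.children) {
    disposed += disposeTree(child);
  }
  return disposed;
}
```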

this[$evictionPolicy].retain(url);

Object.defineProperty(model, $releaseFromCache, {
value: (() => {
Contributor


This doesn't seem like our usual style; what's the purpose of Object.defineProperty here?

Contributor Author


It's the only way I could get tsc to stop complaining about me decorating this object.

It might be possible to change the type of the identifier and get a cleaner result. I'll try that out.
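For context, the workaround being discussed might be sketched like this (the interface and names below are illustrative assumptions, not the actual code): directly assigning a new symbol-keyed property to a typed object is a compile-time error, but Object.defineProperty sidesteps the static check.

```typescript
// Hypothetical illustration of why Object.defineProperty appeases tsc.
const $releaseFromCache = Symbol('releaseFromCache');

interface Model {
  url: string;
}

function decorateModel(model: Model, release: () => void): Model {
  // model[$releaseFromCache] = release;
  //   ^ tsc error: property does not exist on type 'Model'
  Object.defineProperty(model, $releaseFromCache, {value: release});
  return model;
}
```

As the reply notes, widening the declared type (e.g., with an intersection type that includes the symbol key) would let plain assignment type-check and read more conventionally.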

const cloneSkinnedMeshes: SkinnedMeshMap = {};

if (hasScene && hasSkinnedMeshes) {
if (hasScene) {
Contributor


Can we combine this with the same if statement above?

Contributor Author


Yeah, good call, I think we can merge the conditions.

environmentMap.dispose();
}
} catch (e) {
// Suppress errors
Contributor


What errors are we getting?

Contributor Author


If a promise is rejected (for example, say a file fails to load), the promise is rejected for all time. If you later await that promise, it will throw an error.
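A minimal illustration of that behavior: a promise settles exactly once, so every later await of the same rejected promise throws again. A cache that hands out stored promises therefore has to handle (or suppress) the repeated rejection.

```typescript
// A promise that rejects stays rejected; each await observes the same
// rejection again.
const failedLoad: Promise<string> =
    Promise.reject(new Error('file failed to load'));
// Handle it once so the runtime doesn't report an unhandled rejection:
failedLoad.catch(() => {});

async function awaitTwice(): Promise<number> {
  let throws = 0;
  for (let i = 0; i < 2; ++i) {
    try {
      await failedLoad;
    } catch (e) {
      throws++;  // Both awaits land here.
    }
  }
  return throws;
}
```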

src/utilities.ts Outdated
export type Constructor<T = object> = {
new (...args: any[]): T,
prototype: T
prototype: T,
Contributor


intentional?

Contributor Author


Will remove


if (retainerCount === 0) {
this[$cache].delete(key);
this[$recentlyUsed].splice(i, 1);
Contributor


I don't quite follow this splice. Why are we putting a new entry into $recentlyUsed when we just deleted that entry from the cache?

Contributor Author


The signature for JavaScript splice is:

var arrDeletedItems = array.splice(start[, deleteCount[, item1[, item2[, ...]]]])

Added items start at the third argument. So, we are only removing things here, not adding anything.
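Concretely, a two-argument splice call only deletes:

```typescript
// splice(start, deleteCount, ...items): items to insert begin at the
// third argument, so a two-argument call removes without adding.
const recentlyUsed = ['c', 'b', 'a'];
const removed = recentlyUsed.splice(1, 1);  // delete one element at index 1
// recentlyUsed is now ['c', 'a']; removed is ['b']
```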

Contributor


Ah, that's what I missed when I read that definition; I thought these splices were adding elements, but now I see they're removing them. I thought elements weren't being removed at all, hence my other comment. Thanks!

this[$recentlyUsed].splice(recentlyUsedIndex, 1);
}

this[$recentlyUsed].unshift(key);
Contributor


These two lines seem to be vaguely duplicate? Also, does the $recentlyUsed array grow without bound? Could we just make $recentlyUsed the same length as $retainerCount and just swap elements to the beginning? If only JS had a priority queue...

Contributor Author


The JavaScript Array is unbounded and can grow indefinitely, runtime permitting; one cannot limit the size of an Array. A JavaScript Array is more like a map than an actual array, where the indices are the keys. You can overload its push / pop etc. methods, but you cannot overload the index operator, which allows inserting at arbitrary indices (e.g., this[$recentlyUsed][49283] = true will totally do a thing here, and grow the reported length accordingly).

Since [$retainerCount] is a Map, it is more expensive to count the keys than to just do the bookkeeping directly as we retain new keys. So, surgically splicing elements out of the Array is probably the best we can do.
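The map-like behavior described above is easy to demonstrate: assigning to an arbitrary index silently grows the reported length, leaving holes rather than contiguous storage.

```typescript
// Arrays behave like maps keyed by index; out-of-range assignment
// grows length and leaves empty slots in between.
const sparse: boolean[] = [];
sparse[4] = true;
const reportedLength = sparse.length;                     // 5
const populatedCount = sparse.filter(() => true).length;  // 1: holes skipped
```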

Contributor


Got it, see above.

disconnectedCallback() {
super.disconnectedCallback && super.disconnectedCallback();
// It's important to remove the polyfill event listener when we
// disconnect, otherwise we will leak the whole element via window:
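The leak-avoidance pattern in the snippet above can be sketched with a plain EventTarget standing in for window and a plain class for the custom element (illustrative only; not the actual <model-viewer> code):

```typescript
// Stand-in for window as a long-lived global event target:
const fakeWindow = new EventTarget();

class PolyfillClient {
  focusCount = 0;

  // Keep a stable reference to the bound handler so the exact same
  // function can be passed to removeEventListener later:
  private readonly onFocus = () => {
    this.focusCount++;
  };

  connect() {
    fakeWindow.addEventListener('focus', this.onFocus);
  }

  disconnect() {
    // Without this, fakeWindow retains the handler, and the handler's
    // closure retains the instance -- leaking the whole object:
    fakeWindow.removeEventListener('focus', this.onFocus);
  }
}
```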
Contributor


Interesting!

@cdata cdata force-pushed the improved-cache-control branch from 95edcdd to b52e78b on August 5, 2019 19:15
@cdata cdata marked this pull request as ready for review August 5, 2019 20:52
@cdata cdata requested a review from elalish August 5, 2019 20:54
elalish previously approved these changes Aug 5, 2019
Contributor

@elalish elalish left a comment


LGTM

@cdata cdata changed the base branch from newPMREM to master August 12, 2019 16:44
@cdata cdata force-pushed the improved-cache-control branch from 99f520f to cd371ad on August 12, 2019 17:06
Contributor Author

cdata commented Aug 12, 2019

This change has been rebased onto master.

@cdata cdata merged commit c5df528 into master Aug 12, 2019
@cdata cdata deleted the improved-cache-control branch August 12, 2019 17:25
@cdata cdata added this to the v0.6.0 milestone Aug 13, 2019
@cdata cdata self-assigned this Aug 26, 2019

Successfully merging this pull request may close these issues.

Enable better cache control and/or cache purging
