Conversation

@sfc-gh-jkinkead
Contributor

@sfc-gh-jkinkead sfc-gh-jkinkead commented Dec 22, 2025

Describe your changes

Replace the TTLCache in ResourceCache with a TTLCleanupCache.

Add on_release to the st.cache_resource API, and plumb it through to the TTLCleanupCache.

Update st.cache_resource.clear_all to clear the cache directly instead of just GCing it to ensure release functions are called.

Implements feature request in #8674.
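
A minimal usage sketch of the new parameter (the SQLite connection factory and the assumption that the callback receives the cached resource are illustrative, not taken from the PR):

```python
import sqlite3

import streamlit as st


def _close_connection(conn: sqlite3.Connection) -> None:
    # Assumed callback shape: invoked with the resource when it is evicted
    # (TTL or max_entries) or when the cache is cleared.
    conn.close()


@st.cache_resource(ttl="1h", on_release=_close_connection)
def get_connection() -> sqlite3.Connection:
    return sqlite3.connect("example.db")
```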

GitHub Issue Link (if applicable)

Testing Plan

  • Unit Tests (JS and/or Python)

Implemented.


Contribution License Agreement

By submitting this pull request you agree that all contributions to this project are made under the Apache 2.0 license.

Copilot AI review requested due to automatic review settings December 22, 2025 22:00
@snyk-io
Contributor

snyk-io bot commented Dec 22, 2025

Snyk checks have passed. No issues have been found so far.

| Status | Scanner | Critical | High | Medium | Low | Total (0) |
| --- | --- | --- | --- | --- | --- | --- |
|  | Open Source Security | 0 | 0 | 0 | 0 | 0 issues |
|  | Licenses | 0 | 0 | 0 | 0 | 0 issues |

@github-actions
Contributor

github-actions bot commented Dec 22, 2025

✅ PR preview is ready!

| Name | Link |
| --- | --- |
| 📦 Wheel file | https://core-previews.s3-us-west-2.amazonaws.com/pr-13439/streamlit-1.52.2-py3-none-any.whl |
| 📦 @streamlit/component-v2-lib | Download from artifacts |
| 🕹️ Preview app | pr-13439.streamlit.app (☁️ Deploy here if not accessible) |

Contributor

Copilot AI left a comment

Pull request overview

This PR adds an on_release callback parameter to st.cache_resource, allowing users to specify cleanup functions that are automatically called when cached resources are removed from the cache. This implements feature request #8674.

Key Changes

  • Added on_release parameter to the st.cache_resource API with comprehensive documentation
  • Replaced TTLCache with TTLCleanupCache in ResourceCache to support release callbacks
  • Updated ResourceCaches.clear_all() to explicitly call clear() on each cache before removing them, ensuring release functions are invoked (see the sketch below)
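
A minimal sketch of what that clear_all() shape looks like, with the surrounding class structure assumed for illustration (the PR's actual code may differ):

```python
class ResourceCaches:
    """Sketch only: holds one cache per @st.cache_resource-decorated function."""

    def __init__(self) -> None:
        self._function_caches: dict[str, object] = {}  # function key -> cache

    def clear_all(self) -> None:
        # Explicitly clear each cache, rather than just dropping references
        # and relying on garbage collection, so that every entry's on_release
        # callback runs before the caches are discarded.
        for cache in self._function_caches.values():
            cache.clear()
        self._function_caches = {}
```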

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 4 comments.

| File | Description |
| --- | --- |
| lib/streamlit/runtime/caching/cache_resource_api.py | Added the on_release parameter throughout the caching layers, updated imports to use TTLCleanupCache, modified clear_all() to trigger callbacks, and added comprehensive documentation |
| lib/tests/streamlit/runtime/caching/cache_resource_api_test.py | Added test_on_release_fires() to verify callbacks are invoked on cache eviction and explicit clearing |

@github-actions
Contributor

github-actions bot commented Dec 22, 2025

📉 Frontend coverage change detected

The frontend unit test (vitest) coverage has decreased by 0.0000%

  • Current PR: 86.4800% (12747 lines, 1723 missed)
  • Latest develop: 86.4800% (12747 lines, 1723 missed)

✅ Coverage change is within normal range.

📊 View detailed coverage comparison

@lukasmasuch lukasmasuch added the security-assessment-completed (Security assessment has been completed for PR), change:feature (PR contains new feature or enhancement implementation), and impact:users (PR changes affect end users) labels Dec 23, 2025
Collaborator

@lukasmasuch lukasmasuch left a comment

LGTM 👍

Comment on lines 144 to 145
for cache in self._function_caches.values():
    cache.clear()
Collaborator

nitpick: Maybe add a brief comment here that calling clear explicitly is required to trigger the on-release hooks.

Contributor Author

Done.

This sent me down a rabbit hole, as I realized I hadn't checked the behavior when multiple items are being cleared and one throws an exception. This is now handled correctly and has test coverage.
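
For illustration, one way to handle that case (a sketch under assumed names, not the PR's actual code): keep releasing the remaining entries even if one callback raises, then re-raise the first error afterwards.

```python
from collections.abc import Callable, Iterable
from typing import Any


def release_all(resources: Iterable[Any], on_release: Callable[[Any], None]) -> None:
    """Invoke on_release for every resource; defer the first exception."""
    first_error: Exception | None = None
    for resource in resources:
        try:
            on_release(resource)
        except Exception as exc:  # keep going so later resources are still released
            if first_error is None:
                first_error = exc
    if first_error is not None:
        raise first_error
```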

validate: ValidateFunc | None,
hash_funcs: HashFuncsDict | None = None,
show_time: bool = False,
on_release: OnRelease = _no_op_release,
Collaborator

nitpick: it might be slightly more consistent if we support on_release: OnRelease | None = None here, pass the None value down from the decorator, and use on_release or _no_op_release in the initialization.
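
A small sketch of the suggested pattern (the OnRelease alias and _no_op_release mirror the snippet above; the holder class is hypothetical):

```python
from collections.abc import Callable
from typing import Any

OnRelease = Callable[[Any], None]


def _no_op_release(_resource: Any) -> None:
    pass


class CachedResourceFunc:  # hypothetical, for illustration only
    def __init__(self, on_release: OnRelease | None = None) -> None:
        # Accept None at the API boundary and normalize it once here.
        self._on_release = on_release or _no_op_release
```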

Contributor Author

Sure, Done.

Comment on lines 387 to 389
This is not used as a part of the cache key - meaning changes to this
function between script runs will not trigger a new resource being
generated.
Collaborator

suggestion: This note might be a bit misleading/inconsistent since, afaik, none of the cache configuration parameters are part of the cache key or trigger a cache regeneration. Maybe we can simplify this to a top-level note that changing a cache configuration parameter will not invalidate the existing cached entries.

Contributor Author

max_entries and ttl are both part of the (function) cache key, and changes to these will invalidate cache entries.

While validate is in that if block, the invoked helper only checks whether the value has changed between None and non-None.

I'm happy to remove this, if you think it's just adding confusion.

Collaborator

@lukasmasuch lukasmasuch Dec 23, 2025

Oh, I didn't know that, but that makes sense. I was looking at the cache key generated here, which doesn't seem to be impacted:

def _make_function_key(cache_type: CacheType, func: Callable[..., Any]) -> str:

and the value key:

I think it's worth documenting, but it may be better to mention in the ttl and max_entries docstrings that changing them invalidates the existing cache entries.

Contributor Author

Updated! Also noted that Pandas (and therefore this helper) treats unitless string TTLs as nanoseconds, which has caused confusion in the past.
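
For context, a quick illustration of why bare TTL values are tricky: Pandas defaults bare numbers to nanoseconds, so an explicit unit is much safer (the exact behavior for unitless strings may vary by Pandas version):

```python
import pandas as pd

# A bare number is interpreted as nanoseconds by default, which is almost
# never the TTL a user intends.
print(pd.Timedelta(3600))     # 0 days 00:00:00.000003600 (3600 nanoseconds)

# An explicit unit gives the expected result.
print(pd.Timedelta("3600s"))  # 0 days 01:00:00
```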

@streamlit streamlit deleted a comment from cursor bot Dec 23, 2025
Contributor Author

@sfc-gh-jkinkead sfc-gh-jkinkead left a comment

Thanks for the quick review!

@sfc-gh-jkinkead
Contributor Author

@lukasmasuch - I'll likely merge by EOD. Feel free to add more comments if you have them, and I'll address them in a follow-up!

sfc-gh-jkinkead and others added 4 commits December 23, 2025 13:41
Fix typo.

Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
Add note about default Pandas units on ttl, as this can be tricky.
@sfc-gh-jkinkead sfc-gh-jkinkead merged commit 59d33fe into develop Dec 24, 2025
42 checks passed
@sfc-gh-jkinkead sfc-gh-jkinkead deleted the jkinkead-on-release-resources branch December 24, 2025 04:46

Development

Successfully merging this pull request may close these issues.

Be able to close cached resources (@st.cache_resource)
