
fix memory leak in cache - add settings.CACHE_SIZE_LIMIT #1140

Merged
serhii73 merged 1 commit into scrapinghub:master from chebotarevmichael:master
Mar 15, 2023

Conversation

@chebotarevmichael
Contributor

PROBLEM: memory leak.

import dateparser
from datetime import datetime

# ~1.5 GB of memory remains leaked after the function finishes
def hard_leak():
    for i in range(3000):
        # each call leaks ~0.55 MB
        dateparser.parse('dasdasd', settings={'RELATIVE_BASE': datetime.utcnow()})


# ~27 MB of memory remains leaked after the function finishes
def light_leak():
    for i in range(3000):
        # each call leaks ~0.01 MB
        dateparser.parse('12.01.2021', settings={'RELATIVE_BASE': datetime.utcnow()})

After each call to dateparser.parse, a new item is added to each of these cache dictionaries:

    _split_regex_cache = {}
    _sorted_words_cache = {}
    _split_relative_regex_cache = {}
    _sorted_relative_strings_cache = {}
    _match_relative_regex_cache = {}

After 3000 calls there are 3000 items in each of these dictionaries, and a significant amount of memory has been lost. This forced us to stop using the module.

SOLUTION: add a limit (settings.CACHE_SIZE_LIMIT) on the maximum number of items held in the caches.
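
For illustration, here is a minimal sketch of the approach, using a hypothetical helper name and a standalone cache dictionary (this is not the exact code merged in this PR): before inserting a new entry, the cache is flushed once it reaches the limit.

# Hypothetical default; in the library the limit comes from settings.CACHE_SIZE_LIMIT.
CACHE_SIZE_LIMIT = 1000

_split_regex_cache = {}

def _get_split_regex(key, build_regex):
    if key not in _split_regex_cache:
        # Flush the whole cache once it reaches the limit so it can
        # never grow without bound across many distinct keys.
        if len(_split_regex_cache) >= CACHE_SIZE_LIMIT:
            _split_regex_cache.clear()
        _split_regex_cache[key] = build_regex(key)
    return _split_regex_cache[key]

With a cap in place, memory use is bounded to at most CACHE_SIZE_LIMIT entries per dictionary, at the price of occasionally rebuilding entries after a flush.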

Contributor

@Gallaecio Gallaecio left a comment


I wonder if something like cachetools.LRUCache would be a better choice here, but this seems like an improvement nonetheless.
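
For comparison, a minimal sketch of that suggestion using cachetools (an extra dependency, and not what this PR merged): an LRUCache evicts only the least-recently-used entry once maxsize is reached, instead of flushing the whole cache at once.

from cachetools import LRUCache

# Bounded cache: once maxsize entries are stored, inserting a new key
# evicts the least-recently-used one rather than clearing everything.
_split_regex_cache = LRUCache(maxsize=1000)

def _get_split_regex(key, build_regex):
    if key not in _split_regex_cache:
        _split_regex_cache[key] = build_regex(key)
    return _split_regex_cache[key]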

serhii73 merged commit a11d128 into scrapinghub:master on Mar 15, 2023
