Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
ArthurZucker
left a comment
Thanks, closed #1662!
@Narsil @ArthurZucker Excuse me, I am facing a memory leak problem that may be caused by this. Could you help me disable the cache in this version?
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(_path)
tokenizer._tokenizer.model.cache_capacity(0)
```

raises

```
AttributeError: 'tokenizers.models.BPE' object has no attribute 'cache_capacity'
```
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(_path)
tokenizer._tokenizer.model._clear_cache()
```

or

```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(_path)
tokenizer._tokenizer.model._resize_cache(0)
```
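A defensive wrapper can make this less brittle. This is a minimal sketch, assuming a tokenizers version that exposes the private `_resize_cache` / `_clear_cache` helpers named in this thread (they are underscore-prefixed, so they may change between releases); the helper name `disable_bpe_cache` is hypothetical, not part of either library.

```python
def disable_bpe_cache(tokenizer):
    """Best-effort: shrink or empty the backend BPE merge cache.

    `tokenizer` is expected to be a fast tokenizer whose `._tokenizer.model`
    is the Rust-backed model (e.g. tokenizers.models.BPE). Returns True if a
    cache helper was found and called, False otherwise.
    """
    model = tokenizer._tokenizer.model
    if hasattr(model, "_resize_cache"):
        model._resize_cache(0)  # capacity 0: nothing gets cached afterwards
        return True
    if hasattr(model, "_clear_cache"):
        model._clear_cache()  # fallback: empty the cache once, capacity unchanged
        return True
    return False  # helpers absent in this tokenizers version
```

Usage would be `disable_bpe_cache(AutoTokenizer.from_pretrained(_path))`; the `hasattr` guards keep the call from raising `AttributeError` on versions that predate these helpers, as seen above with `cache_capacity`.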
You can find the functions using
Do you want to open a PR to add some docs?
No, I just want to disable the cache, but I can only do it via the backend_tokenizer.
That will disable it.
The tokenizers generated by transformers seem to have it.
No, the |
Fixes #1539