Conversation
ArthurZucker
left a comment
Sounds good. I am a bit curious about the rationale behind not using / creating a Rust DecodeStream.
///
#[pyclass(module = "tokenizers.decoders", name = "DecodeStream")]
#[derive(Clone)]
pub struct PyDecodeStream {
Interesting, so we don't use the Rust DecodeStream object. I am guessing it's for ownership reasons? Otherwise we would need to wrap it as PyDecodeStream { stream: DecodeStream } with an Arc?
Arc doesn't save us. The issue is the borrow of the tokenizer's lifetime.
You can technically reborrow on every call in Rust too, it's just not very "rusty".
For Python, we need access to the tokenizer on every call, and cloning it into an Arc feels super wasteful (and would break every update you might make on the tokenizer afterwards).
This seems innocuous enough, since users currently have to hold a reference to the tokenizer anyway.
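A minimal Rust sketch of the design being described (toy types, not the actual `tokenizers` crate): the stream owns only its decoding state, and the tokenizer is re-borrowed on every call instead of being stored behind a lifetime (which could not cross the Python boundary) or an Arc clone.

```rust
// Toy stand-ins for illustration; the real types live in the
// `tokenizers` crate and its pyo3 bindings.
struct Tokenizer {
    vocab: Vec<&'static str>,
}

impl Tokenizer {
    fn decode(&self, ids: &[u32]) -> String {
        ids.iter().map(|&i| self.vocab[i as usize]).collect()
    }
}

// State-only stream: no `&'a Tokenizer` field, hence no lifetime
// parameter, hence nothing that ties it to a borrow of the tokenizer.
struct DecodeStream {
    ids: Vec<u32>,
    prefix_len: usize, // bytes of `decode(ids)` already emitted
}

impl DecodeStream {
    fn new() -> Self {
        DecodeStream { ids: Vec::new(), prefix_len: 0 }
    }

    // The tokenizer is passed in (re-borrowed) on every call, so any
    // update made to it between calls is immediately visible.
    fn step(&mut self, tokenizer: &Tokenizer, id: u32) -> Option<String> {
        self.ids.push(id);
        let full = tokenizer.decode(&self.ids);
        let new_text = full[self.prefix_len..].to_string();
        if new_text.is_empty() {
            None
        } else {
            self.prefix_len = full.len();
            Some(new_text)
        }
    }
}

fn main() {
    let tokenizer = Tokenizer { vocab: vec!["Hello", ",", " world"] };
    let mut stream = DecodeStream::new();
    assert_eq!(stream.step(&tokenizer, 0), Some("Hello".to_string()));
    assert_eq!(stream.step(&tokenizer, 1), Some(",".to_string()));
    assert_eq!(stream.step(&tokenizer, 2), Some(" world".to_string()));
    println!("streamed decode ok");
}
```

This is the same trade-off discussed above: slightly chattier call sites, but no stored borrow and no Arc clone that would freeze a snapshot of the tokenizer.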
/// Internal function exposed only to bypass python limitations
what were the limitations?
Lifetimes cannot cross the boundary.
The API is different because Python cannot handle lifetimes properly.
#[pymethods]
impl PyDecodeStream {
    #[new]
    #[pyo3(signature = (skip_special_tokens), text_signature = "(self, skip_special_tokens)")]
Thanks for this API! I'm working on integrating it into vLLM.
QQ - this API is clear for step(). However, would it be possible to pass the prefill tokens to new()?
IIUC, the current API requires me to call step() N times for N prefill tokens before I get into the decode phase. Is that right?
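As a rough illustration of that flow (a pure-Python toy, not the real tokenizers API): with a constructor that takes no prefill tokens, the N prompt tokens each cost one step() call before the first generated token is decoded.

```python
def decode(ids):
    # Toy decoder standing in for a real tokenizer's decode().
    vocab = ["The", " quick", " brown", " fox", " jumps"]
    return "".join(vocab[i] for i in ids)

class ToyDecodeStream:
    """Illustration-only stand-in for a streaming decoder."""
    def __init__(self):
        self.ids = []
        self.prefix_len = 0  # characters already emitted

    def step(self, token_id):
        # Re-decode accumulated ids and emit only the new suffix.
        self.ids.append(token_id)
        full = decode(self.ids)
        new_text = full[self.prefix_len:]
        self.prefix_len = len(full)
        return new_text or None

prefill_ids = [0, 1, 2, 3]   # N prompt tokens
stream = ToyDecodeStream()
for tid in prefill_ids:      # N step() calls just to catch up
    stream.step(tid)
print(stream.step(4))        # first decode-phase token -> " jumps"
```

A prefill-aware constructor would let the stream seed its internal ids in one shot instead of looping, which is what the question above is asking for.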
Yep, we'll add it if it's not already possible!
Follow up of #1677