Enable traced model for text-generation task#22265
Conversation
The documentation is not available anymore as the PR was closed or merged.
gante
left a comment
Thank you for reworking the example @jiqing-feng 🤗
As mentioned in the other PR, I'm going to keep an eye on demand!
amyeroberts
left a comment
Thanks for this contribution! 🔥
Have you run the script with and without the --jit flag to confirm it runs as expected?
```python
traced_model = torch.jit.trace(model, jit_inputs, strict=False)
traced_model = torch.jit.freeze(traced_model.eval())
traced_model(*jit_inputs)
traced_model(*jit_inputs)
```
Why is this line run twice? I'm not super familiar with torch.jit so apologies if I'm missing something.
> Thanks for this contribution! 🔥
> Have you run the script with and without the `--jit` flag to confirm it runs as expected?
Yes, the example runs as usual without --jit and it also runs as expected with --jit.
> Why is this line run twice? I'm not super familiar with `torch.jit` so apologies if I'm missing something.
I have tested it on an A100 and found that the first two forward passes are very slow: the first forward inserts profiling nodes, and the second performs operator fusion. Refer to the `torch.jit` documentation.
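To illustrate the warm-up behavior described above, here is a minimal, self-contained sketch (assuming PyTorch is installed; the toy module is hypothetical and stands in for the text-generation model): the traced and frozen module is called twice before any timed inference so that the profiling and fusion passes have already run.

```python
import torch

# Hypothetical stand-in for the real model; used only to show the pattern.
class Toy(torch.nn.Module):
    def forward(self, x):
        return torch.nn.functional.relu(x) * 2

model = Toy().eval()
example = torch.randn(2, 4)

# Trace and freeze, mirroring the snippet from the diff above.
traced = torch.jit.trace(model, (example,), strict=False)
traced = torch.jit.freeze(traced.eval())

# Warm-up: the first call inserts profiling nodes,
# the second triggers fusion of the profiled graph.
for _ in range(2):
    traced(example)

# Subsequent calls run on the optimized graph.
out = traced(example)
```

Benchmarks should therefore time only calls made after these two warm-up forwards.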
@gante Thanks for your attention. Could you please help merge it? Thanks! I think the demand for
@gante Hi, Gante.
Refer to: #22072
Thanks for your advice. This PR only changes the example; could you please review it? Thanks!