Closed
Description
When running Ollama from the command prompt, you can pass the --verbose flag to get timings that are output like this:
```
$ ollama run --verbose llama2
>>> Hi
Hello! It's nice to meet you. Is there something I can help you with or would you like to chat?

total duration:       1.279118717s
load duration:        891.933µs
prompt eval count:    21 token(s)
prompt eval duration: 231.416ms
prompt eval rate:     90.75 tokens/s
eval count:           25 token(s)
eval duration:        1.042407s
eval rate:            23.98 tokens/s
```
Is there a way to get this to appear in the web UI?
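For context, the raw timing fields behind this CLI output are also included in the final JSON object returned by Ollama's /api/generate endpoint (total_duration, load_duration, prompt_eval_count, prompt_eval_duration, eval_count, eval_duration, with durations in nanoseconds), so a web UI could derive and display the same stats. A minimal sketch of that computation; the sample values below simply mirror the CLI output above rather than coming from a real API call:

```python
def format_timings(resp: dict) -> dict:
    """Derive --verbose-style stats from an /api/generate response.

    Durations in the response are in nanoseconds; rates are tokens/s.
    """
    ns = 1e9
    return {
        "total duration (s)": resp["total_duration"] / ns,
        "load duration (s)": resp["load_duration"] / ns,
        "prompt eval count": resp["prompt_eval_count"],
        "prompt eval rate (tokens/s)":
            resp["prompt_eval_count"] / (resp["prompt_eval_duration"] / ns),
        "eval count": resp["eval_count"],
        "eval rate (tokens/s)":
            resp["eval_count"] / (resp["eval_duration"] / ns),
    }

# Illustrative values mirroring the CLI output above (not real API data)
sample = {
    "total_duration": 1_279_118_717,
    "load_duration": 891_933,
    "prompt_eval_count": 21,
    "prompt_eval_duration": 231_416_000,
    "eval_count": 25,
    "eval_duration": 1_042_407_000,
}
stats = format_timings(sample)
print(f"eval rate: {stats['eval rate (tokens/s)']:.2f} tokens/s")
# → eval rate: 23.98 tokens/s
```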