Description
Please include information about your system, the steps to reproduce the bug, and the version of llama.cpp that you are using. If possible, please provide a minimal code example that reproduces the bug.
- I am running with the Vulkan backend on an RX 580 with 4 GB of VRAM; I was curious whether this can even work.
- I am using llama-cpp-python for the chat templates.
- I wasn't expecting the text to be identical on GPU and CPU with the same seed, but I was expecting the output to be similar for phi-2, as it is for Mistral (see the comparison sketch after the notebook link below).
- The llama.cpp shared library in llama-cpp-python was built from 2aed77e.
Here is the IPython notebook that I used for reference:
https://gist.github.com/alex4o/efac83a009eb42d32d8ec10e68811ab2
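
In case it helps, here is a minimal sketch of the kind of comparison I am describing. The model path, prompt, and sampling settings are placeholders rather than the exact values from the notebook:

```python
from llama_cpp import Llama

PROMPT = [{"role": "user", "content": "Write a short poem about GPUs."}]

def generate(n_gpu_layers: int) -> str:
    # n_gpu_layers=0 keeps all layers on the CPU; -1 offloads everything
    # to the Vulkan device (VRAM permitting).
    llm = Llama(
        model_path="phi-2.Q4_K_M.gguf",  # placeholder model path
        n_gpu_layers=n_gpu_layers,
        seed=42,            # fixed seed so the two runs are comparable
        verbose=False,
    )
    out = llm.create_chat_completion(
        messages=PROMPT,
        max_tokens=128,
        temperature=0.0,    # greedy sampling to minimize nondeterminism
    )
    return out["choices"][0]["message"]["content"]

cpu_text = generate(0)   # CPU only
gpu_text = generate(-1)  # all layers offloaded via Vulkan
print("CPU:", cpu_text)
print("GPU:", gpu_text)
print("Outputs match:", cpu_text == gpu_text)
```

With temperature 0 this is as deterministic as the backends allow, so any remaining divergence should come from numerical differences between the CPU and Vulkan kernels rather than from sampling.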