
Wrong output in phi-2 on Vulkan #5203

Closed
@alex4o

Description


Please include information about your system, the steps to reproduce the bug, and the version of llama.cpp that you are using. If possible, please provide a minimal code example that reproduces the bug.

  1. I am running with the Vulkan backend on an RX 580 with 4 GB of VRAM; I was curious whether this can even work.
  2. I am using llama-cpp-python for the chat templates.
  3. I wasn't expecting the output to be identical on GPU and CPU with the same seed, but I was expecting it to be similar for phi-2, as it is for Mistral (see the sketch below the notebook link).
  4. The llama.cpp shared library in llama-cpp-python was built from commit 2aed77e.

Here is the IPython notebook that I used for reference:
https://gist.github.com/alex4o/efac83a009eb42d32d8ec10e68811ab2
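
For reference, here is a minimal sketch of the CPU-vs-GPU comparison; the model path and sampling parameters are illustrative placeholders, not the exact values from the notebook:

```python
from llama_cpp import Llama

# Hypothetical local path to a phi-2 GGUF file; adjust for your setup.
MODEL_PATH = "./phi-2.Q4_K_M.gguf"
PROMPT = "Instruct: Write a short poem about GPUs.\nOutput:"

def generate(n_gpu_layers: int) -> str:
    # n_gpu_layers=0 keeps all layers on the CPU; -1 offloads
    # every layer to the GPU backend (Vulkan in this build).
    llm = Llama(model_path=MODEL_PATH, n_gpu_layers=n_gpu_layers, seed=42)
    out = llm(PROMPT, max_tokens=64, temperature=0.0)
    return out["choices"][0]["text"]

cpu_text = generate(n_gpu_layers=0)
gpu_text = generate(n_gpu_layers=-1)
print("CPU:", cpu_text)
print("GPU:", gpu_text)
```

With temperature 0 and a fixed seed, the two outputs should at least be similar; on phi-2 the Vulkan output diverges into wrong text, while the same comparison with a Mistral model stays close.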
