Ollama

#6 · opened by dangrsosig

None of the GGUFs work in Ollama. You can pull them, but trying to run them results in this error message:

Error: 500 Internal Server Error: unable to load model

Ollama likely needs modifications: it may be bundling an outdated version of llama.cpp. Replacing that with the latest version and recompiling should fix it.
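
For anyone who wants to try that instead of waiting for a release, here is a rough sketch of building Ollama from source with a newer vendored llama.cpp. The exact steps vary by Ollama version, so check docs/development.md in the repo; the commands below follow the older documented Go build flow:

git clone https://github.com/ollama/ollama.git
cd ollama
# update the vendored llama.cpp to a commit that supports this architecture, then build
go generate ./...
go build .
./ollama serve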

Just tried to run it on the latest version of Ollama on macOS, and it didn't work :/

ollama run hf.co/huihui-ai/Huihui-gpt-oss-20b-BF16-abliterated
Error: 500 Internal Server Error: unable to load model: /Users/yqbqwlny/.ollama/models/blobs/sha256-e01aba477beff0c8c43bf4c0faa8b1b14ceaa1adba8d7849a30cb8ba79a8eeda
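
For reference, the usual way to load a local GGUF into Ollama is a Modelfile pointing at the file. That won't help until Ollama's bundled llama.cpp supports this architecture, but it may be useful once a fixed build lands (the file and model names below are placeholders, not the actual filenames in this repo):

Modelfile:
FROM ./Huihui-gpt-oss-20b-BF16-abliterated.gguf

ollama create huihui-gpt-oss-20b-abliterated -f Modelfile
ollama run huihui-gpt-oss-20b-abliterated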

@huihui-ai thank you for your amazing work! Can't wait to test the model. Any chance you could prepare it for Ollama as well? I find Ollama much more user-friendly than the other options.
