ollama
Every fork of this and this shows `500: unable to load model: /root/.ollama/models/blobs`.
Ollama likely needs modification: it may be bundling an outdated version of llama.cpp. Replacing it with the latest version and recompiling should fix this.
I am using Ollama 0.11.4, which is the latest version, installed from the Ollama release assets. Do I need to manually recompile it to fix this issue?
You would need to upgrade to the latest version of llama.cpp yourself, which can be quite involved; alternatively, you could wait for an official Ollama release.
@huihui-ai
Could you release some GGUFs for Ollama, changing `general.architecture` from `gpt-oss` to `gptoss`?
In Ollama, gpt-oss is registered as `gptoss`.
source link: https://github.com/ollama/ollama/commit/fa7776fd2458fc3a8aeb7f12e4bc65b439955319
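For anyone wanting to try the rename locally: because GGUF strings are length-prefixed, a value of a different length cannot be patched in place; the metadata has to be re-serialized. Below is a minimal toy sketch of the relevant GGUF layout (magic, version, tensor count, KV count, then length-prefixed key/value pairs) showing the rename on a synthetic blob. It is an illustration only, not a full GGUF rewriter; the header layout and string type ID follow the GGUF spec, but for a real model file you would use the `gguf-py` tooling that ships with llama.cpp instead.

```python
import struct

GGUF_MAGIC = b"GGUF"
GGUF_TYPE_STRING = 8  # string value type ID in the GGUF spec


def _pack_str(s: bytes) -> bytes:
    # GGUF strings are a uint64 length followed by the raw bytes
    return struct.pack("<Q", len(s)) + s


def _arch_kv(arch: bytes) -> bytes:
    # One metadata KV pair: key string, value type, value string
    return (
        _pack_str(b"general.architecture")
        + struct.pack("<I", GGUF_TYPE_STRING)
        + _pack_str(arch)
    )


def build_toy_gguf(arch: bytes) -> bytes:
    # Magic, version 3, 0 tensors, 1 metadata KV pair (toy file, no tensor data)
    header = GGUF_MAGIC + struct.pack("<IQQ", 3, 0, 1)
    return header + _arch_kv(arch)


def rename_architecture(blob: bytes, old: bytes, new: bytes) -> bytes:
    # Lengths differ ("gpt-oss" vs "gptoss"), so re-serialize the KV pair
    # rather than patching bytes in place.
    needle = _arch_kv(old)
    if needle not in blob:
        raise ValueError("general.architecture value not found")
    return blob.replace(needle, needle.replace(_pack_str(old), _pack_str(new)), 1)
```

Usage on the toy blob: `rename_architecture(build_toy_gguf(b"gpt-oss"), b"gpt-oss", b"gptoss")` returns a blob one byte shorter, with the architecture string replaced.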
Let's investigate and see if we can resolve this issue.