ollama: How to fix “Error: 500 Internal Server Error: unable to load model”

#6
by offbeat1222 - opened

Hi everyone,

I’m encountering the following error when trying to use the model:
Error: 500 Internal Server Error: unable to load model
It seems like the model cannot be loaded, but I’m not sure whether it’s due to a configuration issue, missing files, or something else.

I don't know the details, but I've seen comments on various gpt-oss models reporting the same error, and I'm hitting it too. As far as I know, we can only wait for ollama to ship an updated version that fixes it. The most reliable information so far is this: https://huggingface.co/bartowski/huihui-ai_Huihui-gpt-oss-20b-BF16-abliterated-GGUF/discussions/1, where bartowski links a GitHub issue explaining this class of problem.
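Since the original question also asks whether the error could come from missing or corrupted files, one quick thing to rule out is a bad download. A minimal sketch, assuming you have the GGUF file locally (`gguf_header` is a hypothetical helper name; it only checks the 4-byte magic and the version field at the start of the file):

```python
import struct

def gguf_header(path):
    """Return the GGUF format version of a model file, or None if the
    file is not a GGUF file (e.g. a truncated or corrupted download).

    Note: a valid header does not guarantee the model will load; older
    runtimes can still reject files that use tensor types they don't
    support, which is what surfaces in ollama as the 500 error.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":  # GGUF files start with this magic
            return None
        (version,) = struct.unpack("<I", f.read(4))  # little-endian uint32
        return version
```

If this returns a version number, the file itself is at least a structurally valid GGUF and the problem is more likely the runtime version, as described in the linked discussion.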


As a workaround, use koboldcpp:
https://github.com/LostRuins/koboldcpp
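For example, something like this should load the GGUF directly (a sketch; the model path and filename are assumptions, and koboldcpp bundles its own llama.cpp build, so newer quants may load even where the installed ollama build fails):

```shell
# Run the GGUF with koboldcpp instead of ollama.
# Replace the path with wherever you downloaded the model.
python koboldcpp.py --model /path/to/model.gguf --contextsize 4096
```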
