How is the model supposed to be deployed with vLLM?

#1
by diuibyang - opened

When I tried to deploy the model with vLLM, I got this error: `AttributeError: Model DotsOCRForCausalLM does not support BitsAndBytes quantization yet. No 'packed_modules_mapping' found`
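For reference, this is a minimal sketch of the kind of call that triggers the error, assuming `quantization="bitsandbytes"` was passed (the exact setup may differ on your side):

```python
from vllm import LLM

# Hypothetical repro: requesting BitsAndBytes quantization for a model
# whose vLLM implementation defines no 'packed_modules_mapping'.
# Depending on your vLLM version, bitsandbytes may also require
# load_format="bitsandbytes".
llm = LLM(
    model="rednote-hilab/dots.ocr",
    quantization="bitsandbytes",
    trust_remote_code=True,
)
```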

I'm facing the same issue.

Hello, sorry for the late response.

Currently, vLLM does not support dots.ocr (DotsOCRForCausalLM). If there is any update, I will update this model or release a similar model that supports vLLM as well.
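Until then, one possible workaround is to load the checkpoint with plain transformers instead of vLLM. A minimal sketch, assuming the repo ships its remote modeling code; the processor usage and dtype here are assumptions, so check the model card for the exact recipe:

```python
import torch
from transformers import AutoModelForCausalLM, AutoProcessor

# Fallback: load with transformers instead of vLLM. trust_remote_code
# pulls the custom DotsOCRForCausalLM implementation from the repo.
model = AutoModelForCausalLM.from_pretrained(
    "rednote-hilab/dots.ocr",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the accelerate package
    trust_remote_code=True,
)
processor = AutoProcessor.from_pretrained(
    "rednote-hilab/dots.ocr", trust_remote_code=True
)
```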

In the meantime, you might follow this discussion:
https://huggingface.co/rednote-hilab/dots.ocr/discussions/20
