Quantization support #5
opened by princemjp
Would there be any AWQ or bitsandbytes quantization support for this model?
Most likely they will add GPTQ Int8 and Int4 support in about a month and a half.
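In the meantime, bitsandbytes does not need a pre-quantized checkpoint; if the model loads with `AutoModelForCausalLM`, you can usually quantize it on the fly at load time. A minimal sketch, assuming standard transformers support (the model id below is a placeholder):

```python
# Minimal sketch: on-the-fly 4-bit quantization with bitsandbytes via transformers.
# "org/model-name" is a placeholder -- substitute the actual model id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4-bit at load time
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # do matmuls in bf16 for speed/stability
)

model = AutoModelForCausalLM.from_pretrained(
    "org/model-name",
    quantization_config=bnb_config,
    device_map="auto",                      # spread layers across available GPUs
)
tokenizer = AutoTokenizer.from_pretrained("org/model-name")
```

Whether this works depends on the architecture being supported in transformers; AWQ and GPTQ need dedicated pre-quantized weights, which is what the question above is really about.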