Is there any code available for fine-tuning a LoRA and merging it with this specific version of Pixtral-12B (i.e., the original consolidated checkpoint, not the transformers-compatible version)?
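
I don't have an official end-to-end script for this checkpoint format to point to, but the merge step itself is just weight arithmetic, so here is a minimal, hypothetical sketch of folding a LoRA adapter back into a consolidated safetensors file. The file names, the adapter key-naming scheme (`<name>.lora_A.weight` / `<name>.lora_B.weight`), and the rank/alpha values are all assumptions; adapt them to however your training setup actually saves the adapter.

```python
# Hypothetical sketch: merge a LoRA adapter into a consolidated (non-transformers)
# Pixtral-12B checkpoint. Paths, key names, rank, and alpha below are assumptions,
# not the official layout -- check them against your own files.
import torch
from safetensors.torch import load_file, save_file

BASE = "consolidated.safetensors"        # assumed base checkpoint file name
ADAPTER = "lora.safetensors"             # assumed LoRA adapter file name
OUT = "consolidated_merged.safetensors"
LORA_ALPHA = 16                          # assumed; use the value you trained with
LORA_RANK = 8                            # assumed; use the value you trained with

base = load_file(BASE)
lora = load_file(ADAPTER)
scaling = LORA_ALPHA / LORA_RANK

merged = dict(base)
for key in list(lora.keys()):
    if not key.endswith(".lora_A.weight"):
        continue
    prefix = key[: -len(".lora_A.weight")]
    a = lora[key].to(torch.float32)                        # shape (r, in_features)
    b = lora[prefix + ".lora_B.weight"].to(torch.float32)  # shape (out_features, r)
    target = prefix + ".weight"
    if target not in merged:
        print(f"skipping {prefix}: no matching base weight")
        continue
    w = merged[target].to(torch.float32)
    # Standard LoRA merge: W' = W + (alpha / r) * (B @ A)
    merged[target] = (w + scaling * (b @ a)).to(base[target].dtype)

save_file(merged, OUT)
```

The scaling factor and the `W' = W + (alpha / r) * B @ A` update are the standard LoRA formulation; the part you will almost certainly need to adjust is the key matching, since the consolidated checkpoint and the adapter have to use the same base weight names for this to line up.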