BiMediX: Bilingual Medical Mixture of Experts LLM
Paper: arXiv:2402.13253
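Example usage with the Hugging Face `transformers` library: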
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the bilingual BiMediX checkpoint and its tokenizer from the Hugging Face Hub.
model_id = "BiMediX/BiMediX-Bi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize an example medical query and generate a response.
text = "Hello BiMediX! I've been experiencing increased tiredness in the past week."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=500)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
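Since BiMediX is bilingual (English and Arabic), the same interface handles Arabic queries. A minimal sketch, reusing the model and tokenizer loaded above; the Arabic prompt here is only an illustration, not taken from the model card:

```python
# Arabic query using the same model and tokenizer loaded above.
# The prompt below is an illustrative example (roughly: "Hello BiMediX!
# I have been feeling very tired over the past week.").
arabic_text = "مرحبا BiMediX! أشعر بالتعب الشديد خلال الأسبوع الماضي."
arabic_inputs = tokenizer(arabic_text, return_tensors="pt")
arabic_outputs = model.generate(**arabic_inputs, max_new_tokens=500)
print(tokenizer.decode(arabic_outputs[0], skip_special_tokens=True))
```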
| Model | CKG | CBio | CMed | MedGen | ProMed | Ana | MedMCQA | MedQA | PubMedQA | AVG |
|---|---|---|---|---|---|---|---|---|---|---|
| Jais-30B | 57.4 | 55.2 | 46.2 | 55.0 | 46.0 | 48.9 | 40.2 | 31.0 | 75.5 | 50.6 |
| Mixtral-8x7B | 59.1 | 57.6 | 52.6 | 59.5 | 53.3 | 54.4 | 43.2 | 40.6 | 74.7 | 55.0 |
| BiMediX (Bilingual) | 70.6 | 72.2 | 59.3 | 74.0 | 64.2 | 59.6 | 55.8 | 54.0 | 78.6 | 65.4 |
Sara Pieri, Sahal Shaji Mullappilly, Fahad Shahbaz Khan, Rao Muhammad Anwer, Salman Khan, Timothy Baldwin, Hisham Cholakkal
Mohamed Bin Zayed University of Artificial Intelligence (MBZUAI)