OphthaBERT Glaucoma Classifier

Binary classification for glaucoma diagnosis extraction from unstructured clinical notes.


Model Details

Model Description

This model is a fine-tuned variant of OphthaBERT, which was pretrained on over 2 million clinical notes. It has been fine-tuned for binary classification on labeled clinical notes from the Massachusetts Eye and Ear Infirmary.

  • Finetuned from model: OphthaBERT-v2

Uses

We suggest using this model in a zero-shot manner to generate a binary glaucoma label for each clinical note. For continued training on limited data, we recommend freezing the first 10 layers of the model, as in the sketch below.
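Freezing the lower layers before continued fine-tuning might look like the following sketch. It assumes the checkpoint exposes a BERT-style backbone through base_model (with .embeddings and .encoder.layer); adapt the attribute names if the architecture differs.

from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("ShahRishi/OphthaBERT-v2-glaucoma-binary")

# Freeze the embeddings and the first 10 encoder layers; the remaining
# encoder layers and the classification head stay trainable.
base = model.base_model  # assumed BERT-style backbone with .embeddings and .encoder.layer
for param in base.embeddings.parameters():
    param.requires_grad = False
for layer in base.encoder.layer[:10]:
    for param in layer.parameters():
        param.requires_grad = False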

Direct Use

Use the code below to get started with the model:

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned model and tokenizer
model = AutoModelForSequenceClassification.from_pretrained("ShahRishi/OphthaBERT-v2-glaucoma-binary")
tokenizer = AutoTokenizer.from_pretrained("ShahRishi/OphthaBERT-v2")

# Example: classify a clinical note
clinical_note = "Example clinical note text..."
inputs = tokenizer(clinical_note, return_tensors="pt", truncation=True, max_length=512)

# Run inference and convert the logits to a predicted class index (the binary glaucoma label)
with torch.no_grad():
    outputs = model(**inputs)
predicted_label = outputs.logits.argmax(dim=-1).item()
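
To label a collection of notes at once, the same model and tokenizer can be applied to a padded batch. The snippet below is a minimal sketch that reuses model and tokenizer from above; the note texts are placeholders.

import torch

notes = ["First clinical note text...", "Second clinical note text..."]  # placeholder notes
batch = tokenizer(notes, return_tensors="pt", padding=True, truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**batch).logits
probs = torch.softmax(logits, dim=-1)    # per-note class probabilities
labels = probs.argmax(dim=-1).tolist()   # one binary glaucoma label per note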