---
language:
  - en
inference: false
pipeline_tag: token-classification
tags:
  - ner
  - bert
license: mit
datasets:
  - conll2003
---

# ONNX version of dslim/bert-base-NER

This model is a conversion of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) to ONNX format, produced with the 🤗 Optimum library.

bert-base-NER is a fine-tuned BERT model that is ready to use for Named Entity Recognition (NER) and achieves state-of-the-art performance for the NER task. It has been trained to recognize four types of entities: location (LOC), organization (ORG), person (PER), and miscellaneous (MISC).

Specifically, it is a `bert-base-cased` model fine-tuned on the English version of the standard CoNLL-2003 Named Entity Recognition dataset.
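
The conversion to ONNX can be reproduced with Optimum's export API. The snippet below is a minimal sketch of such an export, not the exact command used to build this repository; the output directory name is illustrative.

```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer

# export=True converts the original PyTorch checkpoint to ONNX on the fly
model = ORTModelForTokenClassification.from_pretrained("dslim/bert-base-NER", export=True)
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")

# Save the ONNX weights and tokenizer files to a local directory
model.save_pretrained("bert-base-NER-onnx")
tokenizer.save_pretrained("bert-base-NER-onnx")
```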

## Usage

Loading the model requires the 🤗 Optimum library with its ONNX Runtime backend installed (for example, via `pip install optimum[onnxruntime]`).

```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline

# Load the tokenizer and the ONNX model through ONNX Runtime
tokenizer = AutoTokenizer.from_pretrained("laiyer/bert-base-NER-onnx")
model = ORTModelForTokenClassification.from_pretrained("laiyer/bert-base-NER-onnx")

# Wrap both in a standard 🤗 Transformers token-classification pipeline
ner = pipeline(
    task="ner",
    model=model,
    tokenizer=tokenizer,
)

ner_output = ner("My name is John Doe.")
print(ner_output)
```
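
To get whole entities instead of per-token predictions, the pipeline's standard `aggregation_strategy` option (a 🤗 Transformers feature, not specific to this repository) can be used. Continuing from the snippet above:

```python
# Merge subword tokens into grouped entities (e.g. "John Doe" as a single PER span)
ner_grouped = pipeline(
    task="ner",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner_grouped("My name is John Doe."))
```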