---
pipeline_tag: text-generation
inference: false
license: apache-2.0
library_name: transformers
tags:
- language
- granite-3.3
- openvino
- openvino-export
base_model: ibm-granite/granite-3.3-2b-instruct
---
This model was converted to OpenVINO from [`ibm-granite/granite-3.3-2b-instruct`](https://huggingface.co/ibm-granite/granite-3.3-2b-instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel) via the export space.
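
If you prefer to reproduce the conversion locally instead of using the export space, optimum-intel can export the original checkpoint on the fly with `export=True`. A minimal sketch (the output directory name is arbitrary):

```python
from optimum.intel import OVModelForCausalLM

# Convert the original PyTorch checkpoint to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained(
    "ibm-granite/granite-3.3-2b-instruct", export=True
)

# Save the converted model locally (directory name chosen here is arbitrary)
model.save_pretrained("granite-3.3-2b-instruct-openvino")
```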
First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
To load the model, you can do the following:

```python
from optimum.intel import OVModelForCausalLM

model_id = "sellep/granite-3.3-2b-instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```
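
From there, text generation follows the usual transformers pattern. A minimal sketch, assuming the checkpoint ships the Granite chat template (the prompt and generation settings below are illustrative):

```python
from transformers import AutoTokenizer
from optimum.intel import OVModelForCausalLM

model_id = "sellep/granite-3.3-2b-instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat-style prompt and run inference on the OpenVINO model
messages = [{"role": "user", "content": "Explain OpenVINO in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```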