This model has been pushed to the Hub using the PyTorchModelHubMixin integration.

Usage

Here's how to use the model for inference:

# Requires the TwinLiteNetPlus code (model/model.py) from the original repository on your path
from model.model import TwinLiteNetPlus

model = TwinLiteNetPlus.from_pretrained("nielsr/twinlitenetplus-nano")
Model details: 33.9k parameters, F32 tensors, stored as Safetensors.