Tags: Automatic Speech Recognition · Transformers · PyTorch · TensorBoard · whisper · whisper-event · Generated from Trainer · hf-asr-leaderboard · Eval Results (legacy)
The following instructions cover using arbml/whisper-small-ar with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use arbml/whisper-small-ar with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("automatic-speech-recognition", model="arbml/whisper-small-ar")
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForSpeechSeq2Seq

processor = AutoProcessor.from_pretrained("arbml/whisper-small-ar")
model = AutoModelForSpeechSeq2Seq.from_pretrained("arbml/whisper-small-ar")
```

- Notebooks
- Google Colab
- Kaggle
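Whisper checkpoints expect 16 kHz mono audio. Besides a file path, the ASR pipeline accepts a dict with `"raw"` and `"sampling_rate"` keys; a minimal sketch of packaging a waveform that way (the helper name `make_pipeline_input` is ours, not part of the library; NumPy assumed available):

```python
import numpy as np

SAMPLING_RATE = 16_000  # Whisper models are trained on 16 kHz audio


def make_pipeline_input(waveform: np.ndarray, sampling_rate: int = SAMPLING_RATE) -> dict:
    """Package a mono float32 waveform as the dict the ASR pipeline accepts."""
    return {"raw": waveform.astype(np.float32), "sampling_rate": sampling_rate}


# One second of silence as a stand-in for real audio;
# pipe(audio) would then return a dict with a "text" key.
audio = make_pipeline_input(np.zeros(SAMPLING_RATE))
```

If your audio comes from a file at a different rate, resample it to 16 kHz first (e.g. with librosa or torchaudio) before handing it to the pipeline.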
The environment and repository setup commands below come from a script that takes the Hugging Face user or organisation as `$1` and the repository name as `$2`:

```shell
pip install -r requirements.txt
git lfs install
python -c "import torch; print(torch.cuda.is_available())"
git config --global credential.helper store
huggingface-cli login
huggingface-cli repo create $2
git clone https://huggingface.co/$1/$2
cd $2
cp ../**.py .
cp ../**.sh .
cp ../**.ipynb .
cp ../ds_config.json .
wget https://raw.githubusercontent.com/huggingface/community-events/main/whisper-fine-tuning-event/fine-tune-whisper-non-streaming.ipynb
```
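To make the positional parameters concrete, a sketch of how `$1` and `$2` expand in the repository steps above (the values here are placeholders, and the commands are echoed rather than executed so the flow is visible without network side effects):

```shell
# Hypothetical values for the script's two positional parameters.
org="arbml"              # would be passed as $1
repo="whisper-small-ar"  # would be passed as $2

# The commands the script would run with those parameters.
create_cmd="huggingface-cli repo create ${repo}"
clone_cmd="git clone https://huggingface.co/${org}/${repo}"

echo "${create_cmd}"
echo "${clone_cmd}"
```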