Whisper Large SEIN - COES SEIN - Version 9

This model is a fine-tuned version of openai/whisper-large-v3-turbo on the SEIN COES dataset. It achieves the following results on the evaluation set:

  • Loss: 3.8246
  • WER: 67.8191
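
As a quick usage sketch (not part of the original card): the checkpoint can be loaded for transcription with the Transformers ASR pipeline, assuming the model's Hub id Cristhian2430/whisper-large-coes-v9; "audio.wav" is a placeholder path.

```python
# Minimal inference sketch using the Transformers ASR pipeline.
# "audio.wav" is a placeholder; long-form audio is chunked automatically
# when chunk_length_s is set.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Cristhian2430/whisper-large-coes-v9",
    device=0 if torch.cuda.is_available() else -1,
    chunk_length_s=30,  # Whisper's native 30-second window
)

result = asr("audio.wav")
print(result["text"])
```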

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent trainer configuration follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 8000
  • mixed_precision_training: Native AMP
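
A minimal sketch of how these settings map onto the Hugging Face Seq2SeqTrainingArguments API. The output directory, the 1000-step evaluation cadence, and the use of fp16 for "Native AMP" are assumptions, not statements from the card.

```python
# Sketch of the training configuration above as Seq2SeqTrainingArguments.
# Assumptions: output_dir, eval_steps=1000 (inferred from the results table
# below), and fp16=True as the concrete form of "Native AMP".
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-coes-v9",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",         # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=8000,
    fp16=True,                   # native AMP mixed precision (assumed fp16)
    eval_strategy="steps",
    eval_steps=1000,             # matches the evaluation cadence below
)
```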

Training results

Training Loss   Epoch      Step   Validation Loss   WER
0.0123          90.9091    1000   3.1299            68.9362
0.0054          181.8182   2000   3.3876            69.0957
0.0002          272.7273   3000   3.5715            69.5213
0.0001          363.6364   4000   3.6546            68.3511
0.0002          454.5455   5000   3.7197            67.6596
0.0001          545.4545   6000   3.7702            67.5532
0.0001          636.3636   7000   3.8096            67.7660
0.0001          727.2727   8000   3.8246            67.8191
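
The WER values above are on a 0-100 scale. As a hedged sketch, this is how such a score can be computed with the `evaluate` library (an assumption: the card does not state which scorer produced these numbers).

```python
# Hedged sketch: computing word error rate with the `evaluate` library,
# which wraps jiwer. The example strings are illustrative only.
import evaluate

wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["the grid frequency is sixty hertz"],
    references=["the grid frequency is at sixty hertz"],
)
print(f"WER: {100 * score:.4f}")  # scaled to match the 0-100 values above
```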

Framework versions

  • Transformers 4.56.0.dev0
  • Pytorch 2.6.0+cu124
  • Datasets 4.0.0
  • Tokenizers 0.21.4