End of training
    	
README.md
ADDED

@@ -0,0 +1,96 @@
+---
+library_name: transformers
+language:
+- nl
+license: mit
+base_model: microsoft/speecht5_tts
+tags:
+- generated_from_trainer
+datasets:
+- procit007/saskia_may23_39768
+model-index:
+- name: speecht5_tts_v1_100
+  results: []
+---
+
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->
+
+# speecht5_tts_v1_100
+
+This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the saskia_may23_39768 dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.4580
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 1e-05
+- train_batch_size: 4
+- eval_batch_size: 2
+- seed: 42
+- gradient_accumulation_steps: 8
+- total_train_batch_size: 32
+- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
+- lr_scheduler_type: linear
+- lr_scheduler_warmup_steps: 500
+- num_epochs: 90
+
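The hyperparameters above can be sanity-checked with a small sketch: the effective batch size is the per-device batch size times the gradient-accumulation steps, and the `linear` scheduler warms up to the peak learning rate and then decays linearly to zero. This is illustrative code, not the Trainer's actual implementation; the 32000 total steps are taken from the last row of the results table.

```python
# Minimal sketch (not the Trainer's code): how the hyperparameters combine.
# Values copied from the card; total_steps=32000 is the final training step.

train_batch_size = 4
gradient_accumulation_steps = 8
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 32  # matches the card's total_train_batch_size


def linear_warmup_lr(step, base_lr=1e-05, warmup_steps=500, total_steps=32000):
    """Linear warmup from 0 to base_lr, then linear decay to 0
    (the shape of the `linear` scheduler with warmup)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))


print(linear_warmup_lr(250))    # mid-warmup: 5e-06
print(linear_warmup_lr(500))    # peak learning rate: 1e-05
print(linear_warmup_lr(32000))  # end of training: 0.0
```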
+### Training results
+
+| Training Loss | Epoch   | Step  | Validation Loss |
+|:-------------:|:-------:|:-----:|:---------------:|
+| 0.5267        | 2.7939  | 1000  | 0.4852          |
+| 0.4925        | 5.5870  | 2000  | 0.4635          |
+| 0.488         | 8.3802  | 3000  | 0.4534          |
+| 0.4771        | 11.1733 | 4000  | 0.4528          |
+| 0.4703        | 13.9672 | 5000  | 0.4473          |
+| 0.4566        | 16.7603 | 6000  | 0.4435          |
+| 0.4548        | 19.5535 | 7000  | 0.4532          |
+| 0.4499        | 22.3466 | 8000  | 0.4473          |
+| 0.456         | 25.1398 | 9000  | 0.4446          |
+| 0.4531        | 27.9336 | 10000 | 0.4457          |
+| 0.4479        | 30.7268 | 11000 | 0.4495          |
+| 0.4461        | 33.5199 | 12000 | 0.4430          |
+| 0.4384        | 36.3131 | 13000 | 0.4410          |
+| 0.4346        | 39.1062 | 14000 | 0.4475          |
+| 0.4399        | 41.9001 | 15000 | 0.4461          |
+| 0.4342        | 44.6932 | 16000 | 0.4455          |
+| 0.433         | 47.4864 | 17000 | 0.4486          |
+| 0.4344        | 50.2795 | 18000 | 0.4568          |
+| 0.433         | 53.0727 | 19000 | 0.4490          |
+| 0.4318        | 55.8665 | 20000 | 0.4554          |
+| 0.4292        | 58.6597 | 21000 | 0.4535          |
+| 0.4289        | 61.4528 | 22000 | 0.4510          |
+| 0.4284        | 64.2460 | 23000 | 0.4534          |
+| 0.43          | 67.0391 | 24000 | 0.4489          |
+| 0.4277        | 69.8330 | 25000 | 0.4541          |
+| 0.429         | 72.6261 | 26000 | 0.4548          |
+| 0.423         | 75.4193 | 27000 | 0.4612          |
+| 0.4265        | 78.2124 | 28000 | 0.4516          |
+| 0.4344        | 81.0056 | 29000 | 0.4584          |
+| 0.4303        | 83.7994 | 30000 | 0.4610          |
+| 0.4279        | 86.5926 | 31000 | 0.4562          |
+| 0.428         | 89.3857 | 32000 | 0.4580          |
+
+
+### Framework versions
+
+- Transformers 4.56.0.dev0
+- Pytorch 2.6.0+cu124
+- Datasets 3.6.0
+- Tokenizers 0.21.2
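Note that the final reported loss (0.4580, step 32000) is not the lowest in the results table; the validation loss bottoms out much earlier. A throwaway sketch scanning the table for the best checkpoint, with the (step, loss) pairs copied verbatim from the card:

```python
# (step, validation_loss) pairs copied from the training results table above.
results = [
    (1000, 0.4852), (2000, 0.4635), (3000, 0.4534), (4000, 0.4528),
    (5000, 0.4473), (6000, 0.4435), (7000, 0.4532), (8000, 0.4473),
    (9000, 0.4446), (10000, 0.4457), (11000, 0.4495), (12000, 0.4430),
    (13000, 0.4410), (14000, 0.4475), (15000, 0.4461), (16000, 0.4455),
    (17000, 0.4486), (18000, 0.4568), (19000, 0.4490), (20000, 0.4554),
    (21000, 0.4535), (22000, 0.4510), (23000, 0.4534), (24000, 0.4489),
    (25000, 0.4541), (26000, 0.4548), (27000, 0.4612), (28000, 0.4516),
    (29000, 0.4584), (30000, 0.4610), (31000, 0.4562), (32000, 0.4580),
]

# Lowest validation loss and the step it occurred at.
best_step, best_loss = min(results, key=lambda pair: pair[1])
print(best_step, best_loss)  # 13000 0.441
```

The best validation loss (0.4410 at step 13000, around epoch 36) then drifts upward for the remaining epochs, which suggests the later checkpoints may be mildly overfit.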
    	
generation_config.json
ADDED

@@ -0,0 +1,11 @@
+{
+  "_from_model_config": true,
+  "bos_token_id": 0,
+  "decoder_start_token_id": 2,
+  "eos_token_id": [
+    2
+  ],
+  "max_length": 1876,
+  "pad_token_id": 1,
+  "transformers_version": "4.56.0.dev0"
+}
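These fields control decoding: generation starts from `decoder_start_token_id` and stops once an `eos_token_id` is produced or `max_length` decoder steps are reached. A small sketch reading the new file back with the stdlib `json` module to sanity-check those fields (illustrative only; the real file lives in the repo):

```python
import json

# The generation_config.json added in this commit, reproduced as a literal.
generation_config = json.loads("""
{
  "_from_model_config": true,
  "bos_token_id": 0,
  "decoder_start_token_id": 2,
  "eos_token_id": [2],
  "max_length": 1876,
  "pad_token_id": 1,
  "transformers_version": "4.56.0.dev0"
}
""")

# Decoding starts at token 2 and also ends on token 2, after at most
# max_length decoder steps.
assert generation_config["decoder_start_token_id"] in generation_config["eos_token_id"]
print(generation_config["max_length"])  # 1876
```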
    	
model.safetensors
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:6a40e4de634d381177c9de2d120ce6080ee038388592d43d74c5e5739d85f903
 size 577789320
    	
runs/Aug22_16-19-47_240c73c9d929/events.out.tfevents.1755879592.240c73c9d929.36.1
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size 
+oid sha256:cc4607bf106cc9a3edb0ce4441b92420333718166fb6d29e4741d31f919b8d83
+size 289945
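The `model.safetensors` and event-log diffs above are Git LFS pointer files (`version` / `oid` / `size` lines), not the binary payloads themselves; only the pointer is versioned in git, and the blob is fetched from LFS storage by its SHA-256. A minimal sketch of reading such a pointer (the parser is illustrative, not the official `git-lfs` implementation):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields.
    Each line is '<key> <value>', e.g. 'size 577789320'."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


# The new model.safetensors pointer from this diff.
ptr = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:6a40e4de634d381177c9de2d120ce6080ee038388592d43d74c5e5739d85f903\n"
    "size 577789320\n"
)
print(ptr["size"])  # 577789320 -- the actual weights file is ~578 MB
```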