---
tags:
- generated_from_trainer
base_model: sentence-transformers/multi-qa-MiniLM-L6-cos-v1
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: all_keywords_multi-qa-MiniLM-L6-cos-v1_another
  results: []
---

# all_keywords_multi-qa-MiniLM-L6-cos-v1_another

This model is a fine-tuned version of [sentence-transformers/multi-qa-MiniLM-L6-cos-v1](https://huggingface.co/sentence-transformers/multi-qa-MiniLM-L6-cos-v1) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.2408
- Accuracy: 0.5259
- Precision: 0.5259
- Recall: 0.5259
- F1: 0.5259

## Model description

More information needed

## Intended uses & limitations

More information needed. A hedged usage sketch is provided at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the reproduction sketch at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 2.3237        | 1.0   | 712   | 1.9365          | 0.4123   | 0.4123    | 0.4123 | 0.4123 |
| 2.0274        | 2.0   | 1424  | 1.8781          | 0.4502   | 0.4502    | 0.4502 | 0.4502 |
| 1.7498        | 3.0   | 2136  | 1.7664          | 0.4656   | 0.4656    | 0.4656 | 0.4656 |
| 1.6009        | 4.0   | 2848  | 1.7801          | 0.4656   | 0.4656    | 0.4656 | 0.4656 |
| 1.2981        | 5.0   | 3560  | 1.9004          | 0.4600   | 0.4600    | 0.4600 | 0.4600 |
| 1.0895        | 6.0   | 4272  | 2.0621          | 0.4642   | 0.4642    | 0.4642 | 0.4642 |
| 0.9878        | 7.0   | 4984  | 2.4640          | 0.4572   | 0.4572    | 0.4572 | 0.4572 |
| 0.784         | 8.0   | 5696  | 2.5635          | 0.5091   | 0.5091    | 0.5091 | 0.5091 |
| 0.6569        | 9.0   | 6408  | 2.5690          | 0.5273   | 0.5273    | 0.5273 | 0.5273 |
| 0.5847        | 10.0  | 7120  | 2.9342          | 0.5063   | 0.5063    | 0.5063 | 0.5063 |
| 0.5107        | 11.0  | 7832  | 2.9652          | 0.5091   | 0.5091    | 0.5091 | 0.5091 |
| 0.4954        | 12.0  | 8544  | 3.1480          | 0.5161   | 0.5161    | 0.5161 | 0.5161 |
| 0.4274        | 13.0  | 9256  | 3.2199          | 0.4993   | 0.4993    | 0.4993 | 0.4993 |
| 0.433         | 14.0  | 9968  | 3.2185          | 0.5217   | 0.5217    | 0.5217 | 0.5217 |
| 0.3615        | 15.0  | 10680 | 3.2408          | 0.5259   | 0.5259    | 0.5259 | 0.5259 |

Validation loss reaches its minimum at epoch 3 (1.7664) and climbs steadily afterwards while training loss keeps falling, which may indicate overfitting; accuracy nonetheless peaks later, at epoch 9 (0.5273), slightly above the final-epoch value reported above. Precision, recall, and F1 equal accuracy in every row, which is consistent with micro-averaged multi-class metrics.

### Framework versions

- Transformers 4.39.3
- Pytorch 2.2.1+cu118
- Datasets 2.14.7
- Tokenizers 0.15.2
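
### Usage sketch

The reported accuracy/precision/recall/F1 metrics suggest this checkpoint was fine-tuned with a sequence-classification head. The snippet below is a minimal sketch under that assumption; the Hub repository id is a placeholder (the owning namespace is not documented), and the label set is unknown, so the prediction is returned as a raw class index.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder id: substitute the actual Hub path of this checkpoint.
model_id = "all_keywords_multi-qa-MiniLM-L6-cos-v1_another"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a single input and run a forward pass without gradients.
inputs = tokenizer("example keyword query", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# The label names are not documented, so this is just the argmax index.
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```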
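
### Reproducing the training configuration

Below is a sketch of `TrainingArguments` matching the hyperparameters listed above, assuming the standard `transformers.Trainer` workflow implied by the `generated_from_trainer` tag. Dataset loading and `Trainer` wiring are omitted because the training data is not documented; the Adam betas and epsilon shown in the card are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="all_keywords_multi-qa-MiniLM-L6-cos-v1_another",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    # Assumption: validation ran once per epoch, matching the results table above.
    evaluation_strategy="epoch",
)
```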