deb101 committed on
Commit f26b46d · verified · 1 Parent(s): 7cdd071

Model save
README.md CHANGED
@@ -18,22 +18,22 @@ This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.3](https
 It achieves the following results on the evaluation set:
 - F1 Micro: 0.0
 - F1 Macro: 0.0
-- Precision At 5: 0.1412
-- Recall At 5: 0.0496
-- Precision At 8: 0.1131
-- Recall At 8: 0.0628
-- Precision At 15: 0.0789
-- Recall At 15: 0.0847
+- Precision At 5: 0.2279
+- Recall At 5: 0.0949
+- Precision At 8: 0.1664
+- Recall At 8: 0.1038
+- Precision At 15: 0.1137
+- Recall At 15: 0.1285
 - Rare F1 Micro: 0.0
 - Rare F1 Macro: 0.0
 - Rare Precision: 0.0
 - Rare Recall: 0.0
-- Rare Precision At 5: 0.0838
-- Rare Recall At 5: 0.0276
-- Rare Precision At 8: 0.0671
-- Rare Recall At 8: 0.0390
-- Rare Precision At 15: 0.0549
-- Rare Recall At 15: 0.0565
+- Rare Precision At 5: 0.15
+- Rare Recall At 5: 0.0645
+- Rare Precision At 8: 0.1204
+- Rare Recall At 8: 0.0788
+- Rare Precision At 15: 0.0873
+- Rare Recall At 15: 0.0997
 - Not Rare F1 Micro: 0.5956
 - Not Rare F1 Macro: 0.3733
 - Not Rare Precision: 0.5956
@@ -44,7 +44,7 @@ It achieves the following results on the evaluation set:
 - Not Rare Recall At 8: 0.4044
 - Not Rare Precision At 15: 0.0270
 - Not Rare Recall At 15: 0.4044
-- Loss: 0.1050
+- Loss: 0.1048
 
 ## Model description
 
@@ -79,10 +79,10 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | F1 Micro | F1 Macro | Precision At 5 | Recall At 5 | Precision At 8 | Recall At 8 | Precision At 15 | Recall At 15 | Rare F1 Micro | Rare F1 Macro | Rare Precision | Rare Recall | Rare Precision At 5 | Rare Recall At 5 | Rare Precision At 8 | Rare Recall At 8 | Rare Precision At 15 | Rare Recall At 15 | Not Rare F1 Micro | Not Rare F1 Macro | Not Rare Precision | Not Rare Recall | Not Rare Precision At 5 | Not Rare Recall At 5 | Not Rare Precision At 8 | Not Rare Recall At 8 | Not Rare Precision At 15 | Not Rare Recall At 15 | Validation Loss |
 |:-------------:|:------:|:----:|:--------:|:--------:|:--------------:|:-----------:|:--------------:|:-----------:|:---------------:|:------------:|:-------------:|:-------------:|:--------------:|:-----------:|:-------------------:|:----------------:|:-------------------:|:----------------:|:--------------------:|:-----------------:|:-----------------:|:-----------------:|:------------------:|:---------------:|:-----------------------:|:--------------------:|:-----------------------:|:--------------------:|:------------------------:|:---------------------:|:---------------:|
-| 0.5906 | 1.0 | 18 | 0.0 | 0.0 | 0.0221 | 0.0044 | 0.0239 | 0.0096 | 0.0284 | 0.0217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.0046 | 0.0276 | 0.0108 | 0.0279 | 0.0217 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.2073 |
-| 0.1215 | 2.0 | 36 | 0.0 | 0.0 | 0.0382 | 0.0101 | 0.0358 | 0.0143 | 0.0304 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0382 | 0.0100 | 0.0331 | 0.0132 | 0.0304 | 0.0269 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1218 |
-| 0.1081 | 3.0 | 54 | 0.0 | 0.0 | 0.0691 | 0.0225 | 0.0561 | 0.0276 | 0.0456 | 0.0394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0706 | 0.0244 | 0.0561 | 0.0291 | 0.0417 | 0.0370 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1073 |
-| 0.1034 | 3.7887 | 68 | 0.0 | 0.0 | 0.1412 | 0.0496 | 0.1131 | 0.0628 | 0.0789 | 0.0847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0838 | 0.0276 | 0.0671 | 0.0390 | 0.0549 | 0.0565 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1050 |
+| 0.6384 | 1.0 | 18 | 0.0 | 0.0 | 0.0368 | 0.0087 | 0.0377 | 0.0174 | 0.0377 | 0.0324 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0294 | 0.0081 | 0.0267 | 0.0107 | 0.0275 | 0.0211 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.2291 |
+| 0.1265 | 2.0 | 36 | 0.0 | 0.0 | 0.0412 | 0.0096 | 0.0395 | 0.0166 | 0.0373 | 0.0303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.025 | 0.0048 | 0.0276 | 0.0109 | 0.0284 | 0.0244 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1216 |
+| 0.1092 | 3.0 | 54 | 0.0 | 0.0 | 0.1162 | 0.0391 | 0.1002 | 0.0595 | 0.0814 | 0.0890 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0603 | 0.0208 | 0.0579 | 0.0332 | 0.0564 | 0.0595 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1069 |
+| 0.1033 | 3.7887 | 68 | 0.0 | 0.0 | 0.2279 | 0.0949 | 0.1664 | 0.1038 | 0.1137 | 0.1285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.15 | 0.0645 | 0.1204 | 0.0788 | 0.0873 | 0.0997 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1048 |
 
 
 ### Framework versions
 
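The diff above swaps in the Precision At k / Recall At k values from the new evaluation run. As a minimal sketch of how such ranking metrics are typically computed for multi-label classification (this is my own illustration, not the repository's evaluation code, and the label names are hypothetical):

```python
def precision_recall_at_k(scores, true_labels, k):
    """Precision@k and Recall@k for one example.

    scores: list of (label, score) pairs from the classifier.
    true_labels: set of gold labels for the example.
    """
    # Take the k labels with the highest predicted scores.
    top_k = [label for label, _ in sorted(scores, key=lambda p: -p[1])[:k]]
    hits = sum(1 for label in top_k if label in true_labels)
    precision = hits / k
    recall = hits / len(true_labels) if true_labels else 0.0
    return precision, recall

# Toy example: 2 of the top-5 predictions ("a" and "c") are among 4 gold labels.
scores = [("a", 0.9), ("b", 0.8), ("c", 0.7), ("d", 0.6), ("e", 0.5), ("f", 0.4)]
gold = {"a", "c", "x", "y"}
p, r = precision_recall_at_k(scores, gold, k=5)
# p = 2/5 = 0.4, r = 2/4 = 0.5
```

The card's corpus-level numbers would then be averages of these per-example values over the evaluation set.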
eval_loss_plot.png CHANGED
eval_precision_at_15_plot.png CHANGED
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3c6d5f892d7fc0071e730fc5e3fdc7d6acb1e0fc5dafd0b725cd748a3ade2068
+oid sha256:db7eacc60a6dd0a0262abac061bf39e73ac38ad1be6f8932b97d9eae9afd010f
 size 4475046623
train_loss_plot.png CHANGED
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b02e684f463b01edfeb7ab51b4c7d2a9fd23d8d212075aabfec2d95cd917b320
+oid sha256:eaf7616c17f954b75d442bc20e92cac70f690ba01bc437ab67b0bf4fbac6f6fc
 size 5496
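The `model.safetensors` and `training_args.bin` entries above are Git LFS pointer files: the repository stores only the `version`/`oid`/`size` stanza, and the `oid sha256:` field is the SHA-256 digest of the real object. As a sketch (not part of this repository), a downloaded file can be checked against its pointer like this:

```python
import hashlib

def lfs_oid(path, chunk_size=1 << 20):
    """Return the hex sha256 digest a Git LFS pointer records after `oid sha256:`."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so multi-GB weight files need not fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Comparing `lfs_oid("model.safetensors")` with the hex string after `oid sha256:` in the new pointer confirms a download matches this commit.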