deb101 committed
Commit e569498 · verified · 1 Parent(s): aaf7431

Model save
README.md CHANGED
@@ -18,22 +18,22 @@ This model is a fine-tuned version of [mistralai/Mistral-7B-Instruct-v0.3](https
 It achieves the following results on the evaluation set:
 - F1 Micro: 0.0
 - F1 Macro: 0.0
-- Precision At 5: 0.1324
-- Recall At 5: 0.0502
-- Precision At 8: 0.1066
-- Recall At 8: 0.0676
+- Precision At 5: 0.1412
+- Recall At 5: 0.0496
+- Precision At 8: 0.1131
+- Recall At 8: 0.0628
 - Precision At 15: 0.0789
-- Recall At 15: 0.0888
+- Recall At 15: 0.0847
 - Rare F1 Micro: 0.0
 - Rare F1 Macro: 0.0
 - Rare Precision: 0.0
 - Rare Recall: 0.0
-- Rare Precision At 5: 0.0691
-- Rare Recall At 5: 0.0306
-- Rare Precision At 8: 0.0579
-- Rare Recall At 8: 0.0355
-- Rare Precision At 15: 0.0539
-- Rare Recall At 15: 0.0584
+- Rare Precision At 5: 0.0838
+- Rare Recall At 5: 0.0276
+- Rare Precision At 8: 0.0671
+- Rare Recall At 8: 0.0390
+- Rare Precision At 15: 0.0549
+- Rare Recall At 15: 0.0565
 - Not Rare F1 Micro: 0.5956
 - Not Rare F1 Macro: 0.3733
 - Not Rare Precision: 0.5956
@@ -44,7 +44,7 @@ It achieves the following results on the evaluation set:
 - Not Rare Recall At 8: 0.4044
 - Not Rare Precision At 15: 0.0270
 - Not Rare Recall At 15: 0.4044
-- Loss: 0.1049
+- Loss: 0.1050
 
 ## Model description
 
@@ -79,10 +79,10 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | F1 Micro | F1 Macro | Precision At 5 | Recall At 5 | Precision At 8 | Recall At 8 | Precision At 15 | Recall At 15 | Rare F1 Micro | Rare F1 Macro | Rare Precision | Rare Recall | Rare Precision At 5 | Rare Recall At 5 | Rare Precision At 8 | Rare Recall At 8 | Rare Precision At 15 | Rare Recall At 15 | Not Rare F1 Micro | Not Rare F1 Macro | Not Rare Precision | Not Rare Recall | Not Rare Precision At 5 | Not Rare Recall At 5 | Not Rare Precision At 8 | Not Rare Recall At 8 | Not Rare Precision At 15 | Not Rare Recall At 15 | Validation Loss |
 |:-------------:|:------:|:----:|:--------:|:--------:|:--------------:|:-----------:|:--------------:|:-----------:|:---------------:|:------------:|:-------------:|:-------------:|:--------------:|:-----------:|:-------------------:|:----------------:|:-------------------:|:----------------:|:--------------------:|:-----------------:|:-----------------:|:-----------------:|:------------------:|:---------------:|:-----------------------:|:--------------------:|:-----------------------:|:--------------------:|:------------------------:|:---------------------:|:---------------:|
-| 0.6501 | 1.0 | 18 | 0.0 | 0.0 | 0.0118 | 0.0040 | 0.0202 | 0.0083 | 0.0230 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0147 | 0.0046 | 0.0184 | 0.0080 | 0.0206 | 0.0178 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.2199 |
-| 0.124 | 2.0 | 36 | 0.0 | 0.0 | 0.0206 | 0.0065 | 0.0202 | 0.0108 | 0.0186 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0162 | 0.0070 | 0.0202 | 0.0109 | 0.0216 | 0.0197 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1224 |
-| 0.1083 | 3.0 | 54 | 0.0 | 0.0 | 0.0382 | 0.0155 | 0.0368 | 0.0233 | 0.0338 | 0.0364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0382 | 0.0153 | 0.0340 | 0.0203 | 0.0279 | 0.0289 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1074 |
-| 0.1035 | 3.7887 | 68 | 0.0 | 0.0 | 0.1324 | 0.0502 | 0.1066 | 0.0676 | 0.0789 | 0.0888 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0691 | 0.0306 | 0.0579 | 0.0355 | 0.0539 | 0.0584 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1049 |
+| 0.5906 | 1.0 | 18 | 0.0 | 0.0 | 0.0221 | 0.0044 | 0.0239 | 0.0096 | 0.0284 | 0.0217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.0046 | 0.0276 | 0.0108 | 0.0279 | 0.0217 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.2073 |
+| 0.1215 | 2.0 | 36 | 0.0 | 0.0 | 0.0382 | 0.0101 | 0.0358 | 0.0143 | 0.0304 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0382 | 0.0100 | 0.0331 | 0.0132 | 0.0304 | 0.0269 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1218 |
+| 0.1081 | 3.0 | 54 | 0.0 | 0.0 | 0.0691 | 0.0225 | 0.0561 | 0.0276 | 0.0456 | 0.0394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0706 | 0.0244 | 0.0561 | 0.0291 | 0.0417 | 0.0370 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1073 |
+| 0.1034 | 3.7887 | 68 | 0.0 | 0.0 | 0.1412 | 0.0496 | 0.1131 | 0.0628 | 0.0789 | 0.0847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0838 | 0.0276 | 0.0671 | 0.0390 | 0.0549 | 0.0565 | 0.5956 | 0.3733 | 0.5956 | 0.5956 | 0.0809 | 0.4044 | 0.0506 | 0.4044 | 0.0270 | 0.4044 | 0.1050 |
 
 
 ### Framework versions
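The Precision At k / Recall At k numbers above are ranking metrics: for each example, the k highest-scoring labels are compared against the ground-truth label set. The evaluation code itself is not part of this commit, so the snippet below is only a minimal sketch of how such metrics are commonly computed for a multi-label model; all names in it are illustrative.

```python
import numpy as np

def precision_recall_at_k(y_true, y_scores, k):
    """Illustrative precision@k / recall@k for multi-label predictions.

    y_true:   (n_samples, n_labels) binary ground-truth matrix
    y_scores: (n_samples, n_labels) predicted scores or logits
    """
    # Indices of the k highest-scoring labels for each sample
    top_k = np.argsort(-y_scores, axis=1)[:, :k]
    # How many of those top-k labels are actually true
    hits = np.take_along_axis(y_true, top_k, axis=1).sum(axis=1)

    precision = hits / k                                # fraction of top-k that are correct
    recall = hits / np.maximum(y_true.sum(axis=1), 1)   # fraction of true labels recovered
    return precision.mean(), recall.mean()

# Toy usage on random data
y_true = (np.random.rand(4, 50) > 0.9).astype(int)
y_scores = np.random.rand(4, 50)
print(precision_recall_at_k(y_true, y_scores, k=5))
```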
eval_loss_plot.png CHANGED
eval_precision_at_15_plot.png CHANGED
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0643e8333623cfed408cad8650b0c17e49e2b09f853c0acadaa0fb60ddebc8aa
+oid sha256:3c6d5f892d7fc0071e730fc5e3fdc7d6acb1e0fc5dafd0b725cd748a3ade2068
 size 4475046623
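Only the Git LFS pointer for model.safetensors changed: the SHA-256 object id was updated while the payload size stayed at 4475046623 bytes. Since the oid is simply the SHA-256 of the file contents, a locally downloaded copy can be checked against the new pointer; a minimal sketch (the repo-local path `model.safetensors` is assumed):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream-hash a large file so it never has to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# After `git lfs pull`, the local weights should hash to the new pointer's oid.
expected = "3c6d5f892d7fc0071e730fc5e3fdc7d6acb1e0fc5dafd0b725cd748a3ade2068"
print(sha256_of("model.safetensors") == expected)
```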
train_loss_plot.png CHANGED
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:7df822b4c1ba52da21038ef28c4bd858cb6bcbcecfe6522552a4f1c829d6b5f8
+oid sha256:b02e684f463b01edfeb7ab51b4c7d2a9fd23d8d212075aabfec2d95cd917b320
 size 5496
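training_args.bin is likewise only a pointer update. Assuming the file was written by transformers.Trainer (which stores its TrainingArguments with torch.save, as is usual for this filename), it can be inspected locally as sketched below; the attribute names printed are standard TrainingArguments fields and may differ from what this run actually configured.

```python
import torch

# Sketch: load the pickled TrainingArguments saved by transformers.Trainer.
# weights_only=False is needed on recent PyTorch versions, where safe
# (tensor-only) loading is the default and would reject a pickled object.
args = torch.load("training_args.bin", weights_only=False)

print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)
```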