lapp0 committed
Commit b9aa37b · verified · 1 Parent(s): 437295c

End of training

README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 211.0345
- - eval_frwikippl: 1207.8281
- - eval_zhwikippl: 585.1553
- - eval_loss: 1.2644
- - eval_runtime: 34.8133
- - eval_samples_per_second: 57.449
- - eval_steps_per_second: 7.181
+ - eval_enwikippl: 4719.9468
+ - eval_frwikippl: 32493.3301
+ - eval_zhwikippl: 69664.9141
+ - eval_loss: 0.0000
+ - eval_runtime: 33.0913
+ - eval_samples_per_second: 60.439
+ - eval_steps_per_second: 7.555
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.5, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
+ - distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:soft_mse_loss()), activations_weight=0.2, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
  - train_embeddings: True
  - learning_rate: 4e-05
  - train_batch_size: 8
@@ -62,32 +62,32 @@ Peak GPU Memory: 8.0873 GB
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
- | 0 | 0 | 54069.2930 | 57285.3438 | 5.9282 | 34.854 | 57.382 | 7.173 | 54227.1016 |
- | 1000 | 0.0404 | 715.9882 | 4680.4160 | 1.9682 | 34.7166 | 57.609 | 7.201 | 17073.8301 |
- | 2000 | 0.0808 | 511.1823 | 3222.4033 | 1.7803 | 35.3145 | 56.634 | 7.079 | 1840.1992 |
- | 3000 | 0.1212 | 423.7381 | 2758.2358 | 1.6673 | 34.6413 | 57.735 | 7.217 | 1143.0259 |
- | 4000 | 0.1616 | 369.2649 | 2376.2915 | 1.5791 | 34.703 | 57.632 | 7.204 | 849.1024 |
- | 5000 | 0.2020 | 318.7353 | 1859.0007 | 1.4983 | 34.7285 | 57.59 | 7.199 | 896.6478 |
- | 6000 | 0.2424 | 278.9278 | 1626.8433 | 1.4235 | 34.7261 | 57.594 | 7.199 | 1273.9360 |
- | 7000 | 0.2828 | 254.5844 | 1463.7820 | 1.3663 | 34.6733 | 57.681 | 7.21 | 1229.8070 |
- | 8000 | 0.3232 | 230.2103 | 1278.4543 | 1.3125 | 34.4883 | 57.991 | 7.249 | 966.0128 |
- | 9000 | 0.3636 | 211.0345 | 1207.8281 | 1.2644 | 34.8133 | 57.449 | 7.181 | 585.1553 |
- | 10000 | 0.4040 | 196.6672 | 1174.5717 | 1.2176 | 34.9053 | 57.298 | 7.162 | 530.0967 |
- | 11000 | 0.4444 | 177.6311 | 1018.0782 | 1.1662 | 34.8097 | 57.455 | 7.182 | 773.5347 |
- | 12000 | 0.4848 | 164.7556 | 925.2521 | 1.1256 | 34.6156 | 57.777 | 7.222 | 547.3607 |
- | 13000 | 0.5253 | 154.6037 | 854.5158 | 1.0956 | 34.4798 | 58.005 | 7.251 | 626.4785 |
- | 14000 | 0.5657 | 146.6518 | 793.8755 | 1.0654 | 34.5931 | 57.815 | 7.227 | 670.8995 |
- | 15000 | 0.6061 | 142.1114 | 773.8150 | 1.0480 | 34.5927 | 57.816 | 7.227 | 556.6484 |
- | 16000 | 0.6465 | 140.9245 | 710.7372 | 1.0307 | 34.5966 | 57.809 | 7.226 | 648.1790 |
- | 17000 | 0.6869 | 135.3474 | 722.6113 | 1.0181 | 34.5449 | 57.896 | 7.237 | 510.9829 |
- | 18000 | 0.7273 | 133.1789 | 697.9252 | 1.0046 | 34.4453 | 58.063 | 7.258 | 526.8502 |
- | 19000 | 0.7677 | 130.4562 | 684.9579 | 0.9941 | 34.4973 | 57.976 | 7.247 | 361.4319 |
- | 20000 | 0.8081 | 128.2863 | 676.7495 | 0.9875 | 34.5351 | 57.912 | 7.239 | 392.1580 |
- | 21000 | 0.8485 | 126.9386 | 645.8423 | 0.9779 | 34.5188 | 57.939 | 7.242 | 482.4038 |
- | 22000 | 0.8889 | 125.7417 | 615.3482 | 0.9695 | 34.6552 | 57.711 | 7.214 | 353.2721 |
- | 23000 | 0.9293 | 124.2566 | 641.0788 | 0.9649 | 34.5465 | 57.893 | 7.237 | 434.2212 |
- | 24000 | 0.9697 | 121.7920 | 623.3393 | 0.9582 | 34.5651 | 57.862 | 7.233 | 437.1886 |
- | 24750 | 1.0 | 120.7186 | 650.8248 | 0.9533 | 34.6549 | 57.712 | 7.214 | 447.5255 |
+ | 0 | 0 | 54069.2930 | 57285.3438 | 0.0002 | 32.966 | 60.669 | 7.584 | 54227.1016 |
+ | 1000 | 0.0404 | 55992.1211 | 52749.4258 | 0.0000 | 32.8638 | 60.857 | 7.607 | 51091.7461 |
+ | 2000 | 0.0808 | 54347.1094 | 49241.5898 | 0.0000 | 32.9025 | 60.786 | 7.598 | 46894.0781 |
+ | 3000 | 0.1212 | 60147.9336 | 63936.7148 | 0.0000 | 32.8967 | 60.796 | 7.6 | 51214.7266 |
+ | 4000 | 0.1616 | 42812.9375 | 51130.9336 | 0.0000 | 33.1378 | 60.354 | 7.544 | 50346.7188 |
+ | 5000 | 0.2020 | 17108.9082 | 60746.3047 | 0.0000 | 33.0827 | 60.455 | 7.557 | 148262.7188 |
+ | 6000 | 0.2424 | 9136.8545 | 49172.2344 | 0.0000 | 33.0068 | 60.594 | 7.574 | 84255.5703 |
+ | 7000 | 0.2828 | 7202.7583 | 43470.5586 | 0.0000 | 32.9219 | 60.75 | 7.594 | 73506.8906 |
+ | 8000 | 0.3232 | 5990.6094 | 41499.9180 | 0.0000 | 32.9592 | 60.681 | 7.585 | 66661.625 |
+ | 9000 | 0.3636 | 4719.9468 | 32493.3301 | 0.0000 | 33.0913 | 60.439 | 7.555 | 69664.9141 |
+ | 10000 | 0.4040 | 4432.2080 | 31948.1348 | 0.0000 | 33.0901 | 60.441 | 7.555 | 60131.8164 |
+ | 11000 | 0.4444 | 4204.5664 | 29618.2324 | 0.0000 | 32.9441 | 60.709 | 7.589 | 55858.7305 |
+ | 12000 | 0.4848 | 4062.7026 | 26250.4629 | 0.0000 | 33.1515 | 60.329 | 7.541 | 44848.4922 |
+ | 13000 | 0.5253 | 3902.2219 | 24349.8223 | 0.0000 | 33.2741 | 60.107 | 7.513 | 47638.8672 |
+ | 14000 | 0.5657 | 3806.1567 | 24284.6562 | 0.0000 | 32.8459 | 60.89 | 7.611 | 47715.3008 |
+ | 15000 | 0.6061 | 3608.1462 | 23331.3301 | 0.0000 | 32.815 | 60.948 | 7.618 | 48086.2812 |
+ | 16000 | 0.6465 | 3466.9666 | 20585.3301 | 0.0000 | 33.0344 | 60.543 | 7.568 | 43052.8281 |
+ | 17000 | 0.6869 | 3336.7512 | 20769.0137 | 0.0000 | 33.0359 | 60.54 | 7.568 | 34181.1719 |
+ | 18000 | 0.7273 | 3264.7285 | 21242.9316 | 0.0000 | 32.9309 | 60.733 | 7.592 | 37822.1914 |
+ | 19000 | 0.7677 | 3184.1082 | 20406.1504 | 0.0000 | 32.8054 | 60.966 | 7.621 | 38045.1094 |
+ | 20000 | 0.8081 | 3149.6814 | 19682.6367 | 0.0000 | 32.8725 | 60.841 | 7.605 | 40999.2188 |
+ | 21000 | 0.8485 | 3102.1042 | 19721.5312 | 0.0000 | 32.8951 | 60.799 | 7.6 | 40824.3867 |
+ | 22000 | 0.8889 | 3037.9785 | 19802.3457 | 0.0000 | 32.8317 | 60.917 | 7.615 | 36668.4219 |
+ | 23000 | 0.9293 | 3005.1289 | 18750.6465 | 0.0000 | 32.8704 | 60.845 | 7.606 | 36983.1289 |
+ | 24000 | 0.9697 | 2956.0625 | 18495.9297 | 0.0000 | 32.8962 | 60.797 | 7.6 | 31172.4219 |
+ | 24750 | 1.0 | 2937.7556 | 18456.8477 | 0.0000 | 33.023 | 60.564 | 7.57 | 36375.7812 |
 
  ### Framework versions
  - Distily 0.2.0
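
For readers comparing the two `distillation_objective` settings above: this commit swaps the logits loss from `kl_divergence_loss` to `soft_mse_loss` and lowers `activations_weight` from 0.5 to 0.2. A minimal sketch of what these two loss functions conventionally compute follows; the definitions are assumptions based on the standard formulations, not Distily's actual implementations, which may differ in temperature scaling, masking, and reduction.

```python
import torch
import torch.nn.functional as F

# Sketches of the two logits-distillation losses named in the objective.
# Assumed textbook definitions, NOT Distily's actual code.

def kl_divergence_loss(student_logits: torch.Tensor,
                       teacher_logits: torch.Tensor) -> torch.Tensor:
    # KL(teacher || student) over the vocabulary dimension,
    # averaged over the batch.
    return F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.log_softmax(teacher_logits, dim=-1),
        log_target=True,
        reduction="batchmean",
    )

def soft_mse_loss(student_logits: torch.Tensor,
                  teacher_logits: torch.Tensor) -> torch.Tensor:
    # MSE between the softened (softmax) distributions
    # rather than between raw logits.
    return F.mse_loss(
        F.softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
    )
```

Under these assumed definitions, `soft_mse_loss` compares probability vectors whose entries lie in [0, 1], so its raw values are orders of magnitude smaller than a KL divergence over the same logits; that would be consistent with the ~0.0000 losses in the new table versus the ~1-6 range in the old one, and it means the two runs' loss columns are not directly comparable.
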
logs/distillation_objective=MultiObjective(logits_weight_1__logits_loss_fn_(fn_soft_mse_loss())__activations_weight_0.2__activations_loss_fn_(fn_soft_mse_loss())__attentions_weight_0__attentions_loss_fn_(f/events.out.tfevents.1723509557.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:81df1bb11db3e8c40594ad371f7650d1361f12e2a5a0475805682abca3b43e99
+ size 253
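
A note on the `eval_enwikippl`, `eval_frwikippl`, and `eval_zhwikippl` metrics in the card above: these are perplexities, presumably on English, French, and Chinese Wikipedia text given the en/fr/zh prefixes. A minimal sketch of the standard computation, assuming perplexity is the exponential of the mean next-token cross-entropy (Distily's exact evaluation code may differ):

```python
import math
import torch
import torch.nn.functional as F

def perplexity(logits: torch.Tensor, labels: torch.Tensor) -> float:
    # Perplexity = exp(mean token-level cross-entropy).
    # logits: (batch, seq_len, vocab_size); labels: (batch, seq_len)
    nll = F.cross_entropy(
        logits[:, :-1].reshape(-1, logits.size(-1)),  # predict next token
        labels[:, 1:].reshape(-1),                    # shifted targets
    )
    return math.exp(nll.item())
```

For scale: the teacher's enwikippl of 30.2086 in the table corresponds to a mean cross-entropy of ln(30.2086) ≈ 3.41 nats per token.
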