End of training
README.md CHANGED

@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss: 1.
-- eval_runtime:
-- eval_samples_per_second:
-- eval_steps_per_second:
+- eval_enwikippl: 321.9943
+- eval_frwikippl: 1758.7812
+- eval_zhwikippl: 13365.0029
+- eval_loss: 1.4163
+- eval_runtime: 38.2164
+- eval_samples_per_second: 52.334
+- eval_steps_per_second: 6.542
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:
+- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:cakld_loss()), activations_weight=0.2, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
 - train_embeddings: True
 - learning_rate: 4e-05
 - train_batch_size: 8
@@ -56,38 +56,38 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
-Peak GPU Memory:
+Peak GPU Memory: 10.3900 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
-| 0 | 0 | 54069.2930 | 57285.3438 | 6.
-| 1000 | 0.0404 |
-| 2000 | 0.0808 |
-| 3000 | 0.1212 |
-| 4000 | 0.1616 |
-| 5000 | 0.2020 |
-| 6000 | 0.2424 |
-| 7000 | 0.2828 |
-| 8000 | 0.3232 |
-| 9000 | 0.3636 |
-| 10000 | 0.4040 |
-| 11000 | 0.4444 |
-| 12000 | 0.4848 |
-| 13000 | 0.5253 |
-| 14000 | 0.5657 |
-| 15000 | 0.6061 |
-| 16000 | 0.6465 |
-| 17000 | 0.6869 |
-| 18000 | 0.7273 |
-| 19000 | 0.7677 |
-| 20000 | 0.8081 |
-| 21000 | 0.8485 |
-| 22000 | 0.8889 |
-| 23000 | 0.9293 |
-| 24000 | 0.9697 |
-| 24750 | 1.0 |
+| 0 | 0 | 54069.2930 | 57285.3438 | 6.3621 | 38.3646 | 52.131 | 6.516 | 54227.1016 |
+| 1000 | 0.0404 | 1067.6234 | 5608.6260 | 2.0789 | 38.0715 | 52.533 | 6.567 | 96472.8984 |
+| 2000 | 0.0808 | 757.3948 | 3725.6042 | 1.9128 | 38.2045 | 52.35 | 6.544 | 17909.9004 |
+| 3000 | 0.1212 | 658.3904 | 3416.6064 | 1.8048 | 38.242 | 52.299 | 6.537 | 7773.6802 |
+| 4000 | 0.1616 | 560.8882 | 3119.1289 | 1.7206 | 38.3142 | 52.2 | 6.525 | 2095.8083 |
+| 5000 | 0.2020 | 497.5148 | 2912.9402 | 1.6453 | 38.3234 | 52.187 | 6.523 | 1419.0811 |
+| 6000 | 0.2424 | 431.3407 | 2770.3191 | 1.5753 | 38.2865 | 52.238 | 6.53 | 1246.3398 |
+| 7000 | 0.2828 | 380.4417 | 2245.9678 | 1.5120 | 38.2419 | 52.299 | 6.537 | 2103.0974 |
+| 8000 | 0.3232 | 348.0757 | 1937.6904 | 1.4605 | 38.1726 | 52.394 | 6.549 | 1825.5139 |
+| 9000 | 0.3636 | 321.9943 | 1758.7812 | 1.4163 | 38.2164 | 52.334 | 6.542 | 13365.0029 |
+| 10000 | 0.4040 | 302.1302 | 1743.9637 | 1.3804 | 38.2909 | 52.232 | 6.529 | 1458.2706 |
+| 11000 | 0.4444 | 283.6677 | 1578.4840 | 1.3479 | 38.0626 | 52.545 | 6.568 | 2750.6077 |
+| 12000 | 0.4848 | 270.2290 | 1480.8057 | 1.3235 | 38.2617 | 52.272 | 6.534 | 1864.9377 |
+| 13000 | 0.5253 | 253.0666 | 1502.0465 | 1.2928 | 38.0543 | 52.556 | 6.57 | 5540.5454 |
+| 14000 | 0.5657 | 241.2089 | 1442.4714 | 1.2666 | 38.1426 | 52.435 | 6.554 | 1532.5410 |
+| 15000 | 0.6061 | 225.8018 | 1385.6504 | 1.2418 | 38.089 | 52.509 | 6.564 | 984.5054 |
+| 16000 | 0.6465 | 219.4576 | 1220.6696 | 1.2177 | 38.2186 | 52.33 | 6.541 | 876.7557 |
+| 17000 | 0.6869 | 208.0084 | 1206.6365 | 1.1978 | 38.223 | 52.324 | 6.541 | 1094.0415 |
+| 18000 | 0.7273 | 204.6121 | 1193.9426 | 1.1802 | 38.2118 | 52.34 | 6.542 | 1641.8646 |
+| 19000 | 0.7677 | 197.2790 | 1129.5767 | 1.1681 | 38.2359 | 52.307 | 6.538 | 693.8603 |
+| 20000 | 0.8081 | 191.9000 | 1025.8606 | 1.1505 | 38.008 | 52.621 | 6.578 | 992.1599 |
+| 21000 | 0.8485 | 190.8004 | 1014.3529 | 1.1373 | 38.0057 | 52.624 | 6.578 | 1086.7607 |
+| 22000 | 0.8889 | 185.3244 | 978.5265 | 1.1308 | 38.0856 | 52.513 | 6.564 | 728.6151 |
+| 23000 | 0.9293 | 185.4108 | 969.4617 | 1.1227 | 38.0654 | 52.541 | 6.568 | 623.9739 |
+| 24000 | 0.9697 | 178.5300 | 955.3481 | 1.1107 | 38.1566 | 52.416 | 6.552 | 568.7457 |
+| 24750 | 1.0 | 179.6843 | 993.8232 | 1.1067 | 38.1733 | 52.393 | 6.549 | 685.6628 |
 
 ### Framework versions
 - Distily 0.2.0
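For context on the `distillation_objective` recorded in the hyperparameters above: a `MultiObjective` of this shape computes a weighted sum of a logits-matching term (weight 1) and an activation-matching term (weight 0.2); the attention term has weight 0 and is inactive in this run. The sketch below is a minimal PyTorch illustration of that structure, not Distily's actual implementation; in particular, `cakld_loss` is stubbed here as a plain KL-divergence logits loss, which is an assumption.

```python
import torch.nn.functional as F

def kd_logits_loss(student_logits, teacher_logits, temperature=1.0):
    # Stand-in for fn:cakld_loss() -- assumed here to be a KL-style
    # logits-matching loss; Distily's real definition may differ.
    s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * temperature ** 2

def soft_mse_loss(student_feat, teacher_feat):
    # Stand-in for fn:soft_mse_loss(): MSE between student and teacher
    # hidden activations (assumes matching layer count and width).
    return F.mse_loss(student_feat, teacher_feat)

def multi_objective_loss(student_out, teacher_out,
                         logits_weight=1.0, activations_weight=0.2,
                         attentions_weight=0.0):
    # Weighted sum mirroring MultiObjective(logits_weight=1,
    # activations_weight=0.2, attentions_weight=0) from the model card.
    loss = logits_weight * kd_logits_loss(student_out.logits, teacher_out.logits)
    if activations_weight:
        hidden = list(zip(student_out.hidden_states, teacher_out.hidden_states))
        loss = loss + activations_weight * sum(
            soft_mse_loss(s, t) for s, t in hidden) / len(hidden)
    # attentions_weight == 0 in this run, so no attention term is added.
    return loss
```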
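The `enwikippl`, `frwikippl`, and `zhwikippl` columns report perplexity on English, French, and Chinese Wikipedia evaluation text (lower is better; the gpt2 teacher scores 30.21 / 57.27 / 18.18). Assuming the standard definition, perplexity is the exponential of the mean per-token negative log-likelihood. Here is a minimal sketch with `transformers`, not the exact evaluation code used for these numbers:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def perplexity(model, tokenizer, texts, device="cuda", max_length=1024):
    """exp(mean negative log-likelihood per token) over `texts`."""
    model.eval().to(device)
    nll_sum, n_tokens = 0.0, 0
    for text in texts:
        enc = tokenizer(text, return_tensors="pt",
                        truncation=True, max_length=max_length).to(device)
        with torch.no_grad():
            out = model(**enc, labels=enc["input_ids"])
        n = enc["input_ids"].numel() - 1  # loss is averaged over the shifted tokens
        nll_sum += out.loss.item() * n
        n_tokens += n
    return float(torch.exp(torch.tensor(nll_sum / n_tokens)))

# Usage, e.g. scoring the teacher on held-out English text (the dataset
# choice is illustrative, not necessarily the one behind enwikippl):
# tok = AutoTokenizer.from_pretrained("gpt2")
# teacher = AutoModelForCausalLM.from_pretrained("gpt2")
# print(perplexity(teacher, tok, ["some held-out Wikipedia text ..."]))
```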
logs/distillation_objective=MultiObjective(logits_weight_1__logits_loss_fn_(fn_cakld_loss())__activations_weight_0.2__activations_loss_fn_(fn_soft_mse_loss())__attentions_weight_0__attentions_loss_fn_(fn_s/events.out.tfevents.1723528003.93d6cbb3ad53 ADDED

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5464dff160ade15d2520075da40a1ea6d40dbcc6617635026708f5e390e50abc
+size 253
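The added file above is a Git LFS pointer (253 bytes), not the TensorBoard log itself: the `oid` line records the SHA-256 of the actual `events.out.tfevents.*` blob, which `git lfs pull` downloads in its place. Once fetched, the event file can be inspected with TensorBoard's event-processing API; in this sketch the path is abbreviated and the scalar tag name is a guess:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Point this at the downloaded tfevents file (full directory name abbreviated).
acc = EventAccumulator("logs/.../events.out.tfevents.1723528003.93d6cbb3ad53")
acc.Reload()
print(acc.Tags())  # lists the scalar/histogram tags actually present

# for event in acc.Scalars("eval_loss"):  # tag name is an assumption
#     print(event.step, event.value)
```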