Philip May committed e3cbd6d (1 parent: 051fdfe): Update README.md

README.md CHANGED
@@ -64,6 +64,7 @@ This model is trained on the following datasets:
 |-------|--------|--------|--------|----------
 | [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) | 18.3607 | 5.3604 | 14.5456 | 16.1946
 | [deutsche-telekom/mT5-small-sum-de-en-01](https://huggingface.co/deutsche-telekom/mt5-small-sum-de-en-v1) | 21.7336 | 7.2614 | 17.1323 | 19.3977
+| T-Systems-onsite/mt5-small-sum-de-en-v2 (this) | xxx | xxx | xxx | xxx
 
 ## Evaluation on CNN Daily English Test Set (no beams)
 
@@ -73,7 +74,7 @@ This model is trained on the following datasets:
 | [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) | 28.5374 | 9.8565 | 19.4829 | 24.7364
 | [mrm8488/t5-base-finetuned-summarize-news](https://huggingface.co/mrm8488/t5-base-finetuned-summarize-news) | 37.576 | 14.7389 | 24.0254 | 34.4634
 | [deutsche-telekom/mT5-small-sum-de-en-01](https://huggingface.co/deutsche-telekom/mt5-small-sum-de-en-v1) | 37.6339 | 16.5317 | 27.1418 | 34.9951
-
+| T-Systems-onsite/mt5-small-sum-de-en-v2 (this) | xxx | xxx | xxx | xxx
 
 ## Evaluation on Extreme Summarization (XSum) English Test Set (no beams)
 
@@ -82,6 +83,7 @@ This model is trained on the following datasets:
 | [mrm8488/t5-base-finetuned-summarize-news](https://huggingface.co/mrm8488/t5-base-finetuned-summarize-news) | 18.6204 | 3.535 | 12.3997 | 15.2111
 | [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) | 28.5374 | 9.8565 | 19.4829 | 24.7364
 | [deutsche-telekom/mT5-small-sum-de-en-01](https://huggingface.co/deutsche-telekom/mt5-small-sum-de-en-v1) | 32.3416 | 10.6191 | 25.3799 | 25.3908
+| T-Systems-onsite/mt5-small-sum-de-en-v2 (this) | xxx | xxx | xxx | xxx
 | [sshleifer/distilbart-xsum-12-6](https://huggingface.co/sshleifer/distilbart-xsum-12-6) | 44.2553 ♣ | 21.4289 ♣ | 36.2639 ♣ | 36.2696 ♣
 
 ♣: These values seem to be unusually high. It could be that the test set was used in the training data.
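The figures in these tables are ROUGE scores, presumably ROUGE-1 / ROUGE-2 / ROUGE-L / ROUGE-Lsum F1 (the header row lies outside the diff context above, so the column labels are an assumption). As a rough illustration only, ROUGE-1 F1 can be sketched as clipped unigram overlap between a reference summary and a generated one; the published numbers come from the standard `rouge-score` implementation with stemming and proper tokenization, so this toy version will not reproduce them exactly:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Toy ROUGE-1 F1: clipped unigram overlap on whitespace tokens."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Each reference unigram is matched at most as often as it
    # appears in the candidate (clipping).
    overlap = sum(min(n, cand_counts[w]) for w, n in ref_counts.items())
    if overlap == 0:
        return 0.0
    recall = overlap / sum(ref_counts.values())
    precision = overlap / sum(cand_counts.values())
    return 2 * precision * recall / (precision + recall)

# Example: half of the unigrams overlap -> F1 = 0.5
print(rouge1_f1("a b c d", "a b x y"))
```

"No beams" in the section titles refers to greedy decoding, i.e. generation without beam search.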