Update README.md
README.md CHANGED

@@ -1,3 +1,9 @@
+---
+license: apache-2.0
+language:
+- en
+---
+
 # Pre-trained Conformer-CTC models for the librispeech dataset with icefall.
 The model was trained on full [LibriSpeech](http://openslr.org/12/) with the scripts in [icefall](https://github.com/k2-fsa/icefall).
 See (https://github.com/k2-fsa/icefall/pull/13) for more details of this model.
@@ -39,4 +45,4 @@ The best decoding results (WERs) on LibriSpeech test-clean and test-other are li
 
 ||test-clean|test-other|
 |--|--|--|
-|WER|2.57%|5.94%|
+|WER|2.57%|5.94%|
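The block this commit adds at the top of README.md (between the two `---` lines) is YAML front matter: the model hub reads it as structured card metadata (license, language), while most markdown renderers hide it. Below is a minimal sketch of reading that metadata back out, assuming a local copy of this README.md and the PyYAML package; both are assumptions for illustration, not part of the commit.

```python
# Minimal sketch: extract and parse the YAML front matter added in this commit.
# Assumes README.md is in the current directory and PyYAML is installed
# (pip install pyyaml) -- neither is specified by the commit itself.
import yaml

with open("README.md", encoding="utf-8") as f:
    text = f.read()

metadata = {}
if text.startswith("---"):
    # Front matter is the text between the first two "---" delimiters.
    _, header, _ = text.split("---", 2)
    metadata = yaml.safe_load(header)

print(metadata.get("license"))   # -> "apache-2.0"
print(metadata.get("language"))  # -> ["en"]
```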