mfajcik committed caa7831 (verified) · 1 Parent(s): 3e82eb4

Update README.md

Files changed (1): README.md (+8, −1)
README.md CHANGED
@@ -7,4 +7,11 @@ language:
 ---
 # Introduction
 CSTinyLlama-1.2B is a Czech language model continuously pretrained on 168B training tokens, starting from the English [TinyLLama-2.5T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T) model. The model was pretrained on the ~67B-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) using a Czech tokenizer obtained with our vocabulary swap method (see below).
-Training was done on the [Karolina](https://www.it4i.cz/en) cluster.
+Training was done on the [Karolina](https://www.it4i.cz/en) cluster.
+
+# Loss
+## Train Cross-Entropy
+<img src="figures/tllama_train.png" width="900"/>
+
+## Test Perplexity
+<img src="figures/tllama_test.png" width="900"/>