Commit 1aa0195 · fix links
Parent(s): 1565bfe
README.md CHANGED
@@ -243,10 +243,10 @@ TODO
 🚧 work in progress 🚧

 Training logs can be found in Tensorboard format in:
-* [`metadata/training_logs/`](metadata/training_logs)
+* [`metadata/training_logs/`](https://huggingface.co/OpenLLM-France/Lucie-7B/tree/main/metadata/training_logs)
 <br> ├── [`1_pretraining.zip`](metadata/training_logs/1_pretraining.zip) training logs for the first pre-training phases,
 in a zip file. Each file in the zip corresponds to a job of at most 20H of training (parallelized over 512 GPUs).
-<br> └── [`2_extension/`](metadata/training_logs/2_extension) folder containing the training log for the context extension phase, which was done in a single job of around 13H of training (parallelized over 128 GPUs).
+<br> └── [`2_extension/`](https://huggingface.co/OpenLLM-France/Lucie-7B/tree/main/metadata/training_logs/2_extension) folder containing the training log for the context extension phase, which was done in a single job of around 13H of training (parallelized over 128 GPUs).

 ## Acknowledgements

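The updated links point to TensorBoard event logs stored in the model repository itself. As a minimal sketch (not part of this commit), assuming the `huggingface_hub` Python client and a local TensorBoard installation, the zipped pre-training logs referenced above could be fetched and unpacked roughly like this:

```python
# Sketch: download and unpack the pre-training TensorBoard logs referenced in the README diff.
# Assumes `pip install huggingface_hub` and a local TensorBoard install; paths are taken from the diff above.
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Fetch the zipped pre-training logs from the Lucie-7B model repo.
zip_path = hf_hub_download(
    repo_id="OpenLLM-France/Lucie-7B",
    filename="metadata/training_logs/1_pretraining.zip",
)

# Extract the per-job event files into a local directory.
log_dir = Path("lucie_training_logs/1_pretraining")
log_dir.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(log_dir)

# The curves can then be browsed with:  tensorboard --logdir lucie_training_logs/1_pretraining
print(f"Logs extracted to {log_dir.resolve()}")
```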