Update README.md
README.md CHANGED
@@ -3,4 +3,6 @@ datasets:
 - cerebras/SlimPajama-627B
 language:
 - en
----
+---
+
+This is the trained 1.3-billion-parameter Llama-2-architecture model for the work [Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining](https://arxiv.org/pdf/2410.08102).
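For quick reference, a minimal loading sketch using the Hugging Face `transformers` library; the repository id below is a placeholder rather than this model's actual id, and the checkpoint is assumed to follow the standard Llama-2 causal-LM layout:

```python
# Minimal sketch for loading and sampling from the released 1.3B checkpoint.
# NOTE: the repo id is a placeholder -- substitute this repository's actual id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<this-model-repo-id>"  # placeholder; assumes standard Llama-2 format

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation to sanity-check the checkpoint.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```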