Add relevant metadata, link to GitHub repository, and specify license.
#1 opened by nielsr (HF Staff)
README.md
CHANGED
@@ -1,3 +1,11 @@
+---
+library_name: transformers
+pipeline_tag: text-generation
+license: apache-2.0
+---
+
 This is a 800M parameters model pre-trained with [QuEST](https://arxiv.org/abs/2502.05003) over 80B C4 tokens in 2:4 sparse INT4 format.
 
-The code to verify that this model works in INT4 can be found [here](https://github.com/IST-DASLab/QuEST/blob/main/src/HadamardFourEightTesting.ipynb).
+The code to verify that this model works in INT4 can be found [here](https://github.com/IST-DASLab/QuEST/blob/main/src/HadamardFourEightTesting.ipynb).
+
+Github repository: https://github.com/IST-DASLab/QuEST
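The `library_name: transformers` and `pipeline_tag: text-generation` tags added above tell the Hub how the checkpoint is meant to be consumed. A minimal sketch of what that implies for users, assuming the checkpoint loads through the standard transformers causal-LM API (which the metadata suggests but this diff does not demonstrate); the repo id below is a placeholder, not taken from this PR:

```python
# Minimal usage sketch based on the metadata added in this PR
# (library_name: transformers, pipeline_tag: text-generation).
# The repo id is a hypothetical placeholder; the diff does not name it.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "IST-DASLab/quest-800m-int4-2of4"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation as a text-generation model.
inputs = tokenizer("Quantization-aware training", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```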