</details>

## Creation

We created this model using **MoE-Quant**, a library developed jointly with **ISTA** and tailored for the quantization of very large Mixture-of-Experts (MoE) models.
For more details, please refer to the [MoE-Quant repository](https://github.com/IST-DASLab/MoE-Quant).
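As a rough intuition for what weight-only quantization of an MoE model involves, the sketch below shows round-to-nearest symmetric int4 quantization of a single expert's weight matrix with one scale per output row. This is an illustrative toy, not MoE-Quant's actual API or algorithm (MoE-Quant uses more sophisticated, calibration-based methods); the function names here are hypothetical.

```python
# Illustrative sketch only -- NOT MoE-Quant's API. Shows the core idea
# behind weight-only quantization of an expert weight matrix:
# symmetric int4 round-to-nearest with a per-row scale, then dequantize.
import numpy as np

def quantize_int4(w: np.ndarray):
    """Quantize a (rows, cols) float matrix to int4 with per-row scales."""
    qmax = 7  # symmetric int4 range, clipped to [-7, 7]
    scales = np.abs(w).max(axis=1, keepdims=True) / qmax
    scales = np.where(scales == 0, 1.0, scales)  # avoid divide-by-zero rows
    q = np.clip(np.round(w / scales), -qmax, qmax).astype(np.int8)
    return q, scales

def dequantize(q: np.ndarray, scales: np.ndarray) -> np.ndarray:
    """Map int4 codes back to float weights for (or during) inference."""
    return q.astype(np.float32) * scales

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)
q, s = quantize_int4(w)
w_hat = dequantize(q, s)
# Round-to-nearest error is at most half a quantization step per row.
assert np.all(np.abs(w - w_hat) <= s / 2 + 1e-6)
```

In a real MoE quantizer the same treatment is applied per expert, typically with calibration data to choose scales that minimize the layer's output error rather than the raw weight error.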
## Evaluation
The model was evaluated on popular reasoning tasks (AIME 2024, MATH-500, GPQA-Diamond) via [LightEval](https://github.com/huggingface/open-r1).
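For benchmarks like AIME 2024 and MATH-500, scoring typically reduces to extracting the final `\boxed{...}` answer from each generation and comparing it to the reference. The sketch below illustrates that style of exact-match metric; it is not LightEval's actual implementation (real harnesses normalize answers far more carefully, e.g. stripping formatting and comparing expressions symbolically), and the helper names are hypothetical.

```python
# Illustrative sketch, not LightEval's code: exact-match scoring of
# \boxed{...} answers, the style of metric used for AIME / MATH-500.
import re

def extract_boxed(text: str):
    """Return the contents of the last \\boxed{...} in a model response."""
    matches = re.findall(r"\\boxed\{([^{}]*)\}", text)
    return matches[-1].strip() if matches else None

def accuracy(responses, references):
    """Fraction of responses whose boxed answer exactly matches the reference."""
    correct = sum(
        extract_boxed(r) == ref for r, ref in zip(responses, references)
    )
    return correct / len(references)

responses = [
    "The answer is \\boxed{042}",   # counted wrong: no normalization here
    "So the result is \\boxed{113}.",
]
print(accuracy(responses, ["42", "113"]))  # 0.5
```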