Update README.md
README.md
tags:
- qwen2moe
- 2X32B Shared.
- shared expert
library_name: transformers
---

(uploading...)

<h2>Qwen2.5-2X32B-CoderInstruct-OlympicCoder-80B</h2>

This repo contains the full-precision source code, in "safetensors" format, to generate GGUF, GPTQ, EXL2, AWQ, HQQ and other formats. The source code can also be used directly.
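
For instance, a GGUF can be generated from the safetensors with llama.cpp's `convert_hf_to_gguf.py` script. A minimal sketch, assuming a local llama.cpp checkout and that this repo has been downloaded to a sibling folder (both paths and the output filename below are illustrative, not fixed by this repo):

```python
# Hedged sketch: drive llama.cpp's convert_hf_to_gguf.py from Python.
# Paths are placeholders; adjust to where llama.cpp and the model live.
import subprocess

model_dir = "../Qwen2.5-2X32B-CoderInstruct-OlympicCoder-80B"  # downloaded safetensors folder

subprocess.run(
    [
        "python",
        "convert_hf_to_gguf.py",
        model_dir,
        "--outfile", "qwen2.5-2x32b-coder-f16.gguf",  # illustrative output name
        "--outtype", "f16",  # keep 16-bit; quantize afterwards (e.g. with llama-quantize)
    ],
    cwd="llama.cpp",  # run from inside the llama.cpp checkout
    check=True,
)
```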

The monster coder in a MoE (Mixture of Experts) 2x32B (with shared expert) configuration.
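
To use the source code directly, a minimal transformers loading sketch follows. The qwen2moe architecture is supported natively by transformers; the short repo name below stands in for the full Hugging Face repo id, and the dtype/device settings are illustrative:

```python
# Minimal loading sketch with transformers (qwen2moe architecture).
# "Qwen2.5-2X32B-CoderInstruct-OlympicCoder-80B" is a placeholder for the full repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Qwen2.5-2X32B-CoderInstruct-OlympicCoder-80B"  # replace with the full repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # reduce memory for an 80B-parameter MoE
    device_map="auto",           # shard across available GPUs/CPU
)

prompt = "Write a Python function that returns the n-th Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```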

The two best Coders in one.

and/or

https://huggingface.co/open-r1/OlympicCoder-32B

More to come...