Create README.md
README.md
ADDED
---
license: apache-2.0
base_model:
- openai/gpt-oss-120b
language:
- zh
---
# GPT-OSS-ZhTW-Thinking

[Model on Hugging Face](https://huggingface.co/FreeSEED-AI/gpt-oss-zhtw-thinking)
[License: Apache 2.0](LICENSE)

A specialized language model optimized for thinking in Traditional Chinese (Taiwanese Mandarin).
14 |
+
|
15 |
+
## 🌟 Key Features
|
16 |
+
|
17 |
+
- **Native Taiwanese Mandarin Thinking**: Default reasoning and thinking patterns optimized for Traditional Chinese
|
18 |
+
- **Enhanced Cultural Understanding**: Deep comprehension of Taiwanese cultural contexts, idioms, and social nuances
|
19 |
+
- **GPT-based Architecture**: Standard GPT-OSS transformer architecture fine-tuned for zh-TW applications

## 📊 Model Specifications

- **Model Size**: 120B parameters
- **Architecture**: GPT-based MoE transformer
- **Training**: Fine-tuned for Traditional Chinese (zh-TW)
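
These specifications can be checked against the published model configuration. A minimal inspection sketch, assuming a recent `transformers` release with gpt-oss support and that the repository id is `FreeSEED-AI/gpt-oss-zhtw-thinking` (the repository linked above):

```python
from transformers import AutoConfig

# Fetch only config.json and print the architecture details
# (layer count, hidden size, number of MoE experts, etc.).
config = AutoConfig.from_pretrained("FreeSEED-AI/gpt-oss-zhtw-thinking")
print(config)
```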

## 🚀 Usage

The model can be served with [vLLM](https://x.com/MaziyarPanahi/status/1955741905515323425) or [SGLang](https://github.com/sgl-project/sglang/issues/8833) (see the linked notes on gpt-oss serving support). Both servers expose an OpenAI-compatible API that can be queried as sketched below.
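
A minimal client sketch, assuming the model is already being served locally behind an OpenAI-compatible endpoint (vLLM defaults to port 8000, SGLang to port 30000); the base URL, API key, and served model name below are assumptions to adapt to your deployment:

```python
from openai import OpenAI

# Point the client at the local OpenAI-compatible server started by
# vLLM (http://localhost:8000/v1) or SGLang (http://localhost:30000/v1).
# The API key is unused by a local server, but the client requires one.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="FreeSEED-AI/gpt-oss-zhtw-thinking",  # assumed served model name
    messages=[
        # "Please explain, in Traditional Chinese, what a mixture-of-experts (MoE) model is."
        {"role": "user", "content": "請用繁體中文解釋什麼是混合專家模型（MoE）。"}
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```

The same snippet works against either server; only `base_url` needs to change.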

## 📝 License

This model is released under the Apache 2.0 License.

## 🤝 Contributing

We welcome contributions and feedback! Please open an issue or submit a pull request if you have suggestions for improvements.

---

*Made with ❤️ by FreeSEED-AI*