danielhanchen committed · verified · commit fe99879 · 1 parent: 4b2d92f

Upload folder using huggingface_hub
.gitattributes CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ thinking_budget.png filter=lfs diff=lfs merge=lfs -text
LICENSE.txt ADDED
@@ -0,0 +1,201 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [2025] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
MODEL_CARD.md ADDED
@@ -0,0 +1,52 @@
+ This model is released to foster global open research and empower developers.
+
+ ## Model Details
+ - Model Name: `Seed-OSS`
+ - Model Type/Structure: Causal language model
+ - Model Versions:
+   - `Seed-OSS-36B-Base`
+   - `Seed-OSS-36B-Base-woSyn`
+   - `Seed-OSS-36B-Instruct`
+ - Context Length: 512K (524,288)
+ - Developed by: ByteDance Seed Team
+ - Release Date: Aug 20, 2025
+ - License: Apache 2.0
+ - Contact: [[email protected]](mailto:[email protected])
+
+ ## Intended Use and Limitations
+ **Intended Uses**:
+ As a general-purpose model, Seed-OSS supports many use cases, including question answering, summarization, and reasoning. It takes text as input and generates text as output. The following are a few examples of potential use cases:
+ - Underlying technology for chatbots and conversational AI.
+ - Assistance with content creation (e.g., drafting, summarizing, editing).
+ - Auxiliary information retrieval in non-critical scenarios.
+
+ **Prohibited Uses**:
+ - **Professional medical, legal, or financial advice**.
+ - **Automated decision-making**: High-risk, high-impact automated decision-making, unless the model has been rigorously fine-tuned and evaluated by humans for that specific use case.
+ - **Minor Safety**: Abusing, exploiting, or harming a minor or individual under the age of consent, including grooming or child sexual exploitation.
+ - **Illegal Activities**: Violating any laws, including fraud, terrorism, or generating Child Sexual Abuse Material (CSAM).
+ - **Hate & Harassment**: Generating discriminatory, hateful, or harassing content.
+ - **Misinformation**: Engaging in disinformation, misinformation, or deceptive activities, including but not limited to passing off or representing AI-generated content as human-generated.
+ - **Military & Surveillance**: Any use for military, weaponry, intelligence-gathering, or mass-surveillance purposes.
+ - **High-Risk Harm**: Generating sexually explicit material or violating privacy (e.g., exposing PII).
+
+ **Limitations**:
+ - **Languages**: Seed-OSS is an **international (i18n)** model. It is primarily optimized for and evaluated in **English**. Performance in other languages is limited and not robustly tested.
+ - **Training Data**: Because the training data is predominantly from publicly available sources, the model's understanding of specific cultures, values, and historical events may be incomplete.
+ - **Hallucination**: Models generate content based on the statistical patterns in their training data, so the model may produce information that is incorrect or entirely fictional.
+ - **Harmful Content Generation**: Like any large language model, Seed-OSS may still produce outputs that are harmful, inaccurate, or offensive despite extensive safety training. We encourage developers to conduct their own testing, ensure human oversight, and deploy mitigation strategies for their specific use cases.
+
+ Developers who intend to build customized models or applications on top of this model are fully responsible for the customized model or application, including ensuring that it complies with all applicable laws.
+
+ Nothing contained in this Model Card should be interpreted as or deemed a restriction of, or modification to, the license the model is released under.
+
+ ## Content Safety Measures
+ We designed measures to ensure the model's content safety throughout the entire training cycle, from training data preparation to model training and evaluation. Prior to launch, Seed-OSS went through numerous safety and security reviews led by a global Safety and Security team based in Singapore. These include:
+ - **Training Data Filtering**: Data cleaning and content filtering mechanisms are designed and executed to ensure that no CSAM or highly toxic content is included in the training dataset. PII removal is also performed through a combination of algorithmic and manual checks.
+ - **Safety Fine-Tuning**: Safety training is performed during the Supervised Fine-Tuning (SFT) and Reinforcement Learning from Human Feedback (RLHF/PPO) stages to minimize the likelihood of harmful outputs.
+ - **Evaluation**: Regular safety testing and adversarial testing are conducted to identify and address safety vulnerabilities.
+
+ ## Training Data
+ Seed-OSS is trained on text data from multiple sources, including publicly available internet data, data purchased through partnerships with external vendors, and data generated by our in-house teams. The model is pre-trained on 12 trillion tokens and has a knowledge cutoff of 07/2024.
+
+ Multiple data-preprocessing measures have been taken, including deduplication, desensitization, quality filtering, CSAM filtering, and toxic content filtering.
README.md ADDED
@@ -0,0 +1,632 @@
+ ---
+ base_model:
+ - ByteDance-Seed/Seed-OSS-36B-Instruct
+ license: apache-2.0
+ pipeline_tag: text-generation
+ library_name: transformers
+ tags:
+ - vllm
+ - unsloth
+ ---
+
+ <div align="center">
+ 👋 Hi, everyone!
+ <br>
+ We are <b>ByteDance Seed Team.</b>
+ </div>
+
+ <p align="center">
+ You can get to know us better through the following channels👇
+ <br>
+ <a href="https://seed.bytedance.com/">
+ <img src="https://img.shields.io/badge/Website-%231e37ff?style=for-the-badge&logo=bytedance&logoColor=white"></a>
+ </p>
+
+ ![seed logo](https://github.com/user-attachments/assets/c42e675e-497c-4508-8bb9-093ad4d1f216)
+
+ # Seed-OSS Open-Source Models
+ <p align="center">
+ <a href="https://github.com/ByteDance-Seed/seed-oss">
+ <img src="https://img.shields.io/badge/Seed-Project Page-yellow"></a>
+ <a href="https://github.com/ByteDance-Seed/seed-oss">
+ <img src="https://img.shields.io/badge/Seed-Tech Report Coming Soon-red"></a>
+ <a href="https://huggingface.co/ByteDance-Seed">
+ <img src="https://img.shields.io/badge/Seed-Hugging Face-orange"></a>
+ <br>
+ <a href="./LICENSE">
+ <img src="https://img.shields.io/badge/License-Apache2.0-blue"></a>
+ </p>
+
+ > [!NOTE]
+ > This model card is dedicated to the `Seed-OSS-36B-Instruct` model.
+
+ ## News
+ - [2025/08/20] 🔥 We release `Seed-OSS-36B-Base` (versions both with and without synthetic data) and `Seed-OSS-36B-Instruct`.
+
+ ## Introduction
+ Seed-OSS is a series of open-source large language models developed by ByteDance's Seed Team, designed for strong long-context, reasoning, agentic, and general capabilities, along with versatile, developer-friendly features. Although trained with only 12T tokens, Seed-OSS achieves excellent performance on several popular open benchmarks.
+
+ We release this series of models to the open-source community under the Apache-2.0 license.
+
+ > [!NOTE]
+ > Seed-OSS is primarily optimized for international (i18n) use cases.
+
+ ### Key Features
+ - **Flexible Control of Thinking Budget**: Users can flexibly adjust the reasoning length as needed. This dynamic control of reasoning length improves inference efficiency in practical application scenarios.
+ - **Enhanced Reasoning Capability**: Specifically optimized for reasoning tasks while maintaining balanced and excellent general capabilities.
+ - **Agentic Intelligence**: Performs exceptionally well in agentic tasks such as tool use and issue resolution.
+ - **Research-Friendly**: Because including synthetic instruction data in pre-training may affect post-training research, we release pre-trained models both with and without instruction data, giving the research community more diverse options.
+ - **Native Long Context**: Trained natively with up to 512K context length.
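The thinking-budget idea above can be illustrated with a small conceptual sketch. This is not the official Seed-OSS decoding API: the `<think>`/`</think>` marker names and the forced-transition strategy are assumptions for illustration, and decoding is simulated with a toy next-token function.

```python
# Conceptual sketch of budget-limited reasoning, NOT the official Seed-OSS API.
# Reasoning tokens are counted while inside the (assumed) <think>...</think>
# span; once the budget is spent, the reasoning phase is forcibly closed so
# the model moves on to the final answer.
def generate_with_budget(next_token, thinking_budget, max_new_tokens=32):
    out = ["<think>"]       # assume generation starts in the reasoning phase
    thinking = True
    spent = 0
    while len(out) < max_new_tokens:
        if thinking and spent >= thinking_budget:
            out.append("</think>")  # budget exhausted: force end of reasoning
            thinking = False
            continue
        tok = next_token(out)
        out.append(tok)
        if tok == "</think>":
            thinking = False        # model ended reasoning on its own
        elif thinking:
            spent += 1              # only reasoning tokens count against budget
        if tok == "<eos>":
            break
    return out
```

With a budget of 3, for example, at most three tokens are produced inside the reasoning span before `</think>` is forced, regardless of what the (toy) model would otherwise emit.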
61
+
62
+ ### Model Summary
63
+
64
+ Seed-OSS adopts the popular causal language model architecture with RoPE, GQA attention, RMSNorm and SwiGLU activation.
65
+
66
+ <div align="center">
67
+
68
+ | | |
69
+ |:---:|:---:|
70
+ | | **Seed-OSS-36B** |
71
+ | **Parameters** | 36B |
72
+ | **Attention** | GQA |
73
+ | **Activation Function** | SwiGLU |
74
+ | **Number of Layers** | 64 |
75
+ | **Number of QKV Heads** | 80 / 8 / 8 |
76
+ | **Head Size** | 128 |
77
+ | **Hidden Size** | 5120 |
78
+ | **Vocabulary Size** | 155K |
79
+ | **Context Length** | 512K |
80
+ | **RoPE Base Frequency** | 1e7 |
81
+
82
+ </div>
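One practical consequence of the GQA configuration in the table (8 KV heads of size 128 across 64 layers) is a relatively compact KV cache at long context. A back-of-the-envelope estimate, assuming bf16 (2-byte) cache entries and no quantization or offloading:

```python
# KV-cache size estimate from the table above; bf16 (2 bytes per value) and
# a dense, non-quantized cache are assumptions.
layers, kv_heads, head_dim, bytes_per_val = 64, 8, 128, 2

# Each token stores one key and one value vector per layer per KV head.
per_token = 2 * layers * kv_heads * head_dim * bytes_per_val  # bytes per token

context = 512 * 1024                      # 512K tokens (524,288)
total_gib = per_token * context / 2**30

print(per_token)   # 262144 bytes, i.e. 256 KiB per token
print(total_gib)   # 128.0 GiB for the full 512K context
```

With 80 query heads but only 8 KV heads, GQA shrinks this cache roughly 10x relative to full multi-head attention at the same head size.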
+
+ ## Evaluation Results
+
+ ### Seed-OSS-36B-Base
+
+ Incorporating synthetic instruction data into pretraining leads to improved performance on most benchmarks. We adopt the version augmented with synthetic instruction data (i.e., *w/ syn.*) as `Seed-OSS-36B-Base`. We also release `Seed-OSS-36B-Base-woSyn`, trained without such data (i.e., *w/o syn.*), offering the community a high-performance foundation model unaffected by synthetic instruction data.
+
+ <div align="center">
+ <table>
+ <thead>
+ <tr>
+ <th align="center">Benchmark</th>
+ <th align="center"><sup><a href="https://seed.bytedance.com/en/seed1_6">Seed1.6-Base</a></sup></th>
+ <th align="center"><sup>Qwen3-30B-A3B-Base-2507*</sup></th>
+ <th align="center"><sup>Qwen2.5-32B-Base*</sup></th>
+ <th align="center"><sup>Seed-OSS-36B-Base<br>(<i>w/ syn.</i>)</sup></th>
+ <th align="center"><sup>Seed-OSS-36B-Base-woSyn<br>(<i>w/o syn.</i>)</sup></th>
+ </tr>
+ </thead>
+ <tbody>
+ <tr><td align="center" colspan=6><strong>Knowledge</strong></td></tr>
+ <tr><td align="center">MMLU-Pro</td><td align="center">70</td><td align="center">59.8</td><td align="center">58.5 (55.1)</td><td align="center"><b>65.1</b></td><td align="center">60.4</td></tr>
+ <tr><td align="center">MMLU</td><td align="center">88.8</td><td align="center">82.7</td><td align="center">84 (83.3)</td><td align="center"><b>84.9</b></td><td align="center">84.8</td></tr>
+ <tr><td align="center">TriviaQA</td><td align="center">91</td><td align="center">76.2</td><td align="center">76</td><td align="center"><b>82.1</b></td><td align="center">81.9</td></tr>
+ <tr><td align="center">GPQA-D</td><td align="center">43.4</td><td align="center"><b>37</b></td><td align="center">29.3</td><td align="center">31.7</td><td align="center">35.2</td></tr>
+ <tr><td align="center">SimpleQA</td><td align="center">17.1</td><td align="center">7.2</td><td align="center">6.1</td><td align="center">5.8</td><td align="center"><b>7.4</b></td></tr>
+ <tr><td align="center" colspan=6><strong>Reasoning</strong></td></tr>
+ <tr><td align="center">BBH</td><td align="center">92.1</td><td align="center">81.4</td><td align="center">79.1 (84.5)</td><td align="center"><b>87.7</b></td><td align="center">87.2</td></tr>
+ <tr><td align="center">AGIEval-en</td><td align="center">78</td><td align="center">66.4</td><td align="center">65.6</td><td align="center"><b>70.7</b></td><td align="center">70.1</td></tr>
+ <tr><td align="center" colspan=6><strong>Math</strong></td></tr>
+ <tr><td align="center">GSM8K</td><td align="center">93.1</td><td align="center">87</td><td align="center">87.5 (92.9)</td><td align="center"><b>90.8</b></td><td align="center">90.3</td></tr>
+ <tr><td align="center">MATH</td><td align="center">72.9</td><td align="center">61.1</td><td align="center">63.5 (57.7)</td><td align="center"><b>81.7</b></td><td align="center">61.3</td></tr>
+ <tr><td align="center" colspan=6><strong>Coding</strong></td></tr>
+ <tr><td align="center">MBPP</td><td align="center">83.6</td><td align="center">78.8</td><td align="center">77.8 (84.5)</td><td align="center"><b>80.6</b></td><td align="center">74.6</td></tr>
+ <tr><td align="center">HumanEval</td><td align="center">78</td><td align="center">70.7</td><td align="center">47.6 (58.5)</td><td align="center"><b>76.8</b></td><td align="center">75.6</td></tr>
+ </tbody>
+ </table>
+ </div>
+
+ <sup>
+ - <b>Bold</b> denotes open-source SOTA.
+ </sup><br/><sup>
+ - "*" indicates that the results in this column are presented in the format of "reproduced_results (reported_results_if_any)".
+ </sup>
+
+ ### Seed-OSS-36B-Instruct
+
+ <div align="center">
+ <table>
+ <thead>
+ <tr>
+ <th align="center">Benchmark</th>
+ <th align="center"><sup><a href="https://console.volcengine.com/ark/region:ark+cn-beijing/model/detail?Id=doubao-seed-1-6-thinking">Seed1.6-Thinking-0715</a></sup></th>
+ <th align="center"><sup>OAI-OSS-20B*</sup></th>
+ <th align="center"><sup>Qwen3-30B-A3B-Thinking-2507*</sup></th>
+ <th align="center"><sup>Qwen3-32B*</sup></th>
+ <th align="center"><sup>Gemma3-27B</sup></th>
+ <th align="center"><sup>Seed-OSS-36B-Instruct</sup></th>
+ </tr>
+ </thead>
+ <tbody>
+ <tr><td align="center" colspan=7><strong>Knowledge</strong></td></tr>
+ <tr><td align="center">MMLU-Pro</td><td align="center">86.6</td><td align="center">76.2</td><td align="center"><ins>81.9</ins> (80.9)</td><td align="center">81.8</td><td align="center">67.5</td><td align="center"><b>82.7</b></td></tr>
+ <tr><td align="center">MMLU</td><td align="center">90.6</td><td align="center">81.7 (85.3)</td><td align="center"><ins>86.9</ins></td><td align="center">86.2</td><td align="center">76.9</td><td align="center"><b>87.4</b></td></tr>
+ <tr><td align="center">GPQA-D</td><td align="center">80.7</td><td align="center"><b>72.2</b> (71.5)</td><td align="center"><ins>71.4</ins> (73.4)</td><td align="center">66.7 (68.4)</td><td align="center">42.4</td><td align="center"><ins>71.4</ins></td></tr>
+ <tr><td align="center">SuperGPQA</td><td align="center">63.4</td><td align="center">50.1</td><td align="center"><b>57.3</b> (56.8)</td><td align="center">49.3</td><td align="center">-</td><td align="center"><ins>55.7</ins></td></tr>
+ <tr><td align="center">SimpleQA</td><td align="center">23.7</td><td align="center">6.7</td><td align="center"><b>23.6</b></td><td align="center">8.6</td><td align="center"><ins>10</ins></td><td align="center">9.7</td></tr>
+ <tr><td align="center" colspan=7><strong>Math</strong></td></tr>
+ <tr><td align="center">AIME24</td><td align="center">90.3</td><td align="center"><b>92.7</b> (92.1)</td><td align="center">87.7</td><td align="center">82.7 (81.4)</td><td align="center">-</td><td align="center"><ins>91.7</ins></td></tr>
+ <tr><td align="center">AIME25</td><td align="center">86</td><td align="center"><b>90.3</b> (91.7)</td><td align="center">81.3 (85)</td><td align="center">73.3 (72.9)</td><td align="center">-</td><td align="center"><ins>84.7</ins></td></tr>
+ <tr><td align="center">BeyondAIME</td><td align="center">60</td><td align="center"><b>69</b></td><td align="center">56</td><td align="center">29</td><td align="center">-</td><td align="center"><ins>65</ins></td></tr>
+ <tr><td align="center" colspan=7><strong>Reasoning</strong></td></tr>
+ <tr><td align="center">ArcAGI V2</td><td align="center">50.3</td><td align="center"><b>41.7</b></td><td align="center">37.8</td><td align="center">14.4</td><td align="center">-</td><td align="center"><ins>40.6</ins></td></tr>
+ <tr><td align="center">KORBench</td><td align="center">74.8</td><td align="center"><b>72.3</b></td><td align="center">70.2</td><td align="center">65.4</td><td align="center">-</td><td align="center"><ins>70.6</ins></td></tr>
+ <tr><td align="center">HLE</td><td align="center">13.9</td><td align="center"><b>12.7</b> (10.9)</td><td align="center">8.7</td><td align="center">6.9</td><td align="center">-</td><td align="center"><ins>10.1</ins></td></tr>
+ <tr><td align="center" colspan=7><strong>Coding</strong></td></tr>
+ <tr><td align="center">LiveCodeBench v6<br/><sup>(02/2025-05/2025)</sup></td><td align="center">66.8</td><td align="center"><ins>63.8</ins></td><td align="center">60.3 (66)</td><td align="center">53.4</td><td align="center">-</td><td align="center"><b>67.4</b></td></tr>
+ <tr><td align="center" colspan=7><strong>Instruction Following</strong></td></tr>
+ <tr><td align="center">IFEval</td><td align="center">86.3</td><td align="center"><b>92.8</b></td><td align="center">88 (88.9)</td><td align="center">88.4 (85)</td><td align="center"><ins>90.4</ins></td><td align="center">85.8</td></tr>
+ <tr><td align="center" colspan=7><strong>Agent</strong></td></tr>
+ <tr><td align="center">TAU1-Retail</td><td align="center">63</td><td align="center">(54.8)</td><td align="center"><ins>58.7</ins> (67.8)</td><td align="center">40.9</td><td align="center">-</td><td align="center"><b>70.4</b></td></tr>
+ <tr><td align="center">TAU1-Airline</td><td align="center">49</td><td align="center">(38)</td><td align="center"><b>47</b> (48)</td><td align="center">38</td><td align="center">-</td><td align="center"><ins>46</ins></td></tr>
+ <tr><td align="center">SWE-Bench Verified<br/><sup>(OpenHands)</sup></td><td align="center">41.8</td><td align="center"><b>(60.7)</b></td><td align="center">31</td><td align="center">23.4</td><td align="center">-</td><td align="center"><ins>56</ins></td></tr>
+ <tr><td align="center">SWE-Bench Verified<br/><sup>(AgentLess 4*10)</sup></td><td align="center">48.4</td><td align="center">-</td><td align="center">33.5</td><td align="center"><ins>39.7</ins></td><td align="center">-</td><td align="center"><b>47</b></td></tr>
+ <tr><td align="center">Multi-SWE-Bench</td><td align="center">17.7</td><td align="center">-</td><td align="center"><ins>9.5</ins></td><td align="center">7.7</td><td align="center">-</td><td align="center"><b>17</b></td></tr>
+ <tr><td align="center" colspan=7><strong>Multilingualism</strong></td></tr>
+ <tr><td align="center">MMMLU</td><td align="center">84.3</td><td align="center">77.4 (75.7)</td><td align="center"><b>79</b></td><td align="center"><b>79</b> (80.6)</td><td align="center">-</td><td align="center"><ins>78.4</ins></td></tr>
+ <tr><td align="center" colspan=7><strong>Long Context</strong></td></tr>
+ <tr><td align="center">RULER<br/><sup>(128K)</sup></td><td align="center">94.5</td>
439
+ <td align="center">78.7</td>
440
+ <td align="center"><ins>94.5</ins></td>
441
+ <td align="center">77.5</td>
442
+ <td align="center">-</td>
443
+ <td align="center"><b>94.6</b></td>
444
+ </tr>
445
+
446
+ <tr>
447
+ <td align="center" colspan=7><strong>Safety</strong></td>
448
+ </tr>
449
+ <tr>
450
+ <td align="center">AIR-Bench</td>
451
+ <td align="center">-</td>
452
+ <td align="center">-</td>
453
+ <td align="center">-</td>
454
+ <td align="center">-</td>
455
+ <td align="center">-</td>
456
+ <td align="center">75.6</td>
457
+ </tr>
458
+ </tbody>
459
+ </table>
460
+ </div>
461
+
462
+ <sup>
463
+ - <b>Bold</b> denotes the open-source SOTA. <ins>Underlined</ins> indicates the second-best open-source result.
464
+ </sup><br/><sup>
465
+ - "*" indicates that results in this column follow the format "reproduced_results (reported_results_if_any)". Some results are omitted because the evaluation run failed.
466
+ </sup><br/><sup>
467
+ - The results of Gemma3-27B are sourced directly from its technical report.
468
+ </sup><br/><sup>
469
+ - Generation configs for Seed-OSS-36B-Instruct: temperature=1.1, top_p=0.95. For TauBench specifically: temperature=1, top_p=0.7.
470
+ </sup>
472
+
473
+ > [!NOTE]
474
+ > We recommend sampling with `temperature=1.1` and `top_p=0.95`.
475
+
476
+ ### Thinking Budget
477
+
478
+ Users can flexibly specify the model's thinking budget. The figure below shows the performance curves across different tasks as the thinking budget varies. For simpler tasks (such as IFEval), the model's chain of thought (CoT) is shorter, and the score exhibits fluctuations as the thinking budget increases. For more challenging tasks (such as AIME and LiveCodeBench), the model's CoT is longer, and the score improves with an increase in the thinking budget.
479
+
480
+ ![thinking_budget](./thinking_budget.png)
481
+
482
+ Here is an example with a thinking budget set to 512: during the reasoning process, the model periodically triggers self-reflection to estimate the consumed and remaining budget, and delivers the final response once the budget is exhausted or the reasoning concludes.
483
+ ```
484
+ <seed:think>
485
+ Got it, let's try to solve this problem step by step. The problem says ... ...
486
+ <seed:cot_budget_reflect>I have used 129 tokens, and there are 383 tokens remaining for use.</seed:cot_budget_reflect>
487
+ Using the power rule, ... ...
488
+ <seed:cot_budget_reflect>I have used 258 tokens, and there are 254 tokens remaining for use.</seed:cot_budget_reflect>
489
+ Alternatively, remember that ... ...
490
+ <seed:cot_budget_reflect>I have used 393 tokens, and there are 119 tokens remaining for use.</seed:cot_budget_reflect>
491
+ Because if ... ...
492
+ <seed:cot_budget_reflect>I have exhausted my token budget, and now I will start answering the question.</seed:cot_budget_reflect>
493
+ </seed:think>
494
+ To solve the problem, we start by using the properties of logarithms to simplify the given equations: (full answer omitted).
495
+ ```
496
+
497
+ If no thinking budget is set (the default), Seed-OSS thinks with unlimited length. If a budget is specified, we advise using integer multiples of 512 (e.g., 512, 1K, 2K, 4K, 8K, or 16K), as the model has been extensively trained on these intervals. When the thinking budget is 0, the model is instructed to respond directly, and we recommend rounding any budget below 512 down to 0.
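The rounding rule above can be sketched as a small helper (hypothetical, not part of the official scripts): snap a requested budget to -1, 0, or the nearest trained interval.

```python
# Hypothetical helper (not part of the repo): snap a requested thinking
# budget to the values the model was extensively trained on.
TRAINED_BUDGETS = [512, 1024, 2048, 4096, 8192, 16384]

def normalize_thinking_budget(requested: int) -> int:
    """Return -1 (unlimited), 0 (direct response), or the closest trained budget."""
    if requested < 0:
        return -1  # default mode: think with unlimited length
    if requested < 512:
        return 0   # budgets below 512 are rounded down to a direct response
    return min(TRAINED_BUDGETS, key=lambda b: abs(b - requested))
```

For example, `normalize_thinking_budget(600)` picks 512, the nearest trained interval.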
498
+
499
+ ## Quick Start
500
+ ```shell
501
+ pip3 install -r requirements.txt
502
+ pip install git+ssh://[email protected]/Fazziekey/transformers.git@seed-oss
503
+ ```
504
+
505
+ ```python
506
+ from transformers import AutoModelForCausalLM, AutoTokenizer
507
+ import os
508
+ import re
509
+
510
+ model_name_or_path = "ByteDance-Seed/Seed-OSS-36B-Instruct"
511
+
512
+ tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
513
+ model = AutoModelForCausalLM.from_pretrained(model_name_or_path, torch_dtype="bfloat16", device_map="auto")  # load in bfloat16 and place on available GPUs
514
+ messages = [
515
+ {"role": "user", "content": "How to make pasta?"},
516
+ ]
517
+ tokenized_chat = tokenizer.apply_chat_template(
518
+ messages,
519
+ tokenize=True,
520
+ add_generation_prompt=True,
521
+ return_tensors="pt",
522
+ thinking_budget=512 # control the thinking budget
523
+ )
524
+
525
+ outputs = model.generate(tokenized_chat.to(model.device), max_new_tokens=2048)
526
+
527
+ output_text = tokenizer.decode(outputs[0])
528
+ ```
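The decoded text interleaves the reasoning trace and the final answer. A minimal sketch for splitting them, assuming the `<seed:think>` tags shown in the thinking-budget example above:

```python
import re

def split_reasoning(output_text: str) -> tuple[str, str]:
    """Split a decoded Seed-OSS response into (reasoning_trace, final_answer)."""
    match = re.search(r"<seed:think>(.*?)</seed:think>", output_text, re.DOTALL)
    reasoning = match.group(1).strip() if match else ""
    # Everything outside the think tags is treated as the answer.
    answer = re.sub(r"<seed:think>.*?</seed:think>", "", output_text, flags=re.DOTALL).strip()
    return reasoning, answer

reasoning, answer = split_reasoning(
    "<seed:think>Recall the steps for pasta.</seed:think>\nBoil salted water, then..."
)
```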
529
+
530
+ ## Inference
531
+
532
+ ### Download Model
533
+
534
+ Download Seed-OSS checkpoint to `./Seed-OSS-36B-Instruct`
535
+
536
+ ### Transformers
537
+ The `generate.py` script provides a simple interface for model inference with configurable options.
538
+
539
+ #### Basic Usage
540
+ ```shell
541
+ cd inference
542
+ python3 generate.py --model_path /path/to/model
543
+ ```
544
+
545
+ #### Key Parameters
546
+ | Parameter | Description |
547
+ |-----------|-------------|
548
+ | `--model_path` | Path to the pretrained model directory (required) |
549
+ | `--prompts` | Input prompts (default: sample cooking/code questions) |
550
+ | `--max_new_tokens` | Maximum tokens to generate (default: 4096) |
551
+ | `--attn_implementation` | Attention mechanism: `flash_attention_2` (default) or `eager` |
552
+ | `--load_in_4bit/8bit` | Enable 4-bit/8-bit quantization (reduces memory usage) |
553
+ | `--thinking_budget` | Thinking budget in tokens (default: -1, unlimited) |
554
+
555
+ #### Quantization Examples
556
+ ```shell
557
+ # 8-bit quantization
558
+ python3 generate.py --model_path /path/to/model --load_in_8bit True
559
+
560
+ # 4-bit quantization
561
+ python3 generate.py --model_path /path/to/model --load_in_4bit True
562
+ ```
563
+
564
+ #### Custom Prompts
565
+ ```shell
566
+ python3 generate.py --model_path /path/to/model --prompts "['What is machine learning?', 'Explain quantum computing']"
567
+ ```
568
+
569
+ ### vLLM
570
+ Use vLLM >= 0.10.0 for inference.
571
+
572
+ - First, install the vLLM build with Seed-OSS support:
573
+ ```shell
574
+ VLLM_USE_PRECOMPILED=1 VLLM_TEST_USE_PRECOMPILED_NIGHTLY_WHEEL=1 pip install git+ssh://[email protected]/FoolPlayer/vllm.git@seed-oss
575
+ ```
576
+
577
+ - Start vLLM API server:
578
+ ```shell
579
+ python3 -m vllm.entrypoints.openai.api_server \
580
+ --host localhost \
581
+ --port 4321 \
582
+ --enable-auto-tool-choice \
583
+ --tool-call-parser seed_oss \
584
+ --trust-remote-code \
585
+ --model ./Seed-OSS-36B-Instruct \
586
+ --chat-template ./Seed-OSS-36B-Instruct/chat_template.jinja \
587
+ --tensor-parallel-size 8 \
588
+ --dtype bfloat16 \
589
+ --served-model-name seed_oss
590
+ ```
591
+
592
+ - Test with OpenAI client:
593
+
594
+ Chat
595
+
596
+ ```shell
597
+ # no stream
598
+ python3 inference/vllm_chat.py --max_new_tokens 4096 --thinking_budget -1
599
+ # stream
600
+ python3 inference/vllm_chat.py --max_new_tokens 4096 --thinking_budget -1 --stream
601
+ ```
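Under the hood, the chat script presumably builds an OpenAI-compatible request against the server started above. A hedged sketch of the request payload; the `chat_template_kwargs` route for passing `thinking_budget` is an assumption, and the bundled scripts may differ:

```python
def build_chat_request(prompt: str, thinking_budget: int = -1,
                       max_new_tokens: int = 4096) -> dict:
    """Assemble the JSON body for POST /v1/chat/completions on the vLLM server."""
    return {
        "model": "seed_oss",  # matches --served-model-name above
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_new_tokens,
        # Assumption: vLLM forwards chat_template_kwargs to the chat template,
        # where the Jinja logic reads thinking_budget.
        "chat_template_kwargs": {"thinking_budget": thinking_budget},
    }
```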
602
+
603
+ Tool Call
604
+ ```shell
605
+ # no stream
606
+ python3 inference/vllm_tool_call.py --max_new_tokens 4096 --thinking_budget -1
607
+ # stream
608
+ python3 inference/vllm_tool_call.py --max_new_tokens 4096 --thinking_budget -1 --stream
609
+ ```
610
+
611
+
612
+ ## Model Card
613
+ See [MODEL_CARD](./MODEL_CARD.md).
614
+
615
+ ## License
616
+ This project is licensed under Apache-2.0. See the [LICENSE](./LICENSE) file for details.
617
+
618
+ ## Citation
619
+
620
+ ```bibtex
621
+ @misc{seed2025seed-oss,
622
+ author={ByteDance Seed Team},
623
+ title={Seed-OSS Open-Source Models},
624
+ year={2025},
625
+ howpublished={\url{https://github.com/ByteDance-Seed/seed-oss}}
626
+ }
627
+ ```
628
+
629
+ ## About [ByteDance Seed Team](https://seed.bytedance.com/)
630
+
631
+ Founded in 2023, ByteDance Seed Team is dedicated to crafting the industry's most advanced AI foundation models. The team aspires to become a world-class research team and make significant contributions to the advancement of science and society.
632
+
chat_template.jinja ADDED
@@ -0,0 +1,173 @@
1
+ {# Unsloth Chat template fixes #}
2
+ {# ---------- special token variables ---------- #}
3
+ {%- set bos_token = '<seed:bos>' -%}
4
+ {%- set eos_token = '<seed:eos>' -%}
5
+ {%- set pad_token = '<seed:pad>' -%}
6
+ {%- set toolcall_begin_token = '<seed:tool_call>' -%}
7
+ {%- set toolcall_end_token = '</seed:tool_call>' -%}
8
+ {%- set think_begin_token = '<seed:think>' -%}
9
+ {%- set think_end_token = '</seed:think>' -%}
10
+ {%- set budget_begin_token = '<seed:cot_budget_reflect>'-%}
11
+ {%- set budget_end_token = '</seed:cot_budget_reflect>'-%}
12
+ {# -------------- reflection-interval lookup -------------- #}
13
+ {%- if not thinking_budget is defined %}
14
+ {%- set thinking_budget = -1 -%}
15
+ {%- endif -%}
16
+ {%- set budget_reflections_v05 = {
17
+ 0: 0,
18
+ 512: 128,
19
+ 1024: 256,
20
+ 2048: 512,
21
+ 4096: 512,
22
+ 8192: 1024,
23
+ 16384: 1024
24
+ } -%}
25
+ {# Find the first tier that is >= thinking_budget #}
26
+ {%- set ns = namespace(interval = None) -%}
27
+ {%- for k, v in budget_reflections_v05 | dictsort -%}
28
+ {%- if ns.interval is none and thinking_budget <= k -%}
29
+ {%- set ns.interval = v -%}
30
+ {%- endif -%}
31
+ {%- endfor -%}
32
+ {# If it exceeds the largest tier, use the last tier's value #}
33
+ {%- if ns.interval is none -%}
34
+ {%- set ns.interval = budget_reflections_v05[16384] -%}
35
+ {%- endif -%}
36
+ {# ---------- Preprocess the system message ---------- #}
37
+ {%- if messages[0]["role"] == "system" %}
38
+ {%- set system_message = messages[0]["content"] %}
39
+ {%- set loop_messages = messages[1:] %}
40
+ {%- else %}
41
+ {%- set loop_messages = messages %}
42
+ {%- endif %}
43
+ {# ---------- Ensure tools is defined ---------- #}
44
+ {%- if not tools is defined or tools is none %}
45
+ {%- set tools = [] %}
46
+ {%- endif %}
47
+ {# tools2doc.jinja #}
48
+ {%- macro py_type(t) -%}
49
+ {%- if t == "string" -%}str
50
+ {%- elif t in ("number", "integer") -%}int
51
+ {%- elif t == "boolean" -%}bool
52
+ {%- elif t == "array" -%}list
53
+ {%- else -%}Any{%- endif -%}
54
+ {%- endmacro -%}
55
+ {# ---------- Emit the system block ---------- #}
56
+ {%- if system_message is defined %}
57
+ {{ bos_token + "system\n" + system_message }}
58
+ {%- else %}
59
+ {%- if tools is iterable and tools | length > 0 %}
60
+ {{ bos_token + "system\nYou are Doubao, a helpful AI assistant. You may call one or more functions to assist with the user query." }}
61
+ {%- endif %}
62
+ {%- endif %}
63
+ {%- if use_json_tooldef is defined and use_json_tooldef %}
64
+
65
+ {{"Tool List:\nYou are authorized to use the following tools (described in JSON Schema format). Before performing any task, you must decide how to call them based on the descriptions and parameters of these tools."}}
66
+ {{ tools | tojson|string }}
67
+ {%- else %}
68
+ {%- for item in tools if item.type == "function" %}
69
+
70
+
71
+ Function:
72
+ def {{ item.function.name }}(
73
+ {%- for name, spec in item.function.parameters.properties.items() %}
74
+ {{- name }}: {{ py_type(spec.type) }}{% if not loop.last %},{% endif %}
75
+ {%- endfor %}):
76
+ """
77
+ {{ item.function.description | trim }}
78
+
79
+ {# ---------- Args ---------- #}
80
+ {%- if item.function.parameters.properties %}
81
+ Args:
82
+ {%- for name, spec in item.function.parameters.properties.items() %}
83
+
84
+ - {{ name }} ({{ py_type(spec.type) }})
85
+ {%- if name in item.function.parameters.required %} [必填]{% else %} [选填]{% endif %}:
86
+ {{- " " ~ (spec.description or "") }}
87
+ {%- endfor %}
88
+ {%- endif %}
89
+
90
+ {# ---------- Returns ---------- #}
91
+ {%- if item.function.returns is defined
92
+ and item.function.returns.properties is defined
93
+ and item.function.returns.properties %}
94
+ Returns:
95
+ {%- for name, spec in item.function.returns.properties.items() %}
96
+
97
+ - {{ name }} ({{ py_type(spec.type) }}):
98
+ {{- " " ~ (spec.description or "") }}
99
+ {%- endfor %}
100
+ {%- endif %}
101
+
102
+ """
103
+ {%- endfor %}
104
+ {%- endif %}
105
+ {%- if tools is iterable and tools | length > 0 %}
106
+
107
+ {{"工具调用请遵循如下格式:\n<seed:tool_call>\n<function=example_function_name>\n<parameter=example_parameter_1>value_1</parameter>\n<parameter=example_parameter_2>This is the value for the second parameter\nthat can span\nmultiple lines</parameter>\n</function>\n</seed:tool_call>\n"}}
108
+ {%- endif %}
109
+ {# Close the system block #}
110
+ {%- if system_message is defined or tools is iterable and tools | length > 0 %}
111
+ {{ eos_token }}
112
+ {%- endif %}
113
+ {# ---------- Thinking Budget ---------- #}
114
+ {%- if thinking_budget is defined %}
115
+ {%- if thinking_budget == 0 %}
116
+ {{ bos_token+"system" }}
117
+ {{ "You are an intelligent assistant that can answer questions in one step without the need for reasoning and thinking, that is, your thinking budget is 0. Next, please skip the thinking process and directly start answering the user's questions." }}
118
+ {{ eos_token }}
119
+ {%- elif not thinking_budget == -1 %}
120
+ {{ bos_token+"system" }}
121
+ {{ "You are an intelligent assistant with reflective ability. In the process of thinking and reasoning, you need to strictly follow the thinking budget, which is "}}{{thinking_budget}}{{". That is, you need to complete your thinking within "}}{{thinking_budget}}{{" tokens and start answering the user's questions. You will reflect on your thinking process every "}}{{ns.interval}}{{" tokens, stating how many tokens have been used and how many are left."}}
122
+ {{ eos_token }}
123
+ {%- endif %}
124
+ {%- endif %}
125
+ {# ---------- Write out the history messages one by one ---------- #}
126
+ {%- for message in loop_messages %}
127
+ {%- if message.role == "assistant"
128
+ and message.tool_calls is defined
129
+ and message.tool_calls is iterable
130
+ and message.tool_calls | length > 0 %}
131
+ {{ bos_token + message.role }}
132
+ {%- if message.reasoning_content is defined and message.reasoning_content is string and message.reasoning_content | trim | length > 0 %}
133
+ {{ "\n" + think_begin_token + message.reasoning_content | trim + think_end_token }}
134
+ {%- endif %}
135
+ {%- if message.content is defined and message.content is string and message.content | trim | length > 0 %}
136
+ {{ "\n" + message.content | trim + "\n" }}
137
+ {%- endif %}
138
+ {%- for tool_call in message.tool_calls %}
139
+ {%- if tool_call.function is defined %}{% set tool_call = tool_call.function %}{% endif %}
140
+ {{ "\n" + toolcall_begin_token + "\n<function=" + tool_call.name + ">\n" }}
141
+ {%- if tool_call.arguments is defined and tool_call.arguments is mapping %}
142
+ {%- for arg_name, arg_value in tool_call.arguments | items %}
143
+ {{ "<parameter=" + arg_name + ">" }}
144
+ {%- set arg_value = arg_value if arg_value is string else arg_value | string %}
145
+ {{ arg_value+"</parameter>\n" }}
146
+ {%- endfor %}
147
+ {%- endif %}
148
+ {{ "</function>\n" + toolcall_end_token }}
149
+ {%- endfor %}
150
+ {{ eos_token }}
151
+ {%- elif message.role in ["user", "system"] %}
152
+ {{ bos_token + message.role + "\n" + message.content + eos_token }}
153
+ {%- elif message.role == "assistant" %}
154
+ {{ bos_token + message.role }}
155
+ {%- if message.reasoning_content is defined and message.reasoning_content is string and message.reasoning_content | trim | length > 0 %}
156
+ {{ "\n" + think_begin_token + message.reasoning_content | trim + think_end_token }}
157
+ {%- endif %}
158
+ {%- if message.content is defined and message.content is string and message.content | trim | length > 0 %}
159
+ {{ "\n" + message.content | trim + eos_token }}
160
+ {%- endif %}
161
+ {# The tool role (and any other remaining role) falls into this branch #}
162
+ {%- else %}
163
+ {{ bos_token + message.role + "\n" + message.content + eos_token }}
164
+ {%- endif %}
165
+ {%- endfor %}
166
+ {# ---------- Prompt the model to start generating ---------- #}
167
+ {%- if add_generation_prompt %}
168
+ {{ bos_token+"assistant\n" }}
169
+ {%- if thinking_budget == 0 %}
170
+ {{ think_begin_token+budget_begin_token }}
171
+ {%- endif %}
172
+ {%- endif %}
173
+ {# Copyright 2025-present Unsloth. Apache 2.0 License. #}
config.json ADDED
@@ -0,0 +1,33 @@
1
+ {
2
+ "architectures": [
3
+ "SeedOssForCausalLM"
4
+ ],
5
+ "attention_bias": true,
6
+ "attention_dropout": 0.1,
7
+ "attention_out_bias": false,
8
+ "bos_token_id": 0,
9
+ "pad_token_id": 1,
10
+ "eos_token_id": 2,
11
+ "head_dim": 128,
12
+ "hidden_act": "silu",
13
+ "hidden_size": 5120,
14
+ "initializer_range": 0.02,
15
+ "intermediate_size": 27648,
16
+ "max_position_embeddings": 524288,
17
+ "mlp_bias": false,
18
+ "model_type": "seed_oss",
19
+ "num_attention_heads": 80,
20
+ "num_hidden_layers": 64,
21
+ "num_key_value_heads": 8,
22
+ "residual_dropout": 0.1,
23
+ "rms_norm_eps": 1e-06,
24
+ "rope_scaling": {
25
+ "rope_type": "default"
26
+ },
27
+ "rope_theta": 10000000.0,
28
+ "tie_word_embeddings": false,
29
+ "torch_dtype": "bfloat16",
30
+ "transformers_version": "4.55.0",
31
+ "use_cache": true,
32
+ "vocab_size": 155136
33
+ }
generation_config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 0,
4
+ "pad_token_id": 1,
5
+ "eos_token_id": 2,
6
+ "transformers_version": "4.55.0",
7
+ "temperature": 1.1,
8
+ "top_p": 0.95
9
+ }
10
+
model-00001-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:834c2453087fdbce45b786e250d736727ec5f52d13721577d2bf3517b828ffcc
3
+ size 4954686296
model-00002-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3aa39357bfeb0a0601db504c6ee068f45c580df160f9da98ee9d0ec1815e9576
3
+ size 4991407840
model-00003-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:724dfc23348ea55f35135c582ed4b59b57a021d1448cedb479884a3b3fe89ed5
3
+ size 4834167328
model-00004-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:19752faebe38e21395701abacdaf0f9b06432be692efb60020b28752ef444d15
3
+ size 4886550176
model-00005-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8aa1816ca29458caf3269e5c3d1dd639736f945b4970368c6b7e7d27e52a143d
3
+ size 4834167360
model-00006-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e4ab51a947e938e0e92f2f0062ad652801e0b416a1f3bdb68874c880c3083bc6
3
+ size 4886550176
model-00007-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d68e5465e342cf2c45001685b0623af71f67a13e73fc8417e163ad78da96cdd0
3
+ size 4834167360
model-00008-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d6328108446b779b73867a033ba16421974f44d64c1d40d01eba98ac0786a06a
3
+ size 4886550176
model-00009-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:7928067d17eb6e3b208317c1bb9c03ae8d0ddf233e846c284ed29896dcabe4d4
3
+ size 4834167360
model-00010-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:2aab66b2b8ea7246faae1170444920f487ad45c7e1081a5dc97ec0b80ec2897c
3
+ size 4886550176
model-00011-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:74340de3038f2469c7d7bcfa127f32baaa0f708a00698b6583d013f012feab07
3
+ size 4834167360
model-00012-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:7ec20d327302f1a82685a660ee792f6319f861f8099882d3c53b59b5f18e487d
3
+ size 4886550176
model-00013-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b4efea0bf1a26fc0da3eb1fc4c963956fec43dc276270dfa8dd556477f758ce6
3
+ size 4834167360
model-00014-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a2b5aeb2cc06fb4b5f06fc7bd88144ceae69f1d754944575e2a0fe177cd9ae45
3
+ size 4886550176
model-00015-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:e6b61535f8efaf570381bcb2d389cf660c1bce9f6b8976c3a92a251eac8d0285
3
+ size 4031898896
model.safetensors.index.json ADDED
@@ -0,0 +1,779 @@
1
+ {
2
+ "metadata": {
3
+ "total_parameters": 36151104512,
4
+ "total_size": 72302209024
5
+ },
6
+ "weight_map": {
7
+ "lm_head.weight": "model-00015-of-00015.safetensors",
8
+ "model.embed_tokens.weight": "model-00001-of-00015.safetensors",
9
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00015.safetensors",
10
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
11
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
12
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
13
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
14
+ "model.layers.0.self_attn.k_proj.bias": "model-00001-of-00015.safetensors",
15
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
16
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
17
+ "model.layers.0.self_attn.q_proj.bias": "model-00001-of-00015.safetensors",
18
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
19
+ "model.layers.0.self_attn.v_proj.bias": "model-00001-of-00015.safetensors",
20
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
21
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00015.safetensors",
22
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
23
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
24
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
25
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
26
+ "model.layers.1.self_attn.k_proj.bias": "model-00001-of-00015.safetensors",
27
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
28
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
29
+ "model.layers.1.self_attn.q_proj.bias": "model-00001-of-00015.safetensors",
30
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
31
+ "model.layers.1.self_attn.v_proj.bias": "model-00001-of-00015.safetensors",
32
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
33
+ "model.layers.10.input_layernorm.weight": "model-00003-of-00015.safetensors",
34
+ "model.layers.10.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
35
+ "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
36
+ "model.layers.10.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
37
+ "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
38
+ "model.layers.10.self_attn.k_proj.bias": "model-00003-of-00015.safetensors",
39
+ "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
40
+ "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
41
+ "model.layers.10.self_attn.q_proj.bias": "model-00003-of-00015.safetensors",
42
+ "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
43
+ "model.layers.10.self_attn.v_proj.bias": "model-00003-of-00015.safetensors",
44
+ "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
45
+ "model.layers.11.input_layernorm.weight": "model-00003-of-00015.safetensors",
46
+ "model.layers.11.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
47
+ "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
48
+ "model.layers.11.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
49
+ "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
50
+ "model.layers.11.self_attn.k_proj.bias": "model-00003-of-00015.safetensors",
51
+ "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
52
+ "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
53
+ "model.layers.11.self_attn.q_proj.bias": "model-00003-of-00015.safetensors",
54
+ "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
55
+ "model.layers.11.self_attn.v_proj.bias": "model-00003-of-00015.safetensors",
56
+ "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
57
+ "model.layers.12.input_layernorm.weight": "model-00004-of-00015.safetensors",
58
+ "model.layers.12.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
59
+ "model.layers.12.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
60
+ "model.layers.12.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
61
+ "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
62
+ "model.layers.12.self_attn.k_proj.bias": "model-00003-of-00015.safetensors",
63
+ "model.layers.12.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
64
+ "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
65
+ "model.layers.12.self_attn.q_proj.bias": "model-00003-of-00015.safetensors",
66
+ "model.layers.12.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
67
+ "model.layers.12.self_attn.v_proj.bias": "model-00003-of-00015.safetensors",
68
+ "model.layers.12.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
69
+ "model.layers.13.input_layernorm.weight": "model-00004-of-00015.safetensors",
70
+ "model.layers.13.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
71
+ "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
72
+ "model.layers.13.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.k_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.q_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.v_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.k_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.q_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.v_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.k_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.q_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.v_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.16.self_attn.k_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.q_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.v_proj.bias": "model-00004-of-00015.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.k_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.q_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.v_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.k_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.q_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.v_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.k_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.q_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.v_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.k_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.q_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.v_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.k_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.q_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.v_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.21.self_attn.k_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.q_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.v_proj.bias": "model-00005-of-00015.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.k_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.q_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.v_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.k_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.q_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.v_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.k_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.q_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.v_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.25.self_attn.k_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.q_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.v_proj.bias": "model-00006-of-00015.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.k_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.q_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.v_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.k_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.q_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.v_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.k_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.q_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.v_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.k_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.q_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.v_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.self_attn.k_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.3.self_attn.q_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.3.self_attn.v_proj.bias": "model-00001-of-00015.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.30.self_attn.k_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.q_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.v_proj.bias": "model-00007-of-00015.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.k_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.q_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.v_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.input_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.k_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.q_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.v_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.32.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.input_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.k_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.q_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.v_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.33.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.input_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.34.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.34.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.34.self_attn.k_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.q_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.v_proj.bias": "model-00008-of-00015.safetensors",
+ "model.layers.34.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+ "model.layers.35.input_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.k_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.q_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.v_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.35.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.input_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.k_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.q_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.v_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.36.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.input_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.k_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.q_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.v_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.37.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.input_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.k_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.q_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.v_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.38.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.39.input_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.39.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.39.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.39.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.39.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.39.self_attn.k_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.q_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.v_proj.bias": "model-00009-of-00015.safetensors",
+ "model.layers.39.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.k_proj.bias": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.q_proj.bias": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.v_proj.bias": "model-00002-of-00015.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
+ "model.layers.40.input_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.k_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.q_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.v_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.40.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.input_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.k_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.q_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.v_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.41.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.input_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.k_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.q_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.v_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.42.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.input_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.43.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.43.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.43.self_attn.k_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.q_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.v_proj.bias": "model-00010-of-00015.safetensors",
+ "model.layers.43.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+ "model.layers.44.input_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.k_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.q_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.v_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.44.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.input_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.k_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.q_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.v_proj.bias": "model-00011-of-00015.safetensors",
+ "model.layers.45.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+ "model.layers.46.input_layernorm.weight": "model-00011-of-00015.safetensors",
502
+ "model.layers.46.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
503
+ "model.layers.46.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
504
+ "model.layers.46.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
505
+ "model.layers.46.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
506
+ "model.layers.46.self_attn.k_proj.bias": "model-00011-of-00015.safetensors",
507
+ "model.layers.46.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
508
+ "model.layers.46.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
509
+ "model.layers.46.self_attn.q_proj.bias": "model-00011-of-00015.safetensors",
510
+ "model.layers.46.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
511
+ "model.layers.46.self_attn.v_proj.bias": "model-00011-of-00015.safetensors",
512
+ "model.layers.46.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
513
+ "model.layers.47.input_layernorm.weight": "model-00011-of-00015.safetensors",
514
+ "model.layers.47.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
515
+ "model.layers.47.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
516
+ "model.layers.47.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
517
+ "model.layers.47.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
518
+ "model.layers.47.self_attn.k_proj.bias": "model-00011-of-00015.safetensors",
519
+ "model.layers.47.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
520
+ "model.layers.47.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
521
+ "model.layers.47.self_attn.q_proj.bias": "model-00011-of-00015.safetensors",
522
+ "model.layers.47.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
523
+ "model.layers.47.self_attn.v_proj.bias": "model-00011-of-00015.safetensors",
524
+ "model.layers.47.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
525
+ "model.layers.48.input_layernorm.weight": "model-00012-of-00015.safetensors",
526
+ "model.layers.48.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
527
+ "model.layers.48.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
528
+ "model.layers.48.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
529
+ "model.layers.48.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
530
+ "model.layers.48.self_attn.k_proj.bias": "model-00011-of-00015.safetensors",
531
+ "model.layers.48.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
532
+ "model.layers.48.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
533
+ "model.layers.48.self_attn.q_proj.bias": "model-00011-of-00015.safetensors",
534
+ "model.layers.48.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
535
+ "model.layers.48.self_attn.v_proj.bias": "model-00011-of-00015.safetensors",
536
+ "model.layers.48.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
537
+ "model.layers.49.input_layernorm.weight": "model-00012-of-00015.safetensors",
538
+ "model.layers.49.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
539
+ "model.layers.49.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
540
+ "model.layers.49.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
541
+ "model.layers.49.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
542
+ "model.layers.49.self_attn.k_proj.bias": "model-00012-of-00015.safetensors",
543
+ "model.layers.49.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
544
+ "model.layers.49.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
545
+ "model.layers.49.self_attn.q_proj.bias": "model-00012-of-00015.safetensors",
546
+ "model.layers.49.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
547
+ "model.layers.49.self_attn.v_proj.bias": "model-00012-of-00015.safetensors",
548
+ "model.layers.49.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
549
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00015.safetensors",
550
+ "model.layers.5.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
551
+ "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
552
+ "model.layers.5.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
553
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
554
+ "model.layers.5.self_attn.k_proj.bias": "model-00002-of-00015.safetensors",
555
+ "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
556
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
557
+ "model.layers.5.self_attn.q_proj.bias": "model-00002-of-00015.safetensors",
558
+ "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
559
+ "model.layers.5.self_attn.v_proj.bias": "model-00002-of-00015.safetensors",
560
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
561
+ "model.layers.50.input_layernorm.weight": "model-00012-of-00015.safetensors",
562
+ "model.layers.50.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
563
+ "model.layers.50.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
564
+ "model.layers.50.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
565
+ "model.layers.50.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
566
+ "model.layers.50.self_attn.k_proj.bias": "model-00012-of-00015.safetensors",
567
+ "model.layers.50.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
568
+ "model.layers.50.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
569
+ "model.layers.50.self_attn.q_proj.bias": "model-00012-of-00015.safetensors",
570
+ "model.layers.50.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
571
+ "model.layers.50.self_attn.v_proj.bias": "model-00012-of-00015.safetensors",
572
+ "model.layers.50.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
573
+ "model.layers.51.input_layernorm.weight": "model-00012-of-00015.safetensors",
574
+ "model.layers.51.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
575
+ "model.layers.51.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
576
+ "model.layers.51.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
577
+ "model.layers.51.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
578
+ "model.layers.51.self_attn.k_proj.bias": "model-00012-of-00015.safetensors",
579
+ "model.layers.51.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
580
+ "model.layers.51.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
581
+ "model.layers.51.self_attn.q_proj.bias": "model-00012-of-00015.safetensors",
582
+ "model.layers.51.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
583
+ "model.layers.51.self_attn.v_proj.bias": "model-00012-of-00015.safetensors",
584
+ "model.layers.51.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
585
+ "model.layers.52.input_layernorm.weight": "model-00013-of-00015.safetensors",
586
+ "model.layers.52.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
587
+ "model.layers.52.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
588
+ "model.layers.52.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
589
+ "model.layers.52.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
590
+ "model.layers.52.self_attn.k_proj.bias": "model-00012-of-00015.safetensors",
591
+ "model.layers.52.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
592
+ "model.layers.52.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
593
+ "model.layers.52.self_attn.q_proj.bias": "model-00012-of-00015.safetensors",
594
+ "model.layers.52.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
595
+ "model.layers.52.self_attn.v_proj.bias": "model-00012-of-00015.safetensors",
596
+ "model.layers.52.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
597
+ "model.layers.53.input_layernorm.weight": "model-00013-of-00015.safetensors",
598
+ "model.layers.53.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
599
+ "model.layers.53.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
600
+ "model.layers.53.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
601
+ "model.layers.53.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
602
+ "model.layers.53.self_attn.k_proj.bias": "model-00013-of-00015.safetensors",
603
+ "model.layers.53.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
604
+ "model.layers.53.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
605
+ "model.layers.53.self_attn.q_proj.bias": "model-00013-of-00015.safetensors",
606
+ "model.layers.53.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
607
+ "model.layers.53.self_attn.v_proj.bias": "model-00013-of-00015.safetensors",
608
+ "model.layers.53.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
609
+ "model.layers.54.input_layernorm.weight": "model-00013-of-00015.safetensors",
610
+ "model.layers.54.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
611
+ "model.layers.54.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
612
+ "model.layers.54.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
613
+ "model.layers.54.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
614
+ "model.layers.54.self_attn.k_proj.bias": "model-00013-of-00015.safetensors",
615
+ "model.layers.54.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
616
+ "model.layers.54.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
617
+ "model.layers.54.self_attn.q_proj.bias": "model-00013-of-00015.safetensors",
618
+ "model.layers.54.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
619
+ "model.layers.54.self_attn.v_proj.bias": "model-00013-of-00015.safetensors",
620
+ "model.layers.54.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
621
+ "model.layers.55.input_layernorm.weight": "model-00013-of-00015.safetensors",
622
+ "model.layers.55.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
623
+ "model.layers.55.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
624
+ "model.layers.55.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
625
+ "model.layers.55.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
626
+ "model.layers.55.self_attn.k_proj.bias": "model-00013-of-00015.safetensors",
627
+ "model.layers.55.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
628
+ "model.layers.55.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
629
+ "model.layers.55.self_attn.q_proj.bias": "model-00013-of-00015.safetensors",
630
+ "model.layers.55.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
631
+ "model.layers.55.self_attn.v_proj.bias": "model-00013-of-00015.safetensors",
632
+ "model.layers.55.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
633
+ "model.layers.56.input_layernorm.weight": "model-00013-of-00015.safetensors",
634
+ "model.layers.56.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
635
+ "model.layers.56.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
636
+ "model.layers.56.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
637
+ "model.layers.56.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
638
+ "model.layers.56.self_attn.k_proj.bias": "model-00013-of-00015.safetensors",
639
+ "model.layers.56.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
640
+ "model.layers.56.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
641
+ "model.layers.56.self_attn.q_proj.bias": "model-00013-of-00015.safetensors",
642
+ "model.layers.56.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
643
+ "model.layers.56.self_attn.v_proj.bias": "model-00013-of-00015.safetensors",
644
+ "model.layers.56.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
645
+ "model.layers.57.input_layernorm.weight": "model-00014-of-00015.safetensors",
646
+ "model.layers.57.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
647
+ "model.layers.57.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
648
+ "model.layers.57.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
649
+ "model.layers.57.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
650
+ "model.layers.57.self_attn.k_proj.bias": "model-00013-of-00015.safetensors",
651
+ "model.layers.57.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
652
+ "model.layers.57.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
653
+ "model.layers.57.self_attn.q_proj.bias": "model-00013-of-00015.safetensors",
654
+ "model.layers.57.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
655
+ "model.layers.57.self_attn.v_proj.bias": "model-00013-of-00015.safetensors",
656
+ "model.layers.57.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
657
+ "model.layers.58.input_layernorm.weight": "model-00014-of-00015.safetensors",
658
+ "model.layers.58.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
659
+ "model.layers.58.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
660
+ "model.layers.58.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
661
+ "model.layers.58.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
662
+ "model.layers.58.self_attn.k_proj.bias": "model-00014-of-00015.safetensors",
663
+ "model.layers.58.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
664
+ "model.layers.58.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
665
+ "model.layers.58.self_attn.q_proj.bias": "model-00014-of-00015.safetensors",
666
+ "model.layers.58.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
667
+ "model.layers.58.self_attn.v_proj.bias": "model-00014-of-00015.safetensors",
668
+ "model.layers.58.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
669
+ "model.layers.59.input_layernorm.weight": "model-00014-of-00015.safetensors",
670
+ "model.layers.59.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
671
+ "model.layers.59.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
672
+ "model.layers.59.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
673
+ "model.layers.59.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
674
+ "model.layers.59.self_attn.k_proj.bias": "model-00014-of-00015.safetensors",
675
+ "model.layers.59.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
676
+ "model.layers.59.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
677
+ "model.layers.59.self_attn.q_proj.bias": "model-00014-of-00015.safetensors",
678
+ "model.layers.59.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
679
+ "model.layers.59.self_attn.v_proj.bias": "model-00014-of-00015.safetensors",
680
+ "model.layers.59.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
681
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00015.safetensors",
682
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
683
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
684
+ "model.layers.6.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
685
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
686
+ "model.layers.6.self_attn.k_proj.bias": "model-00002-of-00015.safetensors",
687
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
688
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
689
+ "model.layers.6.self_attn.q_proj.bias": "model-00002-of-00015.safetensors",
690
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
691
+ "model.layers.6.self_attn.v_proj.bias": "model-00002-of-00015.safetensors",
692
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
693
+ "model.layers.60.input_layernorm.weight": "model-00014-of-00015.safetensors",
694
+ "model.layers.60.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
695
+ "model.layers.60.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
696
+ "model.layers.60.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
697
+ "model.layers.60.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
698
+ "model.layers.60.self_attn.k_proj.bias": "model-00014-of-00015.safetensors",
699
+ "model.layers.60.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
700
+ "model.layers.60.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
701
+ "model.layers.60.self_attn.q_proj.bias": "model-00014-of-00015.safetensors",
702
+ "model.layers.60.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
703
+ "model.layers.60.self_attn.v_proj.bias": "model-00014-of-00015.safetensors",
704
+ "model.layers.60.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
705
+ "model.layers.61.input_layernorm.weight": "model-00015-of-00015.safetensors",
706
+ "model.layers.61.mlp.down_proj.weight": "model-00015-of-00015.safetensors",
707
+ "model.layers.61.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
708
+ "model.layers.61.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
709
+ "model.layers.61.post_attention_layernorm.weight": "model-00015-of-00015.safetensors",
710
+ "model.layers.61.self_attn.k_proj.bias": "model-00014-of-00015.safetensors",
711
+ "model.layers.61.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
712
+ "model.layers.61.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
713
+ "model.layers.61.self_attn.q_proj.bias": "model-00014-of-00015.safetensors",
714
+ "model.layers.61.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
715
+ "model.layers.61.self_attn.v_proj.bias": "model-00014-of-00015.safetensors",
716
+ "model.layers.61.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
717
+ "model.layers.62.input_layernorm.weight": "model-00015-of-00015.safetensors",
718
+ "model.layers.62.mlp.down_proj.weight": "model-00015-of-00015.safetensors",
719
+ "model.layers.62.mlp.gate_proj.weight": "model-00015-of-00015.safetensors",
720
+ "model.layers.62.mlp.up_proj.weight": "model-00015-of-00015.safetensors",
721
+ "model.layers.62.post_attention_layernorm.weight": "model-00015-of-00015.safetensors",
722
+ "model.layers.62.self_attn.k_proj.bias": "model-00015-of-00015.safetensors",
723
+ "model.layers.62.self_attn.k_proj.weight": "model-00015-of-00015.safetensors",
724
+ "model.layers.62.self_attn.o_proj.weight": "model-00015-of-00015.safetensors",
725
+ "model.layers.62.self_attn.q_proj.bias": "model-00015-of-00015.safetensors",
726
+ "model.layers.62.self_attn.q_proj.weight": "model-00015-of-00015.safetensors",
727
+ "model.layers.62.self_attn.v_proj.bias": "model-00015-of-00015.safetensors",
728
+ "model.layers.62.self_attn.v_proj.weight": "model-00015-of-00015.safetensors",
729
+ "model.layers.63.input_layernorm.weight": "model-00015-of-00015.safetensors",
730
+ "model.layers.63.mlp.down_proj.weight": "model-00015-of-00015.safetensors",
731
+ "model.layers.63.mlp.gate_proj.weight": "model-00015-of-00015.safetensors",
732
+ "model.layers.63.mlp.up_proj.weight": "model-00015-of-00015.safetensors",
733
+ "model.layers.63.post_attention_layernorm.weight": "model-00015-of-00015.safetensors",
734
+ "model.layers.63.self_attn.k_proj.bias": "model-00015-of-00015.safetensors",
735
+ "model.layers.63.self_attn.k_proj.weight": "model-00015-of-00015.safetensors",
736
+ "model.layers.63.self_attn.o_proj.weight": "model-00015-of-00015.safetensors",
737
+ "model.layers.63.self_attn.q_proj.bias": "model-00015-of-00015.safetensors",
738
+ "model.layers.63.self_attn.q_proj.weight": "model-00015-of-00015.safetensors",
739
+ "model.layers.63.self_attn.v_proj.bias": "model-00015-of-00015.safetensors",
740
+ "model.layers.63.self_attn.v_proj.weight": "model-00015-of-00015.safetensors",
741
+ "model.layers.7.input_layernorm.weight": "model-00003-of-00015.safetensors",
742
+ "model.layers.7.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
743
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
744
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
745
+ "model.layers.7.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
746
+ "model.layers.7.self_attn.k_proj.bias": "model-00002-of-00015.safetensors",
747
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
748
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
749
+ "model.layers.7.self_attn.q_proj.bias": "model-00002-of-00015.safetensors",
750
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
751
+ "model.layers.7.self_attn.v_proj.bias": "model-00002-of-00015.safetensors",
752
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
753
+ "model.layers.8.input_layernorm.weight": "model-00003-of-00015.safetensors",
754
+ "model.layers.8.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
755
+ "model.layers.8.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
756
+ "model.layers.8.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
757
+ "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
758
+ "model.layers.8.self_attn.k_proj.bias": "model-00003-of-00015.safetensors",
759
+ "model.layers.8.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
760
+ "model.layers.8.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
761
+ "model.layers.8.self_attn.q_proj.bias": "model-00003-of-00015.safetensors",
762
+ "model.layers.8.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
763
+ "model.layers.8.self_attn.v_proj.bias": "model-00003-of-00015.safetensors",
764
+ "model.layers.8.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
765
+ "model.layers.9.input_layernorm.weight": "model-00003-of-00015.safetensors",
766
+ "model.layers.9.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
767
+ "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
768
+ "model.layers.9.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
769
+ "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
770
+ "model.layers.9.self_attn.k_proj.bias": "model-00003-of-00015.safetensors",
771
+ "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
772
+ "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
773
+ "model.layers.9.self_attn.q_proj.bias": "model-00003-of-00015.safetensors",
774
+ "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
775
+ "model.layers.9.self_attn.v_proj.bias": "model-00003-of-00015.safetensors",
776
+ "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
777
+ "model.norm.weight": "model-00015-of-00015.safetensors"
778
+ }
779
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<seed:bos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<seed:eos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<seed:pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
thinking_budget.png ADDED
Git LFS Details
  • SHA256: 3acc41ada52728454bd060fe5eddc8d3d0495cfdbdcb9eff41a0f49582442ab5
  • Pointer size: 131 Bytes
  • Size of remote file: 190 kB
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f6bd848f52451824a3033a9f1e67eea5b399a13c90f845a332d3a29537e05827
+ size 11883696
tokenizer_config.json ADDED
@@ -0,0 +1,1038 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<seed:bos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<seed:pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "<seed:eos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "3": {
+ "content": "<seed:think>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "4": {
+ "content": "</seed:think>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "5": {
+ "content": "<seed:cot_budget_reflect>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "6": {
+ "content": "</seed:cot_budget_reflect>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "7": {
+ "content": "<seed:tool_call>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "8": {
+ "content": "</seed:tool_call>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "9": {
+ "content": "<[PLHD9_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "10": {
+ "content": "<[PLHD10_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "11": {
+ "content": "<[PLHD11_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "12": {
+ "content": "<[PLHD12_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "13": {
+ "content": "<[PLHD13_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "14": {
+ "content": "<[PLHD14_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "15": {
+ "content": "<[PLHD15_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "16": {
+ "content": "<[PLHD16_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "17": {
+ "content": "<[PLHD17_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "18": {
+ "content": "<[PLHD18_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "19": {
+ "content": "<[PLHD19_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "20": {
+ "content": "<[PLHD20_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "21": {
+ "content": "<[PLHD21_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "22": {
+ "content": "<[PLHD22_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "23": {
+ "content": "<[PLHD23_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "24": {
+ "content": "<[PLHD24_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "25": {
+ "content": "<[PLHD25_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "26": {
+ "content": "<[PLHD26_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "27": {
+ "content": "<[PLHD27_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "28": {
+ "content": "<[PLHD28_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "29": {
+ "content": "<[PLHD29_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "30": {
+ "content": "<[PLHD30_never_used]>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "31": {
+ "content": "<[PLHD31_never_used]>",
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "32": {
260
+ "content": "<[PLHD32_never_used]>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "33": {
268
+ "content": "<[PLHD33_never_used]>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "34": {
276
+ "content": "<[PLHD34_never_used]>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "35": {
284
+ "content": "<[PLHD35_never_used]>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "36": {
292
+ "content": "<[PLHD36_never_used]>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "37": {
300
+ "content": "<[PLHD37_never_used]>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "38": {
308
+ "content": "<[PLHD38_never_used]>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "39": {
316
+ "content": "<[PLHD39_never_used]>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "40": {
324
+ "content": "<[PLHD40_never_used]>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "41": {
332
+ "content": "<[PLHD41_never_used]>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "42": {
340
+ "content": "<[PLHD42_never_used]>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "43": {
348
+ "content": "<[PLHD43_never_used]>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "44": {
356
+ "content": "<[PLHD44_never_used]>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "45": {
364
+ "content": "<[PLHD45_never_used]>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "46": {
372
+ "content": "<[PLHD46_never_used]>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "47": {
380
+ "content": "<[PLHD47_never_used]>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "48": {
388
+ "content": "<[PLHD48_never_used]>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "49": {
396
+ "content": "<[PLHD49_never_used]>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "50": {
404
+ "content": "<[PLHD50_never_used]>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "51": {
412
+ "content": "<[PLHD51_never_used]>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "52": {
420
+ "content": "<[PLHD52_never_used]>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "53": {
428
+ "content": "<[PLHD53_never_used]>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "54": {
436
+ "content": "<[PLHD54_never_used]>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "55": {
444
+ "content": "<[PLHD55_never_used]>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "56": {
452
+ "content": "<[PLHD56_never_used]>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "57": {
460
+ "content": "<[PLHD57_never_used]>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "58": {
468
+ "content": "<[PLHD58_never_used]>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "59": {
476
+ "content": "<[PLHD59_never_used]>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "60": {
484
+ "content": "<[PLHD60_never_used]>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "61": {
492
+ "content": "<[PLHD61_never_used]>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "62": {
500
+ "content": "<[PLHD62_never_used]>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "63": {
508
+ "content": "<[PLHD63_never_used]>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "64": {
516
+ "content": "<[PLHD64_never_used]>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "65": {
524
+ "content": "<[PLHD65_never_used]>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "66": {
532
+ "content": "<[PLHD66_never_used]>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "67": {
540
+ "content": "<[PLHD67_never_used]>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "68": {
548
+ "content": "<[PLHD68_never_used]>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "69": {
556
+ "content": "<[PLHD69_never_used]>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "70": {
564
+ "content": "<[PLHD70_never_used]>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "71": {
572
+ "content": "<[PLHD71_never_used]>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "72": {
580
+ "content": "<[PLHD72_never_used]>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "73": {
588
+ "content": "<[PLHD73_never_used]>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "74": {
596
+ "content": "<[PLHD74_never_used]>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "75": {
604
+ "content": "<[PLHD75_never_used]>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "76": {
612
+ "content": "<[PLHD76_never_used]>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "77": {
620
+ "content": "<[PLHD77_never_used]>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "78": {
628
+ "content": "<[PLHD78_never_used]>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "79": {
636
+ "content": "<[PLHD79_never_used]>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "80": {
644
+ "content": "<[PLHD80_never_used]>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "81": {
652
+ "content": "<[PLHD81_never_used]>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "82": {
660
+ "content": "<[PLHD82_never_used]>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "83": {
668
+ "content": "<[PLHD83_never_used]>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "84": {
676
+ "content": "<[PLHD84_never_used]>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "85": {
684
+ "content": "<[PLHD85_never_used]>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "86": {
692
+ "content": "<[PLHD86_never_used]>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "87": {
700
+ "content": "<[PLHD87_never_used]>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "88": {
708
+ "content": "<[PLHD88_never_used]>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "89": {
716
+ "content": "<[PLHD89_never_used]>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "90": {
724
+ "content": "<[PLHD90_never_used]>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "91": {
732
+ "content": "<[PLHD91_never_used]>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "92": {
740
+ "content": "<[PLHD92_never_used]>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "93": {
748
+ "content": "<[PLHD93_never_used]>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "94": {
756
+ "content": "<[PLHD94_never_used]>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "95": {
764
+ "content": "<[PLHD95_never_used]>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "96": {
772
+ "content": "<[PLHD96_never_used]>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "97": {
780
+ "content": "<[PLHD97_never_used]>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "98": {
788
+ "content": "<[PLHD98_never_used]>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "99": {
796
+ "content": "<[PLHD99_never_used]>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "100": {
804
+ "content": "<[PLHD100_never_used]>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "101": {
812
+ "content": "<[PLHD101_never_used]>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "102": {
820
+ "content": "<[PLHD102_never_used]>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ },
827
+ "103": {
828
+ "content": "<[PLHD103_never_used]>",
829
+ "lstrip": false,
830
+ "normalized": false,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": true
834
+ },
835
+ "104": {
836
+ "content": "<[PLHD104_never_used]>",
837
+ "lstrip": false,
838
+ "normalized": false,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": true
842
+ },
843
+ "105": {
844
+ "content": "<[PLHD105_never_used]>",
845
+ "lstrip": false,
846
+ "normalized": false,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": true
850
+ },
851
+ "106": {
852
+ "content": "<[PLHD106_never_used]>",
853
+ "lstrip": false,
854
+ "normalized": false,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": true
858
+ },
859
+ "107": {
860
+ "content": "<[PLHD107_never_used]>",
861
+ "lstrip": false,
862
+ "normalized": false,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": true
866
+ },
867
+ "108": {
868
+ "content": "<[PLHD108_never_used]>",
869
+ "lstrip": false,
870
+ "normalized": false,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": true
874
+ },
875
+ "109": {
876
+ "content": "<[PLHD109_never_used]>",
877
+ "lstrip": false,
878
+ "normalized": false,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": true
882
+ },
883
+ "110": {
884
+ "content": "<[PLHD110_never_used]>",
885
+ "lstrip": false,
886
+ "normalized": false,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": true
890
+ },
891
+ "111": {
892
+ "content": "<[PLHD111_never_used]>",
893
+ "lstrip": false,
894
+ "normalized": false,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": true
898
+ },
899
+ "112": {
900
+ "content": "<[PLHD112_never_used]>",
901
+ "lstrip": false,
902
+ "normalized": false,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": true
906
+ },
907
+ "113": {
908
+ "content": "<[PLHD113_never_used]>",
909
+ "lstrip": false,
910
+ "normalized": false,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": true
914
+ },
915
+ "114": {
916
+ "content": "<[PLHD114_never_used]>",
917
+ "lstrip": false,
918
+ "normalized": false,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": true
922
+ },
923
+ "115": {
924
+ "content": "<[PLHD115_never_used]>",
925
+ "lstrip": false,
926
+ "normalized": false,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": true
930
+ },
931
+ "116": {
932
+ "content": "<[PLHD116_never_used]>",
933
+ "lstrip": false,
934
+ "normalized": false,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": true
938
+ },
939
+ "117": {
940
+ "content": "<[PLHD117_never_used]>",
941
+ "lstrip": false,
942
+ "normalized": false,
943
+ "rstrip": false,
944
+ "single_word": false,
945
+ "special": true
946
+ },
947
+ "118": {
948
+ "content": "<[PLHD118_never_used]>",
949
+ "lstrip": false,
950
+ "normalized": false,
951
+ "rstrip": false,
952
+ "single_word": false,
953
+ "special": true
954
+ },
955
+ "119": {
956
+ "content": "<[PLHD119_never_used]>",
957
+ "lstrip": false,
958
+ "normalized": false,
959
+ "rstrip": false,
960
+ "single_word": false,
961
+ "special": true
962
+ },
963
+ "120": {
964
+ "content": "<[PLHD120_never_used]>",
965
+ "lstrip": false,
966
+ "normalized": false,
967
+ "rstrip": false,
968
+ "single_word": false,
969
+ "special": true
970
+ },
971
+ "121": {
972
+ "content": "<[PLHD121_never_used]>",
973
+ "lstrip": false,
974
+ "normalized": false,
975
+ "rstrip": false,
976
+ "single_word": false,
977
+ "special": true
978
+ },
979
+ "122": {
980
+ "content": "<[PLHD122_never_used]>",
981
+ "lstrip": false,
982
+ "normalized": false,
983
+ "rstrip": false,
984
+ "single_word": false,
985
+ "special": true
986
+ },
987
+ "123": {
988
+ "content": "<[PLHD123_never_used]>",
989
+ "lstrip": false,
990
+ "normalized": false,
991
+ "rstrip": false,
992
+ "single_word": false,
993
+ "special": true
994
+ },
995
+ "124": {
996
+ "content": "<[PLHD124_never_used]>",
997
+ "lstrip": false,
998
+ "normalized": false,
999
+ "rstrip": false,
1000
+ "single_word": false,
1001
+ "special": true
1002
+ },
1003
+ "125": {
1004
+ "content": "<[PLHD125_never_used]>",
1005
+ "lstrip": false,
1006
+ "normalized": false,
1007
+ "rstrip": false,
1008
+ "single_word": false,
1009
+ "special": true
1010
+ },
1011
+ "126": {
1012
+ "content": "<[PLHD126_never_used]>",
1013
+ "lstrip": false,
1014
+ "normalized": false,
1015
+ "rstrip": false,
1016
+ "single_word": false,
1017
+ "special": true
1018
+ },
1019
+ "127": {
1020
+ "content": "<[PLHD127_never_used]>",
1021
+ "lstrip": false,
1022
+ "normalized": false,
1023
+ "rstrip": false,
1024
+ "single_word": false,
1025
+ "special": true
1026
+ }
1027
+ },
+ "bos_token": "<seed:bos>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<seed:eos>",
+ "extra_special_tokens": {},
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "<seed:pad>",
+ "padding_side": "left",
+ "tokenizer_class": "PreTrainedTokenizerFast",
+ "unk_token": null,
+ "chat_template": "{# Unsloth Chat template fixes #}\n{# ----------‑‑‑ special token variables ‑‑‑---------- #}\n{%- set bos_token = '<seed:bos>' -%}\n{%- set eos_token = '<seed:eos>' -%}\n{%- set pad_token = '<seed:pad>' -%}\n{%- set toolcall_begin_token = '<seed:tool_call>' -%}\n{%- set toolcall_end_token = '</seed:tool_call>' -%}\n{%- set think_begin_token = '<seed:think>' -%}\n{%- set think_end_token = '</seed:think>' -%}\n{%- set budget_begin_token = '<seed:cot_budget_reflect>'-%}\n{%- set budget_end_token = '</seed:cot_budget_reflect>'-%}\n{# -------------- reflection-interval lookup -------------- #}\n{%- if not thinking_budget is defined %}\n{%- set thinking_budget = -1 -%}\n{%- endif -%}\n{%- set budget_reflections_v05 = {\n 0: 0,\n 512: 128,\n 1024: 256,\n 2048: 512,\n 4096: 512,\n 8192: 1024,\n 16384: 1024\n} -%}\n{# Find the first tier that is >= thinking_budget #}\n{%- set ns = namespace(interval = None) -%}\n{%- for k, v in budget_reflections_v05 | dictsort -%}\n {%- if ns.interval is none and thinking_budget <= k -%}\n {%- set ns.interval = v -%}\n {%- endif -%}\n{%- endfor -%}\n{# If above the largest tier, use the last tier's value #}\n{%- if ns.interval is none -%}\n {%- set ns.interval = budget_reflections_v05[16384] -%}\n{%- endif -%}\n{# ---------- preprocess the system message ---------- #}\n{%- if messages[0][\"role\"] == \"system\" %}\n{%- set system_message = messages[0][\"content\"] %}\n{%- set loop_messages = messages[1:] %}\n{%- else %}\n{%- set loop_messages = messages %}\n{%- endif %}\n{# ---------- make sure tools exists ---------- #}\n{%- if not tools is defined or tools is none %}\n{%- set tools = [] %}\n{%- endif %}\n{# tools2doc.jinja #}\n{%- macro py_type(t) -%}\n {%- if t == \"string\" -%}str\n {%- elif t in (\"number\", \"integer\") -%}int\n {%- elif t == \"boolean\" -%}bool\n {%- elif t == \"array\" -%}list\n {%- else -%}Any{%- endif -%}\n{%- endmacro -%}\n{# ---------- emit the system block ---------- #}\n{%- if system_message is defined %}\n{{ bos_token + \"system\\n\" + system_message }}\n{%- else %}\n{%- if tools is iterable and tools | length > 0 %}\n{{ bos_token + \"system\\nYou are Doubao, a helpful AI assistant. You may call one or more functions to assist with the user query.\" }}\n{%- endif %}\n{%- endif %}\n{%- if use_json_tooldef is defined and use_json_tooldef %}\n\n{{\"Tool List:\\nYou are authorized to use the following tools (described in JSON Schema format). Before performing any task, you must decide how to call them based on the descriptions and parameters of these tools.\"}}\n{{ tools | tojson|string }}\n{%- else %}\n{%- for item in tools if item.type == \"function\" %}\n\n\nFunction:\ndef {{ item.function.name }}(\n{%- for name, spec in item.function.parameters.properties.items() %}\n {{- name }}: {{ py_type(spec.type) }}{% if not loop.last %},{% endif %}\n{%- endfor %}):\n \"\"\"\n {{ item.function.description | trim }}\n\n {# ---------- Args ---------- #}\n {%- if item.function.parameters.properties %}\n Args:\n {%- for name, spec in item.function.parameters.properties.items() %}\n\n - {{ name }} ({{ py_type(spec.type) }})\n {%- if name in item.function.parameters.required %} [必填]{% else %} [选填]{% endif %}:\n {{- \" \" ~ (spec.description or \"\") }}\n {%- endfor %}\n {%- endif %}\n\n {# ---------- Returns ---------- #}\n {%- if item.function.returns is defined\n and item.function.returns.properties is defined\n and item.function.returns.properties %}\n Returns:\n {%- for name, spec in item.function.returns.properties.items() %}\n\n - {{ name }} ({{ py_type(spec.type) }}):\n {{- \" \" ~ (spec.description or \"\") }}\n {%- endfor %}\n {%- endif %}\n\n \"\"\"\n{%- endfor %}\n{%- endif %}\n{%- if tools is iterable and tools | length > 0 %}\n\n{{\"工具调用请遵循如下格式:\\n<seed:tool_call>\\n<function=example_function_name>\\n<parameter=example_parameter_1>value_1</parameter>\\n<parameter=example_parameter_2>This is the value for the second parameter\\nthat can span\\nmultiple lines</parameter>\\n</function>\\n</seed:tool_call>\\n\"}}\n{%- endif %}\n{# close the system block #}\n{%- if system_message is defined or tools is iterable and tools | length > 0 %}\n{{ eos_token }}\n{%- endif %}\n{# ---------- Thinking Budget ---------- #}\n{%- if thinking_budget is defined %}\n{%- if thinking_budget == 0 %}\n{{ bos_token+\"system\" }}\n{{ \"You are an intelligent assistant that can answer questions in one step without the need for reasoning and thinking, that is, your thinking budget is 0. Next, please skip the thinking process and directly start answering the user's questions.\" }}\n{{ eos_token }}\n{%- elif not thinking_budget == -1 %}\n{{ bos_token+\"system\" }}\n{{ \"You are an intelligent assistant with reflective ability. In the process of thinking and reasoning, you need to strictly follow the thinking budget, which is \"}}{{thinking_budget}}{{\". That is, you need to complete your thinking within \"}}{{thinking_budget}}{{\" tokens and start answering the user's questions. You will reflect on your thinking process every \"}}{{ns.interval}}{{\" tokens, stating how many tokens have been used and how many are left.\"}}\n{{ eos_token }}\n{%- endif %}\n{%- endif %}\n{# ---------- write out the history messages one by one ---------- #}\n{%- for message in loop_messages %}\n{%- if message.role == \"assistant\"\n and message.tool_calls is defined\n and message.tool_calls is iterable\n and message.tool_calls | length > 0 %}\n{{ bos_token + message.role }}\n{%- if message.reasoning_content is defined and message.reasoning_content is string and message.reasoning_content | trim | length > 0 %}\n{{ \"\\n\" + think_begin_token + message.reasoning_content | trim + think_end_token }}\n{%- endif %}\n{%- if message.content is defined and message.content is string and message.content | trim | length > 0 %}\n{{ \"\\n\" + message.content | trim + \"\\n\" }}\n{%- endif %}\n{%- for tool_call in message.tool_calls %}\n{%- if tool_call.function is defined %}{% set tool_call = tool_call.function %}{% endif %}\n{{ \"\\n\" + toolcall_begin_token + \"\\n<function=\" + tool_call.name + \">\\n\" }}\n{%- if tool_call.arguments is defined and tool_call.arguments is mapping %}\n{%- for arg_name, arg_value in tool_call.arguments | items %}\n{{ \"<parameter=\" + arg_name + \">\" }}\n{%- set arg_value = arg_value if arg_value is string else arg_value | string %}\n{{ arg_value+\"</parameter>\\n\" }}\n{%- endfor %}\n{%- endif %}\n{{ \"</function>\\n\" + toolcall_end_token }}\n{%- endfor %}\n{{ eos_token }}\n{%- elif message.role in [\"user\", \"system\"] %}\n{{ bos_token + message.role + \"\\n\" + message.content + eos_token }}\n{%- elif message.role == \"assistant\" %}\n{{ bos_token + message.role }}\n{%- if message.reasoning_content is defined and message.reasoning_content is string and message.reasoning_content | trim | length > 0 %}\n{{ \"\\n\" + think_begin_token + message.reasoning_content | trim + think_end_token }}\n{%- endif %}\n{%- if message.content is defined and message.content is string and message.content | trim | length > 0 %}\n{{ \"\\n\" + message.content | trim + eos_token }}\n{%- endif %}\n{# the tool role falls into this branch #}\n{%- else %}\n{{ bos_token + message.role + \"\\n\" + message.content + eos_token }}\n{%- endif %}\n{%- endfor %}\n{# ---------- prompt the model to continue generation ---------- #}\n{%- if add_generation_prompt %}\n{{ bos_token+\"assistant\\n\" }}\n{%- if thinking_budget == 0 %}\n{{ think_begin_token+budget_begin_token }}\n{%- endif %}\n{%- endif %}\n{# Copyright 2025-present Unsloth. Apache 2.0 License. #}"
+ }