---
inference: false
license: other
license_name: microsoft-research-license
license_link: https://huggingface.co/WizardLM/WizardMath-7B-V1.1/resolve/main/LICENSE
language:
- en
pipeline_tag: text-generation
---

## WizardMath: Empowering Mathematical Reasoning for Large Language Models via Reinforced Evol-Instruct (RLEIF)

<p style="font-size:28px;" align="center">
🏠 <a href="https://wizardlm.github.io/" target="_blank">Home Page</a>
</p>
<p align="center">
🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> • 🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a>
</p>

| Model | Checkpoint | Paper | GSM8k | MATH | Online Demo | License |
| ----- | ---------- | ----- | ----- | ---- | ----------- | ------- |
| **WizardMath-7B-V1.1** | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.1" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> | **83.24** | **30.0** | [Demo](http://47.103.63.15:50080/) | <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.1/resolve/main/LICENSE" target="_blank">Microsoft Research License</a> |
| WizardMath-70B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-70B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> | **81.6** | **22.7** | [Demo](http://47.103.63.15:50083/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2</a> |
| WizardMath-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> | **63.9** | **14.0** | [Demo](http://47.103.63.15:50082/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2</a> |
| WizardMath-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> | **54.9** | **10.7** | [Demo](http://47.103.63.15:50080/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2</a> |

## [12/19/2023] Comparing WizardMath-7B-V1.1 with other 7B-size math LLMs

🔥 ❗<b>Note on model system prompt usage:</b>

Please strictly use **the same system prompts** as ours; we do not guarantee the accuracy of **quantized versions**.

**Default version:**

```
"Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:"
```

**CoT version:** (❗For **simple** math questions, we do NOT recommend using the CoT prompt.)

```
"Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response: Let's think step by step."
```
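
To make the two formats concrete, here is a minimal sketch (ours, not from the official demo; the helper name `build_prompt` is illustrative) that fills the `{instruction}` placeholder and optionally appends the CoT suffix:

```python
# Illustrative helper (not part of the official demo code): wraps a question
# in the default template above, optionally appending the CoT suffix.
TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str, cot: bool = False) -> str:
    prompt = TEMPLATE.format(instruction=instruction)
    if cot:
        # The CoT version only appends this suffix; skip it for simple questions.
        prompt += " Let's think step by step."
    return prompt

print(build_prompt("What is 25 * 16?"))                   # default version
print(build_prompt("Solve x^2 - 5x + 6 = 0.", cot=True))  # CoT version
```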

## WizardMath Inference Demo Script

We provide the WizardMath inference demo code [here](https://github.com/nlpxucan/WizardLM/tree/main/demo).
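
For quick local experimentation, the following is a minimal sketch using Hugging Face `transformers` with the default prompt format. The linked demo is the authoritative reference; the repo id and generation settings below are assumptions for illustration, not the official configuration.

```python
# Minimal inference sketch (illustrative; see the official demo for the
# supported setup). Assumes a GPU with enough memory for fp16 7B weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "WizardLM/WizardMath-7B-V1.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the `accelerate` package
)

question = "James buys 5 packs of beef that are 4 pounds each. The price of beef is $5.50 per pound. How much did he pay?"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{question}\n\n### Response:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens, not the echoed prompt.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```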

## Citation

Please cite this repo if you use its data, methods, or code.

```
@article{luo2023wizardmath,
  title={WizardMath: Empowering Mathematical Reasoning for Large Language Models via Reinforced Evol-Instruct},
  author={Luo, Haipeng and Sun, Qingfeng and Xu, Can and Zhao, Pu and Lou, Jianguang and Tao, Chongyang and Geng, Xiubo and Lin, Qingwei and Chen, Shifeng and Zhang, Dongmei},
  journal={arXiv preprint arXiv:2308.09583},
  year={2023}
}
```