---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- code
- QA
- reasoning
---
# Model Card
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
A powerful 4x7B Mixtral-style mixture-of-experts (MoE) model, built from the following Mistral-based models for improved accuracy and precision in general reasoning, QA, and code:

- HuggingFaceH4/zephyr-7b-beta
- mistralai/Mistral-7B-Instruct-v0.2
- teknium/OpenHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-3
- **Developed by:** NEXT AI
- **Funded by:** Zpay Labs Pvt Ltd.
- **Model type:** Mixtral-style MoE of four Mistral 7B models (4x7B)
- **Language(s) (NLP):** English (code, reasoning, QA)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Demo:** https://nextai.co.in
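
## How to Get Started with the Model

The model can be loaded with the `transformers` library. Below is a minimal usage sketch; the repository id is a placeholder (the exact model id is not stated in this card), and the dtype, device placement, and generation settings are illustrative assumptions.

```python
# Minimal usage sketch with the Hugging Face transformers library.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id; replace with the actual model id on the Hub.
model_id = "nextai-team/<model-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumes a GPU with enough memory for a 4x7B MoE
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```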