mistralai/Mixtral-8x7B-Instruct-v0.1
Safetensors · 5 languages · vllm · mixtral · License: apache-2.0
Add MOE (mixture of experts) tag
#90 · opened Jan 13, 2024 by davanstrien (HF Staff)
base: refs/heads/main ← from: refs/pr/90
Files changed: +2 −0
davanstrien · Jan 13, 2024
No description provided.
Commit: Add MOE (mixture of experts) tag (a69b995c)
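A tag change like this is typically a small edit to the YAML frontmatter at the top of the repository's README.md, which Hugging Face uses to populate model-card tags. A minimal sketch of what the two added lines might look like (the exact tag name, its position, and the surrounding frontmatter are assumptions; the PR view does not show the file contents):

```yaml
---
# README.md frontmatter (hypothetical sketch)
tags:
- vllm
- mixtral
- moe                # tag this PR likely adds (name assumed)
license: apache-2.0
---
```

Because the frontmatter lives inside README.md, any concurrent edit to that file on main can produce the merge conflict reported below.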
Cannot merge: this branch has merge conflicts in README.md.