machinez committed
Commit 673306e · verified · 1 parent: f6075ef

Update README.md

more weights. tag. misspelling

Files changed (1): README.md (+5 −3)

README.md CHANGED
@@ -5,6 +5,8 @@ license_name: mnpl
 license_link: https://mistral.ai/licences/MNPL-0.1.md
 tags:
 - code
+- text-generation-inference
+- mistral
 language:
 - code
 ---
@@ -27,17 +29,17 @@ python3 convert.py -i ~/models/Codestral-22B-v0.1-hf/ -o /tmp/exl2/ -nr -m ~/mod
 ```
 
 ## Quantization
-- [3.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/3_0)
+- [3.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/3_0) 8.75gb
 - [4.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/4_0) 11.5gb
 - [5.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/5_0) 14.0gb
-- [5.5bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/5_5)
+- [5.5bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/5_5) 15.6gb
 - [6.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/6_0) 17.0gb
 - [7.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/7_0) 19.7gb
 - [8.0bpw](https://huggingface.co/machinez/Codestral-22B-v0.1-exl2/tree/8_0) 21.0gb
 
 # Model Card for Codestral-22B-v0.1
 
-Codestrall-22B-v0.1 is trained on a diverse dataset of 80+ programming languages, including the most popular ones, such as Python, Java, C, C++, JavaScript, and Bash (more details in the [Blogpost](https://mistral.ai/news/codestral/)). The model can be queried:
+Codestral-22B-v0.1 is trained on a diverse dataset of 80+ programming languages, including the most popular ones, such as Python, Java, C, C++, JavaScript, and Bash (more details in the [Blogpost](https://mistral.ai/news/codestral/)). The model can be queried:
 - As instruct, for instance to answer any questions about a code snippet (write documentation, explain, factorize) or to generate code following specific indications
 - As Fill in the Middle (FIM), to predict the middle tokens between a prefix and a suffix (very useful for software development add-ons like in VS Code)
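The file sizes added in this commit track the bits-per-weight (bpw) of each quant roughly linearly. A minimal sketch of that relationship, assuming ~22.2e9 parameters for Codestral-22B (the exact count and the listed sizes also include higher-precision tensors and format overhead, so real files differ somewhat from the estimate):

```python
# Rough exl2 file-size estimate: parameters * bits-per-weight / 8 bits-per-byte.
# The 22.2e9 parameter count is an assumption for illustration; embeddings and
# other tensors kept at higher precision make the actual files somewhat larger.
def est_size_gb(n_params: float, bpw: float) -> float:
    """Approximate quantized weight size in gigabytes."""
    return n_params * bpw / 8 / 1e9

for bpw in (3.0, 4.0, 5.0, 5.5, 6.0, 7.0, 8.0):
    print(f"{bpw:.1f}bpw ~ {est_size_gb(22.2e9, bpw):.1f} GB")
```

This kind of back-of-the-envelope check is mainly useful for picking the largest bpw that still leaves headroom for the KV cache on a given GPU.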