---
base_model:
- uukuguy/speechless-code-mistral-7b-v1.0
- upaya07/Arithmo2-Mistral-7B
library_name: transformers
tags:
- mergekit
- merge
---
# CodeCalc-Mistral-7B
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). It combines [uukuguy/speechless-code-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0), a code-oriented model, with [upaya07/Arithmo2-Mistral-7B](https://huggingface.co/upaya07/Arithmo2-Mistral-7B), a math-oriented model, using the TIES merge method.
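A merge like this can be reproduced with mergekit's command-line entry point. This is a sketch, not a verified recipe: it assumes mergekit is installed, the configuration below is saved as `config.yaml`, and the output directory name is arbitrary.

```shell
# Install mergekit and run the merge described by config.yaml.
# --cuda is optional; drop it to merge on CPU.
pip install mergekit
mergekit-yaml config.yaml ./CodeCalc-Mistral-7B --cuda
```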
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: uukuguy/speechless-code-mistral-7b-v1.0
dtype: bfloat16
merge_method: ties
models:
  - model: uukuguy/speechless-code-mistral-7b-v1.0
  - model: upaya07/Arithmo2-Mistral-7B
    parameters:
      density: [0.25, 0.35, 0.45, 0.35, 0.25]
      weight: [0.1, 0.25, 0.5, 0.25, 0.1]
parameters:
  int8_mask: true
```
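The `density` and `weight` entries are gradients: mergekit expands a short anchor list into one value per transformer layer. The sketch below assumes piecewise-linear interpolation across the model depth (how mergekit handles list-valued parameters, to the best of my understanding); `interpolate_gradient` is a hypothetical helper name, not a mergekit API.

```python
def interpolate_gradient(anchors, num_layers):
    """Expand a short anchor list (e.g. [0.1, 0.25, 0.5, 0.25, 0.1])
    into num_layers values via piecewise-linear interpolation."""
    if num_layers == 1:
        return [anchors[0]]
    values = []
    for layer in range(num_layers):
        # Position of this layer on the anchor axis [0, len(anchors) - 1].
        pos = layer * (len(anchors) - 1) / (num_layers - 1)
        lo = int(pos)
        hi = min(lo + 1, len(anchors) - 1)
        frac = pos - lo
        values.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return values

# Per-layer weights for Arithmo2 on a 32-layer Mistral-7B.
weights = interpolate_gradient([0.1, 0.25, 0.5, 0.25, 0.1], num_layers=32)
```

Under this reading, the heaviest Arithmo2 contribution (weight near 0.5) lands in the middle layers and the lightest (0.1) at the first and last layers.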