---
language:
- sv
- en
license: mit
tags:
- pretrained
pipeline_tag: text-generation
widget:
- text: Jag tycker att det är roligt med
---
# 🐈⬛ Mistral-7B-v0.1-flashback-v2
|  | |
Mistral-7B-v0.1-flashback-v2 continues the pretraining of the base Mistral-7B-v0.1 model on roughly 40 GB of forum threads from the Swedish website flashback.org.
It is a full finetune, trained for one epoch.