---
tags:
- llama
- alpaca
- cot
- vicuna
- uncensored
- merge
- mix
---

## 13B-BlueMethod

## Composition:
BlueMethod is a somewhat convoluted experiment in tiered merging.
To push the experiment further, the models were combined with a custom script
that randomized the percentage of each layer merged from one model into the
next (see the sketch after the tier list below). This is a warmup for a larger
project.
[Tier One and Two Merges not released; internal naming convention]

Tier One Merges:
13B-Metharme+13B-Nous-Hermes=13B-Methermes
13B-Vicuna-cocktail+13B-Manticore=13B-Vicortia
13B-HyperMantis+13B-Alpacino=13B-PsychoMantis

Tier Two Merges:
13B-Methermes+13B-Vicortia=13B-Methphistopheles
13B-PsychoMantis+13B-BlueMoonRP=13B-BlueMantis

Tier Three Merge:
13B-Methphistopheles+13B-BlueMantis=13B-BlueMethod
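
The merge script itself has not been released; the snippet below is only a minimal sketch of the randomized per-layer interpolation described above, assuming two fine-tunes that share the LLaMA-13B architecture and state-dict keys. The function name, ratio range, and fp16 loading are illustrative assumptions, not the author's actual code.

```python
# Minimal sketch of a randomized per-layer merge (NOT the released script).
# Assumes both checkpoints share the same architecture and state-dict keys.
import random

import torch
from transformers import AutoModelForCausalLM


def merge_random_layers(path_a: str, path_b: str, out_path: str, seed: int = 0) -> None:
    random.seed(seed)
    model_a = AutoModelForCausalLM.from_pretrained(path_a, torch_dtype=torch.float16)
    model_b = AutoModelForCausalLM.from_pretrained(path_b, torch_dtype=torch.float16)

    state_a = model_a.state_dict()
    state_b = model_b.state_dict()

    with torch.no_grad():
        for name, tensor_a in state_a.items():
            # Draw a fresh blend ratio for every parameter tensor, then
            # linearly interpolate: merged = (1 - r) * A + r * B.
            r = random.uniform(0.0, 1.0)
            tensor_a.copy_((1.0 - r) * tensor_a.float() + r * state_b[name].float())

    # state_dict() tensors share storage with model_a, so model_a now holds the merge.
    model_a.save_pretrained(out_path)
```

In practice a merge like this would be run once per combination in the tier lists above, with each tier's output fed back in as an input to the next tier.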

## Use:
Multiple instruct models and model composites were combined to make the final model,
so it is highly open to experimental prompting; both the Alpaca and Vicuna instruct
formats can be used (examples below), and it can produce interesting results.
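
For reference, these are the conventional Alpaca and Vicuna instruct formats the card refers to; the placeholder text is illustrative and not specific to this model.

```
Alpaca-style:

### Instruction:
{your instruction}

### Response:


Vicuna-style:

USER: {your instruction}
ASSISTANT:
```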

## Credits: Language Models and LoRAs Used

13B-Metharme by PygmalionAI

https://www.huggingface.co/PygmalionAI/metharme-13b

13B-Nous-Hermes by NousResearch

https://www.huggingface.co/NousResearch/Nous-Hermes-13b

13B-Vicuna-cocktail by reeducator

https://www.huggingface.co/reeducator/vicuna-13b-cocktail

13B-Manticore by openaccess-ai-collective

https://www.huggingface.co/openaccess-ai-collective/manticore-13b

13B-HyperMantis and 13B-Alpacino by Digitous

https://huggingface.co/digitous/13B-HyperMantis
https://huggingface.co/digitous/Alpacino13b

Also thanks to Meta for LLaMA.

Each model and LoRA was hand-picked and considered for what it could contribute to this ensemble.
Thanks to each and every one of you for your incredible work developing some of the best things
to come out of this community.