---
base_model:
- mistralai/Pixtral-12B-2409
- TheDrummer/UnslopNemo-12B-v3
base_model_relation: merge
library_name: transformers
tags:
- mergekit
- merge
- multimodal
- mistral
- pixtral
language:
- en
- fr
- de
- es
- it
- pt
- ru
- zh
- ja
license: other
pipeline_tag: image-text-to-text
---

# Razorback 12B v0.2

<img src="https://huggingface.co/nintwentydo/Razorback-12B-v0.1/resolve/main/razorback.jpg" style="width: 100%; max-width:700px" />

A more robust attempt at merging TheDrummer's UnslopNemo v3 into Pixtral 12B.

It has been really stable in my testing so far, though it needs more testing to see which samplers it does and doesn't like.

It seems to be the best of both worlds: less sloppy, more engaging content, with decent intelligence and visual understanding.

## Merging Approach
First, I loaded Pixtral 12B Base and Mistral Nemo Base to compare their parameter differences.
Looking at the L2 norm / relative difference values, I was able to isolate which parts of Pixtral 12B deviate significantly from Mistral Nemo.
This matters because, while the language model architecture is the same between the two, a lot of vision understanding has been trained into Pixtral's language model and can break very easily.

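The comparison step can be sketched roughly like this, using NumPy arrays as stand-ins for the real model tensors. The helper name and toy values are illustrative, not taken from the actual merge:

```python
import numpy as np

def relative_differences(sd_a, sd_b):
    """Per-parameter relative L2 difference between two state dicts.

    A value near 0 means the parameter barely changed between the two
    models; large values flag tensors (e.g. Pixtral's vision-adapted
    weights) that deviate significantly and are fragile to merging.
    """
    diffs = {}
    for name, a in sd_a.items():
        b = sd_b[name]
        denom = max(np.linalg.norm(a), 1e-12)  # guard against zero norm
        diffs[name] = float(np.linalg.norm(a - b) / denom)
    return diffs

# Toy "state dicts": one unchanged parameter, one shifted parameter.
sd_a = {"w": np.ones(4), "v": np.array([1.0, 2.0])}
sd_b = {"w": np.ones(4), "v": np.array([2.0, 2.0])}
print(relative_differences(sd_a, sd_b))  # w -> 0.0, v -> ~0.447
```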
Then I calculated merging weights for each parameter using an exponential falloff: the smaller the difference, the higher the weight.

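One way to read "exponential falloff" is w = exp(-k · d), where d is a parameter's relative difference; the steepness constant k below is an assumed illustrative value, not one stated in this card:

```python
import math

def falloff_weight(diff, k=10.0):
    """Exponential falloff over the relative difference: parameters that
    barely changed between the base models get a weight near 1, while
    heavily vision-adapted parameters get a weight near 0.

    k is an illustrative steepness constant, not the card's actual value.
    """
    return math.exp(-k * diff)

print(falloff_weight(0.01))  # stable parameter  -> ~0.905
print(falloff_weight(0.50))  # adapted parameter -> ~0.007
```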
I applied this recipe to Pixtral Instruct (Pixtral-12B-2409) and TheDrummer's UnslopNemo-12B-v3. The goal was to infuse as much Drummer goodness as possible without breaking vision input, and it looks like it worked!

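Applying those weights per parameter can be sketched as a simple linear interpolation; this is one plausible reading of the recipe, not the author's confirmed implementation:

```python
import numpy as np

def merge_param(pixtral_w, unslop_w, weight):
    """Blend one parameter tensor: 'weight' is the fraction of UnslopNemo
    mixed into Pixtral, so fragile vision-adapted tensors (weight near 0)
    stay essentially untouched."""
    return (1.0 - weight) * pixtral_w + weight * unslop_w

a = np.zeros(3)
b = np.ones(3)
print(merge_param(a, b, 0.25))  # [0.25 0.25 0.25]
```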
## Usage
Needs more testing to identify the best sampling parameters, but so far ~0.7 temperature + 0.03 min-p has been rock solid.

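For reference, min-p keeps only tokens whose probability is at least `min_p` times the top token's probability, then renormalizes. A minimal NumPy sketch of the filter, with a toy distribution (the values are illustrative):

```python
import numpy as np

def min_p_filter(probs, min_p=0.03):
    """Min-p sampling filter: keep tokens with probability >= min_p times
    the top token's probability, zero out the rest, and renormalize."""
    keep = probs >= min_p * probs.max()
    out = np.where(keep, probs, 0.0)
    return out / out.sum()

p = np.array([0.70, 0.20, 0.08, 0.015, 0.005])
# With min_p=0.03 the cutoff is 0.03 * 0.70 = 0.021, so the two
# low-probability tokens are dropped and the rest are renormalized.
print(min_p_filter(p, 0.03))
```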
Use the included chat template (Mistral). No ChatML support yet.

## Credits
- Mistral for [mistralai/Pixtral-12B-2409](https://huggingface.co/mistralai/Pixtral-12B-2409)
- Unsloth for [unsloth/Pixtral-12B-2409](https://huggingface.co/unsloth/Pixtral-12B-2409) transformers conversion
- TheDrummer for [TheDrummer/UnslopNemo-12B-v3](https://huggingface.co/TheDrummer/UnslopNemo-12B-v3)