Update README.md
---
license: apache-2.0
---

## ✅ Converted LoRAs

| Converted LoRAs | Original LoRAs |
|-----------------|----------------|
| [flux_kontext_deblur_comfyui.safetensors](https://huggingface.co/lym00/comfyui_nunchaku_lora_conversion/blob/main/flux_kontext_deblur_comfyui.safetensors) | [Flux.1 Kontext Deblur](https://civitaiarchive.com/models/1737381) |
| [flux_kontext_face_detailer_comfyui.safetensors](https://huggingface.co/lym00/comfyui_nunchaku_lora_conversion/blob/main/flux_kontext_face_detailer_comfyui.safetensors) | [Flux.1 Kontext Face Detailer](https://civitaiarchive.com/models/1752776) |
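The converted files can be pulled straight from this repo. A quick sanity check that the patched keys made it in — a sketch, assuming the `huggingface_hub` and `safetensors` packages are installed:

```python
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

# Fetch one of the converted files listed above from this repo.
path = hf_hub_download(
    repo_id="lym00/comfyui_nunchaku_lora_conversion",
    filename="flux_kontext_deblur_comfyui.safetensors",
)

tensors = load_file(path)
print(len(tensors))                          # 612 per the conversion log below
print([k for k in tensors if "adaLN" in k])  # the two patched adaLN keys
```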

## 🛠️ Conversion References

**Script:** [convert_to_comfyui_lora.py](https://huggingface.co/lym00/comfyui_nunchaku_lora_conversion/blob/main/convert_to_comfyui_lora.py)

Based on:

- **Nunchaku Issue:** [ComfyUI-nunchaku #340](https://github.com/mit-han-lab/ComfyUI-nunchaku/issues/340)
- **Example Gist:** [akedia/e0a132b5...](https://gist.github.com/akedia/e0a132b587e30413665d299ad893a60e)
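
The core of the conversion is adding a dummy `final_layer.adaLN_modulation_1` LoRA pair when one is missing, so loaders that expect those keys stop erroring. A minimal sketch of that patch step — the helper name `patch_adaln`, the zero-filled dummy weights, and the default rank are assumptions here, not the script's actual code:

```python
import torch
from safetensors.torch import load_file, save_file

def patch_adaln(in_path: str, out_path: str, rank: int = 16) -> None:
    """Add zero-filled final_layer.adaLN LoRA weights if they are missing."""
    tensors = load_file(in_path)
    print(f"✅ Loaded {len(tensors)} tensors from: {in_path}")

    down_key = "lora_unet_final_layer_adaLN_modulation_1.lora_down.weight"
    up_key = "lora_unet_final_layer_adaLN_modulation_1.lora_up.weight"

    if down_key not in tensors and up_key not in tensors:
        # Shapes match the example log below: [16, 3072] down, [64, 16] up.
        # Zeros (an assumption) keep the added pair a no-op at inference.
        tensors[down_key] = torch.zeros(rank, 3072)
        tensors[up_key] = torch.zeros(64, rank)

    save_file(tensors, out_path)
    print(f"✅ Patched file saved to: {out_path} ({len(tensors)} tensors)")

patch_adaln("Flux_kontext_deblur.safetensors",
            "flux_kontext_deblur_comfyui.safetensors")
```

Because the dummy weights are zero, `lora_up @ lora_down` presumably contributes nothing to the output; the patch only satisfies key lookups.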

---

## 🔄 Example Conversion Log

```bash
Running convert_to_comfyui_lora.py

🔄 Universal final_layer.adaLN LoRA patcher (.safetensors)

Enter path to input LoRA .safetensors file: Flux_kontext_deblur.safetensors
Enter path to save patched LoRA .safetensors file: flux_kontext_deblur_comfyui.safetensors

✅ Loaded 610 tensors from: Flux_kontext_deblur.safetensors

🔑 Found final_layer-related keys:
  - lora_unet_final_layer_linear.lora_down.weight
  - lora_unet_final_layer_linear.lora_up.weight

🔍 Checking for final_layer keys with prefix 'lora_unet_final_layer'
  Linear down: lora_unet_final_layer_linear.lora_down.weight
  Linear up: lora_unet_final_layer_linear.lora_up.weight
✅ Has final_layer.linear: True
✅ Has final_layer.adaLN_modulation_1: False

✅ Added dummy adaLN weights:
  - lora_unet_final_layer_adaLN_modulation_1.lora_down.weight (shape: torch.Size([16, 3072]))
  - lora_unet_final_layer_adaLN_modulation_1.lora_up.weight (shape: torch.Size([64, 16]))

✅ Patched file saved to: flux_kontext_deblur_comfyui.safetensors
Total tensors now: 612

🔍 Verifying patched keys:
  - lora_unet_final_layer_adaLN_modulation_1.lora_down.weight
  - lora_unet_final_layer_adaLN_modulation_1.lora_up.weight
  - lora_unet_final_layer_linear.lora_down.weight
  - lora_unet_final_layer_linear.lora_up.weight

✅ Contains adaLN after patch: True
```