| Method Name | Description | Use Case / Notes |
| --------------------------------- | ----------------------------------------------------------------------- | -------------------------------------------------------------- |
| **Full Fine-Tuning** | Update all weights of the pretrained model on your dataset | Best for large datasets; very GPU-intensive |
| **Feature Extraction** | Freeze the backbone (encoder) and train only the decoder / head | Good for small datasets and limited GPU memory (sketch below) |
| **LoRA (Low-Rank Adaptation)** | Inject small trainable low-rank matrices alongside frozen attention weights | Extremely memory-efficient; works on small datasets (sketch below) |
| **DreamBooth** | Fine-tune Stable Diffusion to generate **custom subjects / styles** | Specialized for image personalization |
| **Adapter Tuning** | Insert small bottleneck adapter modules into transformer layers | Similar goal to LoRA but more modular (sketch below) |
| **Prompt Tuning / Prefix Tuning** | Learn soft prompt embeddings / prefix tokens while all main model weights stay frozen | Works well for text and multimodal models (sketch below) |
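Minimal sketches of a few of these methods follow. First, feature extraction in PyTorch: the pretrained backbone is frozen and only a newly attached classification head is trained. The ResNet-50 backbone and the 10-class head are assumptions for illustration.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a pretrained backbone and freeze every weight in it.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head; this is the only part that trains.
num_classes = 10  # assumption: adjust to your dataset
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Hand the optimizer only the trainable head parameters.
optimizer = torch.optim.AdamW(model.fc.parameters(), lr=1e-3)
```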
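For LoRA, a sketch using Hugging Face's `peft` library, assuming GPT-2 as the base model (its fused attention projection is named `c_attn`). Only the injected low-rank matrices receive gradients; the pretrained weights stay frozen.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # assumed base model

# Low-rank update matrices are injected alongside the frozen attention
# weights; after training they can be merged back into the base weights.
config = LoraConfig(
    r=8,                        # rank of the update matrices
    lora_alpha=16,              # scaling applied to the update
    target_modules=["c_attn"],  # GPT-2's fused QKV projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of the total
```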
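Adapter tuning can be sketched from scratch: a small residual bottleneck module is inserted after a transformer sublayer, and only the adapters train while the surrounding layers stay frozen. The bottleneck size of 64 is an assumption.

```python
import torch.nn as nn

class Adapter(nn.Module):
    """Residual bottleneck adapter: down-project, nonlinearity, up-project."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.GELU()
        # Zero-init the up-projection so the adapter starts as an
        # identity mapping and does not disturb the pretrained model.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x):
        # Residual connection: the frozen layer's output passes through
        # unchanged, plus a small learned correction.
        return x + self.up(self.act(self.down(x)))
```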
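Prompt tuning is also available in `peft`, but the core idea fits in a few lines of plain PyTorch: freeze everything and learn a bank of soft prompt vectors that get prepended to the input embeddings. GPT-2 and 8 virtual tokens are assumptions here.

```python
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")  # assumed base model
for param in model.parameters():
    param.requires_grad = False  # the main weights never change

num_virtual_tokens = 8  # assumption
embed_dim = model.get_input_embeddings().embedding_dim

# The only trainable parameters: a bank of soft prompt vectors.
soft_prompt = nn.Parameter(torch.randn(num_virtual_tokens, embed_dim) * 0.02)

def forward_with_prompt(input_ids):
    token_embeds = model.get_input_embeddings()(input_ids)
    prompt = soft_prompt.unsqueeze(0).expand(token_embeds.size(0), -1, -1)
    # Prepend the learned prompt to every sequence's embeddings.
    return model(inputs_embeds=torch.cat([prompt, token_embeds], dim=1))

optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)
```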