Wan2.2-TI2V-5B-Turbo


Wan2.2-TI2V-5B-Turbo applies efficient step distillation and CFG distillation to Wan2.2-TI2V-5B.

Built on the Self-Forcing framework, it trains TI2V-5B for 4-step generation. The distilled model produces 121-frame videos (about 5 seconds at 24 FPS) at 1280×704 resolution in just 4 denoising steps, with no classifier-free guidance (CFG) needed at inference.

To the best of our knowledge, Wan2.2-TI2V-5B-Turbo is the first open-source release of a distilled I2V version of Wan2.2-TI2V-5B.

🔥Video Demos

The demo videos can be reproduced using examples/example.csv.

📣 Updates

  • 2025/08/06 🔥Wan2.2-TI2V-5B-Turbo has been released here.

🐍 Installation

Create a conda environment and install dependencies:

conda create -n wanturbo python=3.10 -y
conda activate wanturbo
pip install -r requirements.txt
pip install flash-attn --no-build-isolation
python setup.py develop
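
Optionally, you can verify that the key dependencies are importable before moving on. The snippet below is a minimal sanity check of our own (illustrative, not part of the repository's scripts):

# env_check.py - minimal environment sanity check (illustrative, not part of the repo)
import torch

print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

try:
    import flash_attn  # installed above via `pip install flash-attn --no-build-isolation`
    print("flash-attn:", flash_attn.__version__)
except ImportError:
    print("flash-attn is not installed")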

🚀Quick Start

Checkpoint Download

pip install "huggingface_hub[hf_transfer]"
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download Wan-AI/Wan2.2-TI2V-5B --local-dir wan_models/Wan2.2-TI2V-5B
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download quanhaol/Wan2.2-TI2V-5B-Turbo --local-dir wan_models/Wan2.2-TI2V-5B-Turbo
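
If you prefer downloading from Python rather than the CLI, the equivalent huggingface_hub calls are shown below (same repositories and target directories as above; set HF_HUB_ENABLE_HF_TRANSFER=1 in the environment for faster transfers):

# download_checkpoints.py - Python equivalent of the CLI commands above
from huggingface_hub import snapshot_download

snapshot_download(repo_id="Wan-AI/Wan2.2-TI2V-5B", local_dir="wan_models/Wan2.2-TI2V-5B")
snapshot_download(repo_id="quanhaol/Wan2.2-TI2V-5B-Turbo", local_dir="wan_models/Wan2.2-TI2V-5B-Turbo")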

DMD Training

bash running_scripts/train/Wan2.2/dmd.sh

Our training run takes 4,000 iterations and completes in under two days on 16 A100 GPUs.
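
For readers unfamiliar with DMD (Distribution Matching Distillation), the sketch below illustrates the core generator update with toy placeholder networks. It is purely conceptual and assumes the standard DMD formulation; the actual losses, noise schedules, and models are defined by the training code launched from running_scripts/train/Wan2.2/dmd.sh.

# dmd_toy.py - conceptual sketch of a DMD generator update (toy networks, NOT the Wan2.2 training code)
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Placeholders: in practice these are the few-step generator, the frozen
# teacher ("real" score model), and a trainable "fake" score model that
# tracks the generator's output distribution.
generator = torch.nn.Linear(16, 16)
real_score = torch.nn.Linear(16, 16)   # frozen teacher
fake_score = torch.nn.Linear(16, 16)   # fits the generator distribution

noise = torch.randn(4, 16)
x = generator(noise)                   # few-step generator sample

# Both score models give a denoised estimate of x (noise injection and
# timestep conditioning omitted for brevity).
with torch.no_grad():
    pred_real = real_score(x)
    pred_fake = fake_score(x)

# DMD generator gradient: difference of the two score estimates,
# applied through a stop-gradient reconstruction target.
grad = pred_fake - pred_real
target = (x - grad).detach()
loss_g = 0.5 * F.mse_loss(x, target)
loss_g.backward()
print("generator loss:", loss_g.item())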

Few-step Inference

bash running_scripts/inference/Wan2.2/i2v_fewstep.sh
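
Conceptually, few-step inference runs only 4 denoising steps, with a single forward pass per step (no CFG, so no second unconditional pass). The toy loop below uses a dummy model purely to illustrate that structure; the real pipeline, conditioning, and scheduler live in the script above.

# fewstep_toy.py - illustrative 4-step sampling loop (dummy model, not the repo's pipeline)
import torch

model = torch.nn.Linear(8, 8)            # stand-in for the distilled video diffusion model
x = torch.randn(1, 8)                    # start from pure noise (real pipeline: a video latent)

timesteps = torch.linspace(1.0, 0.0, 5)  # 4 steps from t=1 (noise) to t=0 (clean)
for t_cur, t_next in zip(timesteps[:-1], timesteps[1:]):
    with torch.no_grad():
        v = model(x)                     # one forward pass per step; no CFG second pass
    x = x + (t_next - t_cur) * v         # Euler-style update along the predicted direction

print("final sample shape:", x.shape)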

🤝 Acknowledgements

We would like to express our gratitude to the following open-source projects, which have been instrumental in the development of this work:

Special thanks to the contributors of these libraries for their hard work and dedication!

📚 Contact

If you have any suggestions or find our work helpful, feel free to contact us:

Email: [email protected] or [email protected] or [email protected]

If you find our work useful, please consider giving this GitHub repository a star and citing it:

@article{li2025magicmotion,
  title={MagicMotion: Controllable Video Generation with Dense-to-Sparse Trajectory Guidance},
  author={Li, Quanhao and Xing, Zhen and Wang, Rui and Zhang, Hui and Dai, Qi and Wu, Zuxuan},
  journal={arXiv preprint arXiv:2503.16421},
  year={2025}
}