DragMesh: Interactive 3D Generation Made Easy
Official repository for the paper DragMesh: Interactive 3D Generation Made Easy.
🌐 Project Website | 💻 Code | 📚 Paper
GAPartNet is the canonical dataset source for all articulated assets used in DragMesh.
https://github.com/user-attachments/assets/428b0d36-50ab-4b46-ab17-679ad22c826b
✨ Introduction
While generative models excel at creating static 3D content, building systems that understand how objects move and respond to interaction remains a fundamental challenge. We present DragMesh, a robust framework for real-time interactive 3D articulation built around a lightweight motion generation core. Our core contribution is a framework that decouples kinematic reasoning from motion generation, leveraging dual quaternions and FiLM conditioning to enable plausible, generative articulation on novel objects without retraining. This decoupled design lets DragMesh run in real time, offering a practical step toward generative 3D intelligence.
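For a concrete picture of the two named ingredients, here is a minimal PyTorch sketch (not the released implementation; module names and shapes are illustrative): FiLM conditioning modulates decoder features with a per-channel scale and shift predicted from a conditioning vector, and a dual quaternion packs a joint's rotation and translation into a single 8-dimensional rigid-transform representation.

import torch
import torch.nn as nn

class FiLM(nn.Module):
    """Feature-wise linear modulation: cond -> (gamma, beta), applied per channel."""
    def __init__(self, cond_dim: int, feat_dim: int):
        super().__init__()
        self.proj = nn.Linear(cond_dim, 2 * feat_dim)

    def forward(self, feats: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        gamma, beta = self.proj(cond).chunk(2, dim=-1)
        if feats.dim() == 3:  # (B, N, C): broadcast over the point/token axis
            gamma, beta = gamma.unsqueeze(1), beta.unsqueeze(1)
        return gamma * feats + beta

def quat_mul(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    aw, ax, ay, az = a.unbind(-1)
    bw, bx, by, bz = b.unbind(-1)
    return torch.stack((
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ), dim=-1)

def dual_quaternion(rot: torch.Tensor, trans: torch.Tensor) -> torch.Tensor:
    """Pack a rotation quaternion and a translation into an 8-dim dual quaternion.
    Real part encodes rotation; dual part 0.5 * (0, t) * rot encodes translation."""
    t = torch.cat((torch.zeros_like(trans[..., :1]), trans), dim=-1)
    return torch.cat((rot, 0.5 * quat_mul(t, rot)), dim=-1)

In DragMesh's framing, the conditioning vector would come from the encoded drag input and the dual quaternion would parameterize the predicted joint motion; the released code may organize this differently.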
⚡ Quick Start
🧩 Environment Setup
We ship a full Conda specification in environment.yml (environment name: dragmesh). It targets Python 3.10, CUDA 12.1, and PyTorch 2.4.1. Create or update via:
conda env create -f environment.yml
conda activate dragmesh
# or update an existing env
conda env update -f environment.yml --prune
The spec already installs trimesh, pyrender, pygltflib, viser, Objaverse, SAPIEN, pytorch3d, and tiny-cuda-nn. If you prefer a minimal setup, install those packages manually before running the scripts.
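If you go the manual route, something along these lines should cover the core dependencies (package names follow the upstream projects; pytorch3d and tiny-cuda-nn usually need builds matched to your exact CUDA/PyTorch combination, so treat this as a starting point rather than a pinned recipe):

pip install trimesh pyrender pygltflib viser objaverse sapien
# pytorch3d and tiny-cuda-nn are typically built from source against your CUDA/PyTorch:
pip install "git+https://github.com/facebookresearch/pytorch3d.git"
pip install "git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch"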
🛠️ Native Extensions
Chamfer distance kernels are required for the VAE loss. Clone and build the upstream project:
git clone https://github.com/ThibaultGROUEIX/ChamferDistancePytorch.git
cd ChamferDistancePytorch
python setup.py install
cd ..
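To verify the build, a quick smoke test along these lines should work (chamfer3D.dist_chamfer_3D and chamfer_3DDist are the module and class names used by the upstream repo at the time of writing; adjust if your checkout differs):

python -c "
import torch
from chamfer3D.dist_chamfer_3D import chamfer_3DDist
cd = chamfer_3DDist()
a, b = torch.rand(1, 1024, 3).cuda(), torch.rand(1, 512, 3).cuda()
d1, d2, idx1, idx2 = cd(a, b)  # squared distances in both directions, plus indices
print('chamfer ok:', d1.mean().item(), d2.mean().item())
"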
🖱️ Custom Mesh Manipulation (manual input)
You can manipulate custom meshes by supplying drag points/vectors directly through the CLI (no viewer UI). Use --manual_joint_type revolute or --manual_joint_type prismatic to force a specific motion family when needed.
# --drag_point: x,y,z of a point on the movable part
# --drag_vector: direction and magnitude of the drag
python inference_pipeline.py \
  --mesh_file assets/cabinet.obj \
  --mask_file assets/cabinet_vertex_labels.npy \
  --mask_format vertex \
  --drag_point 0.12,0.48,0.05 \
  --drag_vector 0.0,0.0,0.2 \
  --manual_joint_type revolute \
  --kpp_checkpoint best_model_kpp.pth \
  --vae_checkpoint best_model.pth \
  --output_dir outputs/cabinet_demo \
  --num_samples 3
🎬 Demo Gallery
Translational drags
Rotational drags
Self-spin / free-spin
🧾 Citation
If you find DragMesh helpful, please cite:
@article{zhang2025dragmesh,
  title={DragMesh: Interactive 3D Generation Made Easy},
  author={Zhang, Tianshan and Zhang, Zeyu and Tang, Hao},
  journal={arXiv preprint arXiv:2512.06424},
  year={2025}
}
🙏 Acknowledgement
We thank the GAPartNet team for the articulated dataset, and upstream projects such as ChamferDistancePytorch, Objaverse, SAPIEN, and PyTorch3D for their open-source contributions.