
DMM: Building a Versatile Image Generation Model via Distillation-Based Model Merging


Official repository of our paper "DMM: Building a Versatile Image Generation Model via Distillation-Based Model Merging".

Introduction

We propose DMM, a score-distillation-based model merging paradigm that compresses multiple text-to-image (T2I) models into a single versatile T2I model.
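In essence, a single student is trained to match whichever teacher a model index selects, so one set of weights serves all merged models. Below is a minimal NumPy sketch of that idea; the linear "teachers", the index-conditioning scheme, and all dimensions are illustrative assumptions, not the paper's actual architecture or training code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical "teacher" linear maps, toy stand-ins for the
# specialized T2I score networks being merged.
teachers = [rng.normal(size=(4, 4)) for _ in range(2)]

# Single student: one weight matrix applied to [x, i * x], where
# i in {0, 1} is the model index. With index 0 only the first four
# columns act; with index 1 the last four are added on top, so one
# matrix can imitate both teachers.
W = np.zeros((4, 8))

def student(x, i):
    inp = np.concatenate([x, i * x])
    return W @ inp

# Distillation: regress the student's output onto the output of a
# randomly selected teacher, conditioning on that teacher's index.
lr = 0.02
for _ in range(2000):
    x = rng.normal(size=4)
    i = int(rng.integers(2))
    inp = np.concatenate([x, i * x])
    err = W @ inp - teachers[i] @ x
    W -= lr * np.outer(err, inp)  # gradient of 0.5 * ||err||^2 w.r.t. W

# The single student now approximates both teachers on unseen inputs.
x = rng.normal(size=4)
err0 = np.linalg.norm(student(x, 0) - teachers[0] @ x)
err1 = np.linalg.norm(student(x, 1) - teachers[1] @ x)
```

The key design point this toy mirrors is that the merged capacity lives in one shared parameter set, with a cheap index signal switching behavior, rather than keeping one full model per style.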

Checkpoints

HuggingFace🤗: https://huggingface.co/MCG-NJU/DMM.

Usage

Install required packages with:

pip install -r requirements.txt

and initialize an Accelerate environment with:

accelerate config

An example training launch script is provided in train.sh:

sh train.sh

An example inference script is provided in inference.py:

python inference.py

Visualization

Results

Results combined with character LoRA

Results of interpolation between two styles

TODO

  • Pre-training code.
  • Model weight release.
  • Incremental training code.
  • Inference code with Diffusers.
  • JourneyDB dataset code.
  • Evaluation code.
  • Online demo.
  • ComfyUI plugins.

Reference

@article{song2025dmm,
  title={DMM: Building a Versatile Image Generation Model via Distillation-Based Model Merging},
  author={Song, Tianhui and Feng, Weixin and Wang, Shuai and Li, Xubin and Ge, Tiezheng and Zheng, Bo and Wang, Limin},
  journal={arXiv preprint arXiv:2504.12364},
  year={2025}
}
