Teaser results: (a) guided by SD; (b) guided by MVDream.
The GaussianDreamer extension for threestudio, written by Xinhua Cheng. To use it, install threestudio first, then install this extension in the threestudio custom directory.
cd custom
git clone https://github.com/cxh0519/threestudio-gaussiandreamer.git
cd threestudio-gaussiandreamer
git clone --recursive https://github.com/ashawkey/diff-gaussian-rasterization
git clone https://github.com/DSaurus/simple-knn.git
pip install ./diff-gaussian-rasterization
pip install ./simple-knn
pip install open3d
# If you want to export mesh, please install pymeshlab
pip install pymeshlab
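With pymeshlab installed, a mesh can be exported after training through threestudio's generic --export mode. The trial directory, checkpoint name, and config path below are assumptions based on threestudio's usual output layout, so adjust them to your actual run; this sketch only assembles and prints the command to execute:

```shell
# Hedged sketch: build and print a mesh-export command using threestudio's
# --export mode. TRIAL is a placeholder for your actual trial directory.
TRIAL="outputs/gaussiandreamer/an_amigurumi_motorcycle"   # assumption: adjust to your run
EXPORT_CMD="python launch.py --config $TRIAL/configs/parsed.yaml --export --gpu 0 resume=$TRIAL/ckpts/last.ckpt"
echo "$EXPORT_CMD"
```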
Please also install the MVDream extension, the shap-e extension, and my version of the lrm extension in the threestudio custom directory, following their respective instructions.
# SD2.1 + shap-e initialize
python launch.py --config custom/threestudio-gaussiandreamer/configs/gaussiandreamer.yaml --train --gpu 0 system.prompt_processor.prompt="an amigurumi motorcycle" system.geometry.geometry_convert_from="shap-e:a motorcycle"
# MVDream + lrm initialize
python launch.py --config custom/threestudio-gaussiandreamer/configs/gaussiandreamer_mvdream.yaml --train --gpu 0 system.prompt_processor.prompt="an astronaut wearing a blue suit" system.geometry.geometry_convert_from="lrm:an astronaut"
[Notice] Different 2D diffusion guidances (SD, MVDream) and initialization methods (shap-e, lrm) can be combined arbitrarily.
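For example, the two commands above can be cross-combined: the config file selects the 2D guidance, while the shap-e:/lrm: prefix in geometry_convert_from selects the initializer. A sketch that assembles such a combined command (the prompt strings are illustrative placeholders, not from the original examples):

```shell
# MVDream guidance (via the mvdream config) combined with shap-e initialization.
# PROMPT and INIT are hypothetical placeholders; substitute your own text.
CONFIG="custom/threestudio-gaussiandreamer/configs/gaussiandreamer_mvdream.yaml"
PROMPT="a wooden rocking chair"
INIT="shap-e:a rocking chair"
CMD="python launch.py --config $CONFIG --train --gpu 0 system.prompt_processor.prompt=\"$PROMPT\" system.geometry.geometry_convert_from=\"$INIT\""
echo "$CMD"
```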
Differences from the official GaussianDreamer
- Slight hyperparameter differences.
- Supports MVDream as 2D guidance.
- Supports lrm initialization.
Differences from threestudio-3dgs
- Faster generation (around 10 min on an A100).
- Adds the Grow & Perturb technique proposed by GaussianDreamer.
- Supports lrm+MVDream initialization instead of lrm+SDXL. Compared to SDXL, MVDream generates more appropriate front/side view images for a given text prompt.
@article{GaussianDreamer,
title={GaussianDreamer: Fast Generation from Text to 3D Gaussian Splatting with Point Cloud Priors},
author={Taoran Yi and Jiemin Fang and Guanjun Wu and Lingxi Xie and Xiaopeng Zhang and Wenyu Liu and Qi Tian and Xinggang Wang},
journal={arXiv preprint arXiv:2310.08529},
year={2023}
}