If you like our project, please give us a star ⭐ on GitHub for the latest updates.

Webpage · arXiv · License: MIT


😮 Highlights

🔥 Generation-Reconstruction cycle for the unified diffusion process

  • The pre-trained 2D diffusion model, trained on billions of web images, can generate high-quality textures.
  • The reconstruction model ensures consistency across multiple views.
  • We cyclically utilize a 2D diffusion-based generation module and a feed-forward 3D reconstruction module during the multi-step diffusion process.
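The cycle above can be sketched as follows. This is a minimal illustrative sketch, not the actual Cycle3D implementation: the function names (`denoise_step`, `reconstruct_3d`, `render_views`) and the toy numeric operations are placeholders standing in for a real 2D diffusion step, a feed-forward 3D reconstructor, and a differentiable renderer.

```python
def denoise_step(views, t):
    """Stand-in for one 2D diffusion denoising step (toy refinement)."""
    return [v * 0.9 + 0.1 for v in views]

def reconstruct_3d(views):
    """Stand-in for a feed-forward 3D reconstruction from multi-views."""
    return sum(views) / len(views)  # toy "3D representation"

def render_views(rep, n_views):
    """Stand-in for rendering the 3D representation back to each view."""
    return [rep for _ in range(n_views)]

def generation_reconstruction_cycle(init_views, n_steps):
    """Alternate 2D generation (quality) and 3D reconstruction
    (multi-view consistency) across the diffusion steps."""
    views = init_views
    rep = reconstruct_3d(views)
    for t in range(n_steps, 0, -1):
        views = denoise_step(views, t)        # 2D module refines each view
        rep = reconstruct_3d(views)           # 3D module fuses the views
        views = render_views(rep, len(views)) # renderings condition next step
    return rep, views
```

In this sketch the re-rendering step is what enforces agreement: after the first pass through the 3D module, every view is derived from the same shared representation, so subsequent denoising steps cannot drift apart across views.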

🚩 Updates

Welcome to watch 👀 this repository for the latest updates.

[2024.7.28]: We released our paper, Cycle3D, on arXiv.

[2024.7.28]: Released the project page.

  • Code release.
  • Online Demo.

🤗 Demo

Coming soon!

🚀 Image-to-3D Results

Qualitative comparison


Quantitative comparison


👍 Acknowledgement

This work builds on many amazing research works and open-source projects; thanks to all the authors for sharing!

✏️ Citation

If you find our paper and code useful in your research, please consider giving a star ⭐ and citation 📝.

@misc{tang2024cycle3dhighqualityconsistentimageto3d,
      title={Cycle3D: High-quality and Consistent Image-to-3D Generation via Generation-Reconstruction Cycle}, 
      author={Zhenyu Tang and Junwu Zhang and Xinhua Cheng and Wangbo Yu and Chaoran Feng and Yatian Pang and Bin Lin and Li Yuan},
      year={2024},
      eprint={2407.19548},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2407.19548}, 
}