I-Scene: 3D Instance Models are Implicit Generalizable Spatial Learners
🌟 CVPR 2026
🏠 Project Page | 📄 Paper | 🤗 Model | 📦 Dataset | 🎮 Online Demo
Lu Ling1, Yunhao Ge2, Yichen Sheng2, Aniket Bera1
1Purdue University 2NVIDIA Research
🌟 Overview
I-Scene reprograms a pre-trained 3D instance generator to act as a scene-level learner, replacing dataset-bounded supervision with model-centric spatial supervision. This unlocks the generator's transferable spatial knowledge, enabling generalization to unseen layouts and novel object compositions.
🔑 Key Features
- Model Flexibility: A pre-trained 3D instance generator can be directly reprogrammed as a scene-level spatial learner, without scene-level annotations.
- Transferable Spatial Prior: The reprogrammed model's spatial prior provides a rich learning signal for inferring proximity, support, and symmetry from purely geometric cues.
- Data Independence: The model learns spatial knowledge on non-semantic scenes from randomly composed objects, removing dependency on annotated data.
- Strong Generalizability: Generalizes to unseen layouts and varied spatial relations in a single feed-forward pass, without per-scene optimization.
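To make the "purely geometric cues" point concrete, here is a toy sketch of how coarse relations such as proximity and support can be read off object geometry alone. This is only an illustration of the idea, not I-Scene's actual mechanism; the function name, the axis-aligned bounding-box representation, and the `eps` tolerance are all assumptions made for the example.

```python
# Toy illustration (NOT I-Scene's actual method): coarse spatial relations
# derived from purely geometric cues, here axis-aligned bounding boxes
# given as (min_xyz, max_xyz) tuples with z pointing up.

def relations(box_a, box_b, eps=0.05):
    """Return the set of coarse relations between box_a and box_b."""
    (ax0, ay0, az0), (ax1, ay1, az1) = box_a
    (bx0, by0, bz0), (bx1, by1, bz1) = box_b
    rels = set()
    # Per-axis separation gap (zero when the intervals overlap).
    gap_x = max(bx0 - ax1, ax0 - bx1, 0.0)
    gap_y = max(by0 - ay1, ay0 - by1, 0.0)
    gap_z = max(bz0 - az1, az0 - bz1, 0.0)
    # Proximity: the boxes are within eps along every axis.
    if max(gap_x, gap_y, gap_z) <= eps:
        rels.add("near")
    # Support: a's bottom face meets b's top face and their footprints overlap.
    if abs(az0 - bz1) <= eps and gap_x == 0.0 and gap_y == 0.0:
        rels.add("a_on_b")
    return rels

# Example: a cup resting on a table.
table = ((0.0, 0.0, 0.0), (1.0, 1.0, 0.5))
cup = ((0.4, 0.4, 0.5), (0.6, 0.6, 0.7))
print(relations(cup, table))  # {'near', 'a_on_b'}
```

A learned spatial prior generalizes this kind of rule-based check: instead of hand-coded thresholds, the reprogrammed instance generator supplies the signal for such relations directly from geometry.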
🔥 Updates
- Release inference code and sparse structure flow transformer
- Release interactive huggingface demo and usage
- Release training data scripts
- Release evaluation code
🚀 Demo
Visit our Project Page for:
- Interactive 3D scene visualization
- Comparison with state-of-the-art methods
- More visualization examples
📦 Installation
```bash
# Coming soon
git clone https://github.com/LuLing06/I-Scene-project.git
cd I-Scene-project
```
🎯 Usage
📜 Citation
If you find this work helpful, please consider citing our paper:
```bibtex
@article{ling2025iscene,
  title={I-Scene: 3D Instance Models are Implicit Generalizable Spatial Learners},
  author={Ling, Lu and Ge, Yunhao and Sheng, Yichen and Bera, Aniket},
  journal={arXiv preprint arXiv:2512.13683},
  year={2025}
}
```
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
We thank the authors of TRELLIS and other related works for their inspiring research.
