Learning to exploit stability for 3D scene parsing

Yilun Du, Zhijian Liu, Hector Basevi, Ales Leonardis, William T. Freeman, Joshua T. Tenenbaum, Jiajun Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Human scene understanding uses a variety of visual and non-visual cues to perform inference on object types, poses, and relations. Physics is a rich and universal cue that we exploit to enhance scene understanding. In this paper, we integrate the physical cue of stability into the learning process by looping a physics engine into bottom-up recognition models, and apply it to the problem of 3D scene parsing. We first show that applying physics supervision to an existing scene understanding model increases performance, produces more stable predictions, and allows training to an equivalent performance level with fewer annotated training examples. We then present a novel architecture for 3D scene parsing named Prim R-CNN, which learns to predict bounding boxes as well as their 3D size, translation, and rotation. With physics supervision, Prim R-CNN outperforms existing scene understanding approaches on this problem. Finally, we show that fine-tuning with physics supervision on unlabeled real images improves real-domain transfer of models trained on synthetic data.
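The stability cue described above can be illustrated with a minimal toy sketch (not the authors' implementation, which loops in a full physics engine): here a stack of boxes is treated as "stable" if each box's center of mass lies over the footprint of the box beneath it, and the overhang beyond that footprint serves as a rough stand-in for the post-simulation displacement that physics supervision would penalize. All names and values below are illustrative assumptions.

```python
def stability_loss(boxes):
    """Toy stability penalty for a stack of 2D box footprints.

    boxes: list of (x_center, width) pairs, ordered bottom to top.
    Returns 0.0 for a stable stack; otherwise the summed overhang of
    each box's center of mass beyond its supporting box's half-width,
    a crude proxy for how far the stack would shift under simulation.
    """
    loss = 0.0
    for lower, upper in zip(boxes, boxes[1:]):
        lower_x, lower_w = lower
        upper_x, _ = upper
        # Center of mass of the upper box, relative to the support edge.
        overhang = abs(upper_x - lower_x) - lower_w / 2.0
        if overhang > 0.0:
            loss += overhang
    return loss

# A slightly offset stack is still supported; a large offset is not.
stable = stability_loss([(0.0, 2.0), (0.2, 1.0)])
toppled = stability_loss([(0.0, 2.0), (1.8, 1.0)])
```

In the paper's setting, a differentiable or simulation-based version of such a penalty lets unlabeled images supervise the model: predicted 3D boxes that would topple under gravity incur a loss even without ground-truth annotations.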
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 31 (NIPS 2018)
Editors: S. Bengio, H. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, R. Garnett
Number of pages: 11
Publication status: E-pub ahead of print - 2 Dec 2018
Event: 32nd Conference on Neural Information Processing Systems (NIPS 2018) - Montréal, Canada
Duration: 2 Dec 2018 - 8 Dec 2018

Publication series

Name: Electronic Proceedings of the Neural Information Processing Systems Conference

Conference: 32nd Conference on Neural Information Processing Systems (NIPS 2018)


