Neural-PBIR Reconstruction of Shape, Material, and Illumination

1Meta RLR, 2National Tsing Hua University, 3University of California, Irvine, 4University of Maryland, College Park
*Indicates Equal Contribution

ICCV 2023
[Teaser figure]

Neural-PBIR recovers high-fidelity material (b1,2), shape and lighting (b3), enabling realistic re-rendering (c1-3).

News

  • Dec 12: Neural-PBIR achieves the best relighting and surface reconstruction results on the Stanford-ORB dataset!
  • Dec 12: arXiv paper updated.

Abstract

Reconstructing the shape and spatially varying surface appearance of a physical-world object, as well as its surrounding illumination, from 2D images (e.g., photographs) of the object has been a long-standing problem in computer vision and graphics. In this paper, we introduce a robust object-reconstruction pipeline that combines neural object reconstruction and physics-based inverse rendering (PBIR). Specifically, our pipeline first uses a neural stage to produce high-quality but potentially imperfect predictions of object shape, reflectance, and illumination. Then, initialized with these neural predictions, we perform PBIR to refine the initial results and obtain the final high-quality reconstruction. Experimental results demonstrate that our pipeline significantly outperforms existing reconstruction methods in both quality and performance.

Our Pipeline

[Pipeline overview figure]

Our pipeline comprises three main stages. The first stage is a fast and accurate surface-reconstruction step that brings direct SDF-grid optimization into NeuS. Associated with this surface is an overfitted radiance field that does not fully model the object's surface reflectance. The second stage is an efficient neural distillation method that converts the radiance field into physics-based reflectance and illumination models. Finally, the third stage uses physics-based inverse rendering (PBIR) to further refine the object geometry and reflectance reconstructed by the first two stages. This stage leverages physics-based differentiable rendering that captures global-illumination (GI) effects such as soft shadows and interreflection.
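To give a rough feel for the first stage's core idea of optimizing a dense SDF grid directly (rather than only an MLP), the toy sketch below fits a signed-distance grid to noisy SDF observations of a sphere by gradient descent with a Laplacian smoothness term. All names, resolutions, and weights here are illustrative assumptions, not the paper's actual implementation, which optimizes the grid against a NeuS-style volume-rendering loss from posed images.

```python
import numpy as np

# Toy direct SDF-grid optimization (illustrative only): fit a dense grid of
# signed-distance values to noisy SDF samples of a unit sphere by gradient
# descent on a data term plus a Laplacian smoothness regularizer.

rng = np.random.default_rng(0)
N = 17                                       # grid resolution per axis (assumed)
xs = np.linspace(-1.5, 1.5, N)
gx, gy, gz = np.meshgrid(xs, xs, xs, indexing="ij")
grid = np.zeros((N, N, N))                   # SDF values, initialized to zero

# "Observations": noisy SDF samples of a sphere of radius 1 at the grid nodes
target = np.sqrt(gx**2 + gy**2 + gz**2) - 1.0
obs = target + 0.05 * rng.standard_normal(target.shape)

lr = 0.5
for step in range(200):
    residual = grid - obs                    # data term: match observed SDF
    # discrete Laplacian (periodic via np.roll, fine for a toy example)
    lap = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
         + np.roll(grid, 1, 1) + np.roll(grid, -1, 1)
         + np.roll(grid, 1, 2) + np.roll(grid, -1, 2) - 6 * grid)
    grad = residual - 0.1 * lap              # gradient of data + smoothness loss
    grid -= lr * grad

err = np.abs(grid - target).mean()
print(f"mean SDF error after fitting: {err:.4f}")
```

The smoothness term plays the role of the regularization needed to keep an explicit grid well-behaved; in the actual pipeline, the supervision instead comes from rendering losses, and the grid resolution is far higher.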

Rendering under novel views and illuminations

Object

Novel Lighting

Rotation

Comparison

BibTeX

@article{sun2023neuralpbir,
  author    = {Cheng Sun and Guangyan Cai and Zhengqin Li and Kai Yan and Cheng Zhang and Carl Marshall and Jia-Bin Huang and Shuang Zhao and Zhao Dong},
  title     = {Neural-PBIR Reconstruction of Shape, Material, and Illumination},
  journal   = {arXiv},
  year      = {2023},
}