CVPR 2024 Oral paper
NeRF-HuGS: Improved Neural Radiance Fields in Non-static Scenes Using Heuristics-Guided Segmentation
Jiahao Chen, Yipeng Qin, Lingjie Liu, Jiangbo Lu, Guanbin Li

Abstract


Neural Radiance Fields (NeRFs) have been widely recognized for their excellence in novel view synthesis and 3D scene reconstruction. However, their effectiveness is inherently tied to the assumption of static scenes, rendering them susceptible to undesirable artifacts when confronted with transient distractors such as moving objects or shadows. In this work, we propose a novel paradigm, namely "Heuristics-Guided Segmentation" (HuGS), which significantly enhances the separation of static scenes from transient distractors by harmoniously combining the strengths of hand-crafted heuristics and state-of-the-art segmentation models, thus transcending the limitations of previous solutions. Furthermore, we delve into the meticulous design of heuristics, introducing a seamless fusion of Structure-from-Motion (SfM)-based heuristics and color residual heuristics, catering to a diverse range of texture profiles. Extensive experiments demonstrate the superiority and robustness of our method in mitigating transient distractors for NeRFs trained in non-static scenes. Project page: https://cnhaox.github.io/NeRF-HuGS/
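The fusion of the two heuristics described above can be illustrated with a minimal sketch. The function names, inputs, and threshold values below are hypothetical and not the paper's actual implementation: it assumes a per-pixel SfM feature-match density map (high where static texture is consistently matched across views) and a per-pixel color residual from a partially trained NeRF (low where the static model already explains the scene), and takes their union so that each heuristic covers the texture profile the other misses.

```python
import numpy as np

def heuristic_static_mask(sfm_match_density, color_residual,
                          sfm_thresh=0.5, residual_thresh=0.1):
    """Combine two per-pixel heuristics into a static/transient mask.

    sfm_match_density: normalized SfM feature-match density per pixel
        (high -> matched consistently across views -> likely static,
        works best on textured regions).
    color_residual: color error of a partially trained NeRF per pixel
        (low -> well explained by the static model -> likely static,
        works best on low-texture regions).
    Thresholds are illustrative placeholders, not the paper's values.
    """
    static_by_sfm = sfm_match_density >= sfm_thresh          # textured statics
    static_by_residual = color_residual <= residual_thresh   # low-texture statics
    # Union of the two cues: a pixel is kept as static if either heuristic fires.
    return static_by_sfm | static_by_residual

# Toy 4x4 example: left half static, right half transient.
density = np.array([[0.9, 0.8, 0.1, 0.0],
                    [0.7, 0.6, 0.0, 0.0],
                    [0.2, 0.1, 0.0, 0.0],
                    [0.0, 0.0, 0.0, 0.0]])
residual = np.array([[0.02, 0.03, 0.50, 0.60],
                     [0.04, 0.05, 0.55, 0.70],
                     [0.05, 0.06, 0.65, 0.80],
                     [0.07, 0.08, 0.75, 0.90]])
mask = heuristic_static_mask(density, residual)
print(mask.astype(int))
# → [[1 1 0 0]
#    [1 1 0 0]
#    [1 1 0 0]
#    [1 1 0 0]]
```

In the full method, a mask like this would then serve as guidance for a segmentation model, which refines the coarse heuristic cues into clean object-level static/transient boundaries.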


Framework



Experiment



Conclusion


In this work, we propose a novel heuristics-guided segmentation paradigm that effectively addresses the prevalent issue of transient distractors in real-world NeRF training. By strategically combining the complementary strengths of hand-crafted heuristics and state-of-the-art segmentation models, our method achieves highly accurate segmentation of transient distractors across diverse scenes without requiring any prior knowledge of the scene. Through meticulous heuristic design, our method robustly captures both high- and low-frequency static scene elements. Extensive experiments demonstrate the superiority of our approach over existing methods.