Thursday May 18, 2023
CVPR 2023 - Balanced Spherical Grid for Egocentric View Synthesis
In this episode we discuss Balanced Spherical Grid for Egocentric View Synthesis by Changwoon Choi, Sang Min Kim, Young Min Kim. The paper presents EgoNeRF, an efficient solution for reconstructing large-scale environments from a few seconds of 360° video for virtual reality (VR) assets. The authors adopt a spherical coordinate parameterization instead of Cartesian coordinate grids, which tend to be inefficient for unbounded scenes. This parameterization aligns better with the rays of egocentric images and also enables factorization for performance gains. Additionally, the authors use a resampling technique to avoid singularities and a combination of balanced grids to represent unbounded scenes. They extensively evaluate their approach on synthetic and real-world egocentric 360° video datasets and consistently report state-of-the-art performance.
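To make the idea of a balanced spherical grid more concrete, here is a minimal sketch of how sample points around an egocentric capture point might be mapped to such a grid: angles are discretized uniformly, while the radial axis grows exponentially so distant, unbounded regions fall into progressively coarser cells. The function name, grid sizes, and radius range below are illustrative assumptions, not the paper's actual implementation or settings.

```python
import numpy as np

def spherical_grid_coords(points, r_min=0.1, r_max=100.0):
    """Map Cartesian sample points (centered at the camera rig) to
    normalized coordinates of a spherical feature grid.

    Angular axes are uniform in (theta, phi); the radial axis is
    log-spaced so far-away regions are represented more coarsely.
    Hypothetical sketch, not the authors' code.
    """
    x, y, z = points[..., 0], points[..., 1], points[..., 2]
    r = np.sqrt(x**2 + y**2 + z**2).clip(min=r_min)
    theta = np.arccos(np.clip(z / r, -1.0, 1.0))   # polar angle in [0, pi]
    phi = np.arctan2(y, x)                         # azimuth in (-pi, pi]

    # Normalize each axis to [0, 1] for grid lookup / interpolation.
    u_theta = theta / np.pi
    u_phi = (phi + np.pi) / (2.0 * np.pi)
    # Exponentially growing radial bins handle the unbounded far field.
    u_r = np.clip(np.log(r / r_min) / np.log(r_max / r_min), 0.0, 1.0)

    return np.stack([u_r, u_theta, u_phi], axis=-1)

# Example: points sampled along rays from the egocentric capture center.
pts = np.random.randn(1024, 3) * 5.0
grid_uvw = spherical_grid_coords(pts)  # shape (1024, 3), values in [0, 1]
```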