View Synthesis in Tidal Flat Environments with Spherical Harmonics and Neighboring Views Integration
Graphical Abstract
Abstract
We present a novel view synthesis method that introduces a radiance field representation of density and tidal flat appearance in neural rendering. Our method generates realistic images from new viewpoints by utilizing continuous scene information gathered at different sampling points along the same set of rays. This approach significantly improves rendering quality and reduces blurring and aliasing artifacts compared to existing techniques such as Nerfacto. Our model employs spherical harmonic functions to efficiently encode viewing-direction information and integrates image features from neighboring viewpoints for enhanced fusion, resulting in an accurate and detailed reconstruction of the scene's geometry and appearance. We evaluate our approach on publicly available datasets containing a variety of indoor and outdoor scenes, as well as on custom tidal flat datasets. The results show that our algorithm outperforms Nerfacto in terms of PSNR, SSIM, and LPIPS, demonstrating superior performance in both complex and simple environments. This study highlights the potential of our approach to advance view synthesis techniques and provides a powerful tool for environmental research and conservation in dynamic ecosystems such as mudflats. Future work will focus on further optimizations and extensions to improve the efficiency and quality of the rendering process.
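The spherical harmonic encoding of viewing direction mentioned in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's exact encoder: it assumes the real spherical harmonic basis truncated at degree 2 (9 coefficients), and the function name `sh_basis` is a hypothetical choice.

```python
import numpy as np

def sh_basis(direction):
    """Evaluate the real spherical harmonic basis up to degree 2
    for a 3D view direction, yielding 9 coefficients.

    Illustrative sketch; the truncation degree is an assumption.
    """
    # Normalize to a unit vector so the basis is evaluated on the sphere.
    x, y, z = direction / np.linalg.norm(direction)
    return np.array([
        0.28209479177387814,                         # l=0, m=0 (constant)
        0.4886025119029199 * y,                      # l=1, m=-1
        0.4886025119029199 * z,                      # l=1, m=0
        0.4886025119029199 * x,                      # l=1, m=1
        1.0925484305920792 * x * y,                  # l=2, m=-2
        1.0925484305920792 * y * z,                  # l=2, m=-1
        0.31539156525252005 * (3.0 * z * z - 1.0),   # l=2, m=0
        1.0925484305920792 * x * z,                  # l=2, m=1
        0.5462742152960396 * (x * x - y * y),        # l=2, m=2
    ])

# Example: encode a ray direction before passing it to an appearance network.
enc = sh_basis(np.array([0.0, 0.0, 1.0]))
print(enc.shape)  # (9,)
```

A low-degree basis like this gives the appearance model a compact, smooth parameterization of view-dependent effects at negligible cost compared to a learned directional embedding.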