Title: Neural Rendering-Based 3D Scene Style Transfer Method via Semantic Understanding Using a Single Style Image
Paper type: SCI
Authors: Jisun Park, Kyungeun Cho
Impact Factor: 2.592
Journal: Mathematics
Publication date: July 2023
In the rapidly emerging era of untact (“contact-free”) technologies, the demand for
three-dimensional (3D) virtual environments used in virtual reality (VR), augmented reality
(AR), and the metaverse has grown significantly, owing to their extensive application across
various domains. Current research focuses on automatically transferring the style of rendered
images within a 3D virtual environment using artificial intelligence, which aims to minimize human
intervention. However, the prevalent studies on rendering-based 3D environment-style transfers
have certain inherent limitations. First, the training of a style transfer network dedicated to 3D virtual
environments demands considerable style image data. These data must align with viewpoints that
closely resemble those of the virtual environment. Second, the transferred results exhibit noticeable
inconsistency across the 3D structure, because most studies neglect 3D scene geometry and rely
solely on 2D input image features. Finally, style adaptation fails to accommodate the unique
characteristics inherent in each object. To address these issues, we propose a novel approach: a
neural rendering-based 3D scene-style conversion technique. This methodology employs semantic
nearest-neighbor feature matching, thereby facilitating the transfer of style within a 3D scene while
considering the distinctive characteristics of each object, even when employing a single style image.
The neural radiance field enables the network to comprehend the geometric information of a 3D scene
in relation to its viewpoint. Subsequently, it transfers style features by employing the unique features
of a single style image via semantic nearest-neighbor feature matching. Empirically, we applied the
proposed semantic 3D scene style transfer method to both interior and exterior environments,
testing on the Replica, 3D-FRONT, and Tanks and Temples datasets. The results illustrate that the
proposed methodology surpasses existing style transfer techniques in terms of maintaining 3D
viewpoint consistency, style uniformity, and semantic coherence.
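The semantic nearest-neighbor feature matching step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, array shapes, and cosine-similarity criterion are assumptions for illustration. The idea: for each content feature vector, search only among style features carrying the same semantic class label, and substitute the most similar one.

```python
import numpy as np

def semantic_nn_feature_match(content_feats, style_feats,
                              content_labels, style_labels):
    """For each content feature vector, substitute the nearest style
    feature (by cosine similarity) that shares its semantic label.

    content_feats:  (N, D) content feature vectors
    style_feats:    (M, D) style feature vectors
    content_labels: (N,) semantic class id per content vector
    style_labels:   (M,) semantic class id per style vector
    Returns an (N, D) array of matched style features.
    """
    eps = 1e-8
    # L2-normalize so dot products equal cosine similarity
    c = content_feats / (np.linalg.norm(content_feats, axis=1, keepdims=True) + eps)
    s = style_feats / (np.linalg.norm(style_feats, axis=1, keepdims=True) + eps)

    matched = np.empty_like(content_feats)
    for cls in np.unique(content_labels):
        c_idx = np.nonzero(content_labels == cls)[0]
        s_idx = np.nonzero(style_labels == cls)[0]
        if s_idx.size == 0:
            # class absent from the style image: fall back to all style features
            s_idx = np.arange(len(style_feats))
        sim = c[c_idx] @ s[s_idx].T  # pairwise cosine similarities
        matched[c_idx] = style_feats[s_idx[np.argmax(sim, axis=1)]]
    return matched
```

Restricting the search to same-class style features is what lets the transfer respect per-object characteristics (e.g., wall pixels borrow only wall-like style features), rather than matching against the style image globally.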