Boosting 3D Object Generation through PBR Materials
Description
Automatic 3D content creation has gained increasing attention recently due to its potential in various applications such as video games, the film industry, and AR/VR.
Recent advancements in diffusion models and multimodal models have notably improved the quality and efficiency of 3D object generation given a single RGB image.
However, 3D objects generated even by state-of-the-art methods are still unsatisfactory compared to human-created assets.
Because these methods consider only textures rather than materials, they encounter challenges in photo-realistic rendering, relighting, and flexible appearance editing;
they also suffer from severe misalignment between geometry and high-frequency texture details.
In this work, we propose a novel approach to boost the quality of generated 3D objects from the perspective of Physics-Based Rendering (PBR) materials.
By analyzing the components of PBR materials,
we choose to consider albedo, roughness, metalness, and normal bump in single-image-to-3D object generation.
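As a rough illustration of how these four components fit together (this is a generic metallic-roughness shading sketch, not the paper's renderer, and all names are illustrative), the following shows a standard way albedo, roughness, metalness, and a bump-perturbed normal combine when shading a single point:

```python
import numpy as np

def metallic_roughness_brdf(albedo, roughness, metalness, n, v, l):
    """Sketch of a standard metallic-roughness BRDF at one shading point.

    albedo:    (3,) base color sampled from the albedo map
    roughness: scalar in [0, 1] sampled from the roughness map
    metalness: scalar in [0, 1] sampled from the metalness map
    n, v, l:   unit normal (after normal-bump perturbation), view and light directions
    """
    h = (v + l) / np.linalg.norm(v + l)               # half vector
    n_dot_l = max(np.dot(n, l), 1e-4)
    n_dot_v = max(np.dot(n, v), 1e-4)
    n_dot_h = max(np.dot(n, h), 1e-4)
    v_dot_h = max(np.dot(v, h), 1e-4)

    a = roughness ** 2                                 # GGX normal distribution term
    d = a**2 / (np.pi * ((n_dot_h**2) * (a**2 - 1.0) + 1.0) ** 2)

    k = (roughness + 1.0) ** 2 / 8.0                   # Schlick-GGX geometry term
    g = (n_dot_v / (n_dot_v * (1 - k) + k)) * (n_dot_l / (n_dot_l * (1 - k) + k))

    f0 = 0.04 * (1 - metalness) + albedo * metalness   # reflectance at normal incidence
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5         # Schlick Fresnel approximation

    specular = d * g * f / (4.0 * n_dot_v * n_dot_l)
    diffuse = (1.0 - metalness) * albedo / np.pi       # metals contribute no diffuse lobe
    return (diffuse + specular) * n_dot_l
```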
For albedo and normal bump,
we leverage Stable Diffusion fine-tuned on synthetic data to extract these values,
with novel usages of these fine-tuned models to obtain 3D-consistent albedo UV and normal-bump UV maps for generated objects.
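The listing does not specify the fine-tuning interface. Purely as an illustration, the sketch below assumes hypothetical fine-tuned checkpoints plugged into the standard diffusers img2img pipeline to map RGB renderings of the generated object to per-view albedo and normal-bump predictions, which would then be baked and fused into the UV maps mentioned above; the checkpoint paths, prompts, and parameter values are assumptions.

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Hypothetical fine-tuned checkpoints; the actual models used in the paper
# are not specified in this listing.
ALBEDO_MODEL = "path/to/albedo-finetuned-sd"
NORMAL_MODEL = "path/to/normal-bump-finetuned-sd"

def predict_material_map(model_id: str, rendering: Image.Image, prompt: str) -> Image.Image:
    """Run a fine-tuned Stable Diffusion img2img pass that maps an RGB
    rendering of the generated object to a material map (albedo or normal bump)."""
    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        model_id, torch_dtype=torch.float16
    ).to("cuda")
    # A low strength keeps the output aligned with the input view, so the
    # predicted maps can later be baked into a consistent UV texture.
    result = pipe(prompt=prompt, image=rendering, strength=0.4, guidance_scale=3.0)
    return result.images[0]

# Usage sketch: predict per-view maps for one rendering; in practice several
# views would be predicted and fused into albedo UV and normal-bump UV maps.
view = Image.open("render_view0.png").convert("RGB")
albedo_view = predict_material_map(ALBEDO_MODEL, view, "albedo map")
normal_view = predict_material_map(NORMAL_MODEL, view, "normal bump map")
```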
In terms of roughness and metalness,
we adopt a semi-automatic process to provide room for interactive adjustment, which we believe is more practical.
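The semi-automatic process itself is not detailed in this listing. A minimal sketch of what such an interactive assignment might look like, with hypothetical per-segment material presets and user overrides, is:

```python
# Hypothetical presets keyed by a coarse material label (e.g. from a
# segmentation or a user selection); values can then be tweaked interactively.
MATERIAL_PRESETS = {
    "metal":   {"roughness": 0.25, "metalness": 1.0},
    "plastic": {"roughness": 0.50, "metalness": 0.0},
    "wood":    {"roughness": 0.70, "metalness": 0.0},
}

def assign_roughness_metalness(segments, overrides=None):
    """Assign per-segment roughness/metalness from presets, then apply any
    user-provided overrides (the interactive part of a semi-automatic step)."""
    overrides = overrides or {}
    assigned = {}
    for seg_id, label in segments.items():
        values = dict(MATERIAL_PRESETS.get(label, {"roughness": 0.5, "metalness": 0.0}))
        values.update(overrides.get(seg_id, {}))
        assigned[seg_id] = values
    return assigned

# Usage: segment 0 was labelled "metal" but the user dials its roughness up.
print(assign_roughness_metalness({0: "metal", 1: "wood"},
                                 overrides={0: {"roughness": 0.4}}))
```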
Extensive experiments demonstrate that our model is generally beneficial for various state-of-the-art generation methods, significantly boosting the quality and realism of their generated 3D objects, with natural relighting effects and substantially improved geometry.
Event Type
Technical Papers
Time
Friday, 6 December 2024, 2:45pm - 2:56pm JST
Location
Hall B5 (2), B Block, Level 5