InstanceTex: Instance-level Controllable Texture Synthesis for 3D Scenes via Diffusion Priors
Description
Automatically generating high-fidelity textures for a complex scene remains an open problem in computer graphics. While pioneering text-to-texture works based on 2D diffusion models have achieved impressive results on single objects, they either suffer from style inconsistency and semantic misalignment or require extensive optimization time and memory when scaled up to a large scene. To address these challenges, we introduce InstanceTex, a novel method for synthesizing realistic and style-consistent textures for large-scale scenes. At its core, InstanceTex proposes an instance-level controllable texture synthesis approach based on an instance layout representation, which enables precise control over individual instances while maintaining global style consistency. We also propose a local synchronized multi-view diffusion approach that enhances local texture consistency by sharing denoised latent content among neighboring views in a mini-batch. Finally, tailored to scene texture mapping, we develop Neural MipTexture, inspired by Mipmaps, to reduce aliasing artifacts. Extensive texturing results on indoor and outdoor scenes show that InstanceTex produces high-quality, consistent textures of superior quality compared to prior texture generation alternatives.
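As background for the classical Mipmaps that the abstract says inspire Neural MipTexture (the paper's own neural formulation is not given here), the sketch below shows standard mipmap construction by 2x2 box filtering and level selection with blending between the two nearest levels. All function names are illustrative, not from the paper.

```python
import numpy as np

def build_mipmaps(texture):
    """Build a classical mipmap pyramid by repeated 2x2 box-filter
    downsampling. `texture` is (H, W, C) with H, W powers of two."""
    levels = [texture.astype(np.float64)]
    while levels[-1].shape[0] > 1 and levels[-1].shape[1] > 1:
        prev = levels[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        # Average each 2x2 block into one texel of the coarser level.
        down = prev.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))
        levels.append(down)
    return levels

def sample_mipmap(levels, u, v, footprint):
    """Point-sample the pyramid at normalized coords (u, v), picking the
    level whose texel size matches `footprint` (texels covered by one
    screen pixel) and blending the two nearest levels."""
    lod = np.clip(np.log2(max(footprint, 1.0)), 0, len(levels) - 1)
    lo = int(np.floor(lod))
    hi = min(lo + 1, len(levels) - 1)
    frac = lod - lo

    def point(level):
        tex = levels[level]
        x = min(int(u * tex.shape[1]), tex.shape[1] - 1)
        y = min(int(v * tex.shape[0]), tex.shape[0] - 1)
        return tex[y, x]

    return (1 - frac) * point(lo) + frac * point(hi)
```

Prefiltering the texture before sampling removes high frequencies that would otherwise alias under minification; a neural variant would presumably replace the fixed box filter with learned per-level features while keeping the same level-selection idea.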
Event Type: Technical Papers
Time: Wednesday, 4 December 2024, 4:30pm - 4:41pm JST
Location: Hall B5 (2), B Block, Level 5