BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
X-LIC-LOCATION:Asia/Tokyo
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250110T023312Z
LOCATION:Hall B5 (1)\, B Block\, Level 5
DTSTART;TZID=Asia/Tokyo:20241204T145900
DTEND;TZID=Asia/Tokyo:20241204T151300
UID:siggraphasia_SIGGRAPH Asia 2024_sess118_papers_671@linklings.com
SUMMARY:FabricDiffusion: High-Fidelity Texture Transfer for 3D Garments Generation from In-The-Wild Images
DESCRIPTION:Technical Papers\n\nCheng Zhang (Carnegie Mellon University, Texas A&M University); Yuanhao Wang and Francisco Vicente (Carnegie Mellon University); Chenglei Wu, Jinlong Yang, and Thabo Beeler (Google Inc.); and Fernando De la Torre (Carnegie Mellon University)\n\nWe introduce FabricDiffusion, a method for transferring fabric textures from a single clothing image to 3D garments of arbitrary shapes. Existing approaches typically synthesize textures on the garment surface through 2D-to-3D texture mapping or depth-aware inpainting via generative models. Unfortunately, these methods often struggle to capture and preserve texture details, particularly due to challenging occlusions, distortions, or poses in the input image. Inspired by the observation that in the fashion industry, most garments are constructed by stitching sewing patterns with flat, repeatable textures, we cast the task of clothing texture transfer as extracting distortion-free, tileable texture materials that are subsequently mapped onto the UV space of the garment. Building upon this insight, we train a denoising diffusion model with a large-scale synthetic dataset to rectify distortions in the input texture image. This process yields a flat texture map that enables a tight coupling with existing Physically-Based Rendering (PBR) material generation pipelines, allowing for realistic relighting of the garment under various lighting conditions. We show that FabricDiffusion can transfer various features from a single clothing image, including texture patterns, material properties, and detailed prints and logos. Extensive experiments demonstrate that our model significantly outperforms state-of-the-art methods on both synthetic data and real-world, in-the-wild clothing images while generalizing to unseen textures and garment shapes.\n\nRegistration Category: Full Access, Full Access Supporter\n\nLanguage Format: English Language\n\nSession Chair: Meng Zhang (Nanjing University of Science and Technology)
URL:https://asia.siggraph.org/2024/program/?id=papers_671&sess=sess118
END:VEVENT
END:VCALENDAR