BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
X-LIC-LOCATION:Asia/Tokyo
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250110T023312Z
LOCATION:Hall B7 (1)\, B Block\, Level 7
DTSTART;TZID=Asia/Tokyo:20241204T131100
DTEND;TZID=Asia/Tokyo:20241204T132300
UID:siggraphasia_SIGGRAPH Asia 2024_sess117_papers_277@linklings.com
SUMMARY:High-quality Animatable Eyelid Shapes from Lightweight Captures
DESCRIPTION:Technical Papers\n\nJunfeng Lyu and Feng Xu (Tsinghua Univ
 ersity\, China)\n\nHigh-quality eyelid reconstruction and animation ar
 e challenging due to their subtle details and complicated deformations
 . Previous works usually suffer from a trade-off between capture cost
 and detail quality. In this paper\, we propose a novel method that ach
 ieves detailed eyelid reconstruction and animation using only an RGB v
 ideo captured by a mobile phone. Our method uses both static and dynam
 ic eyeball information (e.g.\, positions and rotations) to assist eyel
 id reconstruction\, together with an automatic eyeball calibration met
 hod that obtains the required eyeball parameters. Furthermore\, we dev
 elop a neural eyelid control module to achieve semantic animation cont
 rol of the eyelids. To the best of our knowledge\, ours is the first m
 ethod for high-quality eyelid reconstruction and animation from light
 weight captures. Extensive experiments on both synthetic and real dat
 a show that our method provides more detailed and realistic results t
 han previous methods using the same level of capture setup. The code i
 s available at https://github.com/StoryMY/AniEyelid.\n\nRegistration C
 ategory: Full Access\, Full Access Supporter\n\nLanguage Format: Engli
 sh Language\n\nSession Chair: Jungdam Won (Seoul National University)
URL:https://asia.siggraph.org/2024/program/?id=papers_277&sess=sess117
END:VEVENT
END:VCALENDAR
