BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
X-LIC-LOCATION:Australia/Melbourne
BEGIN:DAYLIGHT
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
TZNAME:AEDT
DTSTART:19721003T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=1SU
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19721003T020000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
TZNAME:AEST
RRULE:FREQ=YEARLY;BYMONTH=4;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240214T070250Z
LOCATION:Meeting Room C4.11\, Level 4 (Convention Centre)
DTSTART;TZID=Australia/Melbourne:20231215T154000
DTEND;TZID=Australia/Melbourne:20231215T155000
UID:siggraphasia_SIGGRAPH Asia 2023_sess139_papers_403@linklings.com
SUMMARY:Efficient Human Motion Reconstruction from Monocular Videos with Physical Consistency Loss
DESCRIPTION:Technical Papers\n\nLin Cong and Philipp Ruppel (Universität Hamburg)\, Yizhou Wang (Peking University)\, Xiang Pan (Diago Tech Company)\, and Norman Hendrich and Jianwei Zhang (Universität Hamburg)\n\nVision-only motion reconstruction from monocular videos often produces artifacts such as foot sliding and jittery motions. Existing physics-based methods typically either simplify the problem to focus solely on foot-ground contacts\, or reconstruct full-body contacts within a physics simulator\, necessitating the solution of a time-consuming bilevel optimization problem.\nTo overcome these limitations\, we present an efficient gradient-based method for reconstructing complex human motions (including highly dynamic and acrobatic movements) with physical constraints. Our approach reformulates human motion dynamics through a differentiable physical consistency loss within an augmented search space that accounts for both contacts and camera alignment. This enables us to transform the motion reconstruction task into a single-level trajectory optimization problem.\nExperimental results demonstrate that our method can reconstruct complex human motions from real-world videos in minutes\, substantially faster than previous approaches. Additionally\, the reconstructed results show enhanced physical realism compared to existing methods.\n\nRegistration Category: Full Access\n\nSession Chair: Yuting Ye (Reality Labs Research\, Meta)
URL:https://asia.siggraph.org/2023/full-program?id=papers_403&sess=sess139
END:VEVENT
END:VCALENDAR
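The word splits in the raw dump above ("P hysical", "demon strate") come from RFC 5545 line folding: long content lines are broken at 75 octets, and a CRLF followed by a single space or tab marks a continuation. A minimal, stdlib-only Python sketch of unfolding and splitting content lines (the `ICS` constant is an abbreviated stand-in for the file above, not the actual Linklings export):

```python
# Sketch of RFC 5545 line unfolding and content-line parsing.
# Assumption: this is illustrative code, not the parser used by any
# particular calendar tool; real parsers must also handle quoted
# parameter values that contain ":" or ";".

ICS = (
    "BEGIN:VEVENT\r\n"
    "SUMMARY:Efficient Human Motion Reconstruction from Monocular Videos wit\r\n"
    " h Physical Consistency Loss\r\n"
    "DTSTART;TZID=Australia/Melbourne:20231215T154000\r\n"
    "END:VEVENT\r\n"
)

def unfold(text):
    """RFC 5545 3.1: a line starting with space/tab continues the previous line."""
    lines = []
    for raw in text.splitlines():
        if raw[:1] in (" ", "\t") and lines:
            lines[-1] += raw[1:]  # drop the fold marker, rejoin the split word
        else:
            lines.append(raw)
    return lines

def parse(lines):
    """Split each content line into (name, params, value) triples."""
    props = []
    for line in lines:
        if not line:
            continue
        head, _, value = line.partition(":")
        name, _, params = head.partition(";")
        props.append((name, params, value))
    return props

props = parse(unfold(ICS))
summary = next(v for n, _, v in props if n == "SUMMARY")
```

After unfolding, `summary` reads "...with Physical Consistency Loss" as one piece, which is exactly the repair applied to the event above.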