BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
X-LIC-LOCATION:Australia/Melbourne
BEGIN:DAYLIGHT
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
TZNAME:AEDT
DTSTART:19721003T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=1SU
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19721003T020000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
TZNAME:AEST
RRULE:FREQ=YEARLY;BYMONTH=4;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260114T163658Z
LOCATION:Meeting Room C4.11\, Level 4 (Convention Centre)
DTSTART;TZID=Australia/Melbourne:20231215T154000
DTEND;TZID=Australia/Melbourne:20231215T155000
UID:siggraphasia_SIGGRAPH Asia 2023_sess139_papers_403@linklings.com
SUMMARY:Efficient Human Motion Reconstruction from Monocular Videos with P
 hysical Consistency Loss
DESCRIPTION:Lin Cong and Philipp Ruppel (Universität Hamburg), Yizhou
  Wang (Peking University), Xiang Pan (Diago Tech Company), and Norman
  Hendrich and Jianwei Zhang (Universität Hamburg)\n\nVision-only moti
 on reconstruction from monocular videos often produces artifacts such
  as foot sliding and jittery motions. Existing physics-based methods
  typically either simplify the problem to focus solely on foot-ground
  contacts, or reconstruct full-body contacts within a physics simulat
 or, which requires solving a time-consuming bilevel optimization prob
 lem.\nTo overcome these limitations, we present an efficient gradient
 -based method for reconstructing complex human motions (including hig
 hly dynamic and acrobatic movements) under physical constraints. Our
  approach reformulates human motion dynamics through a differentiable
  physical consistency loss within an augmented search space that acco
 unts for both contacts and camera alignment. This enables us to trans
 form the motion reconstruction task into a single-level trajectory op
 timization problem. Experimental results demonstrate that our method
  can reconstruct complex human motions from real-world videos in minu
 tes, substantially faster than previous approaches. Additionally, the
  reconstructed results show enhanced physical realism compared to exi
 sting methods.\n\nRegistration Category: Full Access\n\nSession Chair
 : Yuting Ye (Reality Labs Research, Meta)\n\n
URL:https://asia.siggraph.org/2023/full-program?id=papers_403&sess=sess139
END:VEVENT
END:VCALENDAR
