BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
X-LIC-LOCATION:Asia/Tokyo
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250110T023312Z
LOCATION:Hall B7 (1)\, B Block\, Level 7
DTSTART;TZID=Asia/Tokyo:20241205T113100
DTEND;TZID=Asia/Tokyo:20241205T114300
UID:siggraphasia_SIGGRAPH Asia 2024_sess129_papers_363@linklings.com
SUMMARY:EgoHDM: An Online Egocentric-Inertial Human Motion Capture, Localization, and Dense Mapping System
DESCRIPTION:Technical Papers\n\nHandi Yin and Bonan Liu (Hong Kong University of Science and Technology, Guangzhou); Manuel Kaufmann (ETH Zürich); Jinhao He (Hong Kong University of Science and Technology, Guangzhou); Sammy Christen (ETH Zürich); and Jie Song and Pan Hui (Hong Kong University of Science and Technology, Guangzhou; Hong Kong University of Science and Technology)\n\nWe present EgoHDM, an online egocentric-inertial human motion capture (mocap), localization, and dense mapping system. Our system uses 6 inertial measurement units (IMUs) and a commodity head-mounted RGB camera. EgoHDM is the first human mocap system that offers dense scene mapping in near real-time. Further, it is fast and robust to initialize and fully closes the loop between physically plausible map-aware global human motion estimation and mocap-aware 3D scene reconstruction. Our key idea is integrating camera localization and mapping information with inertial human motion capture bidirectionally in our system. To achieve this, we design a tightly coupled mocap-aware dense bundle adjustment and a physics-based body pose correction module leveraging a local body-centric elevation map. The latter introduces a novel terrain-aware contact PD controller, which enables characters to physically contact the given local elevation map, thereby reducing human floating or penetration. We demonstrate the performance of our system on established synthetic and real-world benchmarks. The results show that our method reduces human localization, camera pose, and mapping accuracy error by 41%, 71%, and 46%, respectively, compared to the state of the art. Our qualitative evaluations on newly captured data further demonstrate that EgoHDM can cover challenging scenarios in non-flat terrain, including stepping over stairs and outdoor scenes in the wild.\n\nRegistration Category: Full Access, Full Access Supporter\n\nLanguage Format: English Language\n\nSession Chair: Yuting Ye (Reality Labs Research, Meta)
URL:https://asia.siggraph.org/2024/program/?id=papers_363&sess=sess129
END:VEVENT
END:VCALENDAR