BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
X-LIC-LOCATION:Asia/Tokyo
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
DTSTART:18871231T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250110T023313Z
LOCATION:Hall B5 (1)\, B Block\, Level 5
DTSTART;TZID=Asia/Tokyo:20241206T144500
DTEND;TZID=Asia/Tokyo:20241206T145900
UID:siggraphasia_SIGGRAPH Asia 2024_sess148_papers_384@linklings.com
SUMMARY:Evaluating Visual Perception of Object Motion in Dynamic Environments
DESCRIPTION:Technical Papers\n\nBudmonde Duinkharjav (New York University); Jenna Kang (Georgia Institute of Technology, New York University); Gavin Stuart Peter Miller and Chang Xiao (Adobe Research); and Qi Sun (New York University)\n\nPrecisely understanding how objects move in 3D is essential for broad scenarios such as video editing, gaming, driving, and athletics. With screen-displayed computer graphics content, users only perceive limited cues to judge the object motion from the on-screen optical flow. Conventionally, visual perception is studied with stationary settings and singular objects. However, in practical applications, we---the observer---also move within complex scenes. Therefore, we must extract object motion from a combined optical flow displayed on screen, which can often lead to mis-estimations due to perceptual ambiguities.\n\nWe measure and model observers' perceptual accuracy of object motions in dynamic 3D environments, a universal but under-investigated scenario in computer graphics applications. We design and employ a crowdsourcing-based psychophysical study, quantifying the relationships among patterns of scene dynamics and content, and the resulting perceptual judgments of object motion direction. The acquired psychophysical data underpins a model for generalized conditions. We then demonstrate the model's ability to guide and significantly enhance users' understanding of task object motion in gaming and animation design. With applications in measuring and compensating for object motion errors in video and rendering, we hope the research establishes a new frontier for understanding and mitigating perceptual errors caused by the gap between screen-displayed graphics and the physical world.\n\nRegistration Category: Full Access, Full Access Supporter\n\nLanguage Format: English Language\n\nSession Chair: Peng Song (Singapore University of Technology and Design (SUTD))
URL:https://asia.siggraph.org/2024/program/?id=papers_384&sess=sess148
END:VEVENT
END:VCALENDAR
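
A minimal sketch, assuming Python 3.9+, of how an entry like the one above could be read with the standard library alone. The file name session.ics and the helpers unfold and read_event are illustrative assumptions, not part of the feed; unfold handles the RFC 5545 line folding that the published calendar uses, and the DTSTART parsing shows how the TZID=Asia/Tokyo parameter maps onto a timezone-aware datetime.

# Sketch only: session.ics, unfold, and read_event are assumed names,
# not anything defined by the calendar feed itself.
from datetime import datetime
from zoneinfo import ZoneInfo

def unfold(text: str) -> list[str]:
    """Unfold RFC 5545 folded lines (a continuation line starts with a space or tab)."""
    lines: list[str] = []
    for raw in text.splitlines():
        if raw.startswith((" ", "\t")) and lines:
            lines[-1] += raw[1:]
        else:
            lines.append(raw)
    return lines

def read_event(ics_text: str) -> dict[str, str]:
    """Collect the properties of the first VEVENT into a plain dict."""
    props: dict[str, str] = {}
    in_event = False
    for line in unfold(ics_text):
        if line == "BEGIN:VEVENT":
            in_event = True
        elif line == "END:VEVENT":
            break
        elif in_event and ":" in line:
            name, value = line.split(":", 1)
            props[name] = value
    return props

# Example: turn DTSTART;TZID=Asia/Tokyo:20241206T144500 into an aware datetime.
with open("session.ics", encoding="utf-8") as f:  # hypothetical file name
    event = read_event(f.read())
start = datetime.strptime(event["DTSTART;TZID=Asia/Tokyo"], "%Y%m%dT%H%M%S")
start = start.replace(tzinfo=ZoneInfo("Asia/Tokyo"))
print(event["SUMMARY"], start.isoformat())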