BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
X-LIC-LOCATION:Australia/Melbourne
BEGIN:DAYLIGHT
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
TZNAME:AEDT
DTSTART:19701004T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=1SU
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19700405T030000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
TZNAME:AEST
RRULE:FREQ=YEARLY;BYMONTH=4;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260114T163644Z
LOCATION:Meeting Room C4.9+C4.10\, Level 4 (Convention Centre)
DTSTART;TZID=Australia/Melbourne:20231213T161000
DTEND;TZID=Australia/Melbourne:20231213T162500
UID:siggraphasia_SIGGRAPH Asia 2023_sess168_papers_269@linklings.com
SUMMARY:Amortizing Samples in Physics-Based Inverse Rendering using ReSTIR
DESCRIPTION:Yu-Chen Wang (University of California Irvine), Chris Wyman an
 d Lifan Wu (NVIDIA), and Shuang Zhao (University of California Irvine)\n\n
 Recently, great progress has been made in physics-based differentiable ren
 dering. Existing differentiable rendering techniques typically focus on st
 atic scenes, but during inverse rendering—a key application for differenti
 able rendering—the scene is updated dynamically by each gradient step. In 
 this paper, we take a first step to leverage temporal data in the context 
 of inverse direct illumination. By adopting reservoir-based spatiotempor
 al importance resampling (ReSTIR), we introduce new Monte Carlo es
 timators for both interior and boundary components of differential direct 
 illumination integrals. We also integrate ReSTIR with antithetic sampling 
 to further improve its effectiveness. At equal frame time, our methods pro
 duce gradient estimates with up to 100× lower relative error than baseline
  methods. Additionally, we propose an inverse-rendering pipeline that inco
 rporates these estimators and provides reconstructions with up to 20× lowe
 r error.\n\nRegistration Category: Full Access\n\nSession Chair: Soo-Mi Ch
 oi (Sejong University)\n\n
URL:https://asia.siggraph.org/2023/full-program?id=papers_269&sess=sess168
END:VEVENT
END:VCALENDAR
