BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
X-LIC-LOCATION:Australia/Melbourne
BEGIN:DAYLIGHT
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
TZNAME:AEDT
DTSTART:19721003T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=1SU
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19721003T020000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
TZNAME:AEST
RRULE:FREQ=YEARLY;BYMONTH=4;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240214T070244Z
LOCATION:Meeting Room C4.11\, Level 4 (Convention Centre)
DTSTART;TZID=Australia/Melbourne:20231213T140000
DTEND;TZID=Australia/Melbourne:20231213T141500
UID:siggraphasia_SIGGRAPH Asia 2023_sess128_papers_321@linklings.com
SUMMARY:Neural Field Convolutions by Repeated Differentiation
DESCRIPTION:Technical Papers\n\nNtumba Elie Nsampi, Adarsh Djeacoumar, and Hans-Peter Seidel (Max-Planck-Institut für Informatik); Tobias Ritschel (University College London (UCL)); and Thomas Leimkühler (Max-Planck-Institut für Informatik)\n\nNeural fields are evolving towards a general-purpose continuous representation for visual computing. Yet, despite their numerous appealing properties, they are hardly amenable to signal processing. As a remedy, we present a method to perform general continuous convolutions with general continuous signals such as neural fields. Observing that piecewise polynomial kernels reduce to a sparse set of Dirac deltas after repeated differentiation, we leverage convolution identities and train a repeated integral field to efficiently execute large-scale convolutions. We demonstrate our approach on a variety of data modalities and spatially-varying kernels.\n\nRegistration Category: Full Access\n\nSession Chair: Jianfei Cai (Monash University)
URL:https://asia.siggraph.org/2023/full-program?id=papers_321&sess=sess128
END:VEVENT
END:VCALENDAR