Thin On-Sensor Nanophotonic Array Cameras
Session: Multidisciplinary Fusion
Description: Today's commodity camera systems rely on compound optical systems to map light originating from the scene to positions on the sensor, where it is recorded as an image. To achieve an accurate mapping without optical aberrations, i.e., deviations from Gauss' linear optics model, typical lens systems introduce increasingly complex stacks of optical elements that are responsible for the height of existing commodity cameras. In this work, we investigate flat nanophotonic computational cameras as an alternative that employs an array of skewed lenslets and a learned reconstruction method. The optical array is embedded in a metasurface that, at 700 nm in height, is flat and sits on the sensor cover glass at a 2 mm focal distance from the sensor. To tackle the highly chromatic response of a metasurface and to design an array over the entire sensor, we propose a differentiable optimization method that samples continuously over the spectrum and factorizes the optical modulation for different optical fields into the individual lenses of the array. We reconstruct a megapixel image from our flat imager with a learned probabilistic reconstruction method that employs a generative diffusion model to sample an implicit prior. To tackle scene-dependent aberrations in broadband illumination, we propose a method for acquiring paired real-world training data under diverse lighting conditions. We assess the proposed flat camera design in simulation and with an experimental prototype, validating that the method is capable of recovering high-quality broadband images outside the lab with a single flat metasurface optic.
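The differentiable spectral-sampling idea described above can be illustrated with a short sketch. The following is a minimal PyTorch loop, not the authors' implementation: each step draws a random visible wavelength, propagates a plane wave through a learnable phase profile with a single-FFT Fresnel model, and penalizes deviation from a point focus at the 2 mm sensor distance named in the abstract. The grid size, pitch, wavelength band, learning rate, and loss are all illustrative assumptions.

```python
import math
import torch

N = 256                  # simulation grid size (assumption)
pitch = 350e-9           # metasurface sample pitch in meters (assumption)
f = 2e-3                 # 2 mm focal distance, per the abstract
x = (torch.arange(N) - N / 2) * pitch
X, Y = torch.meshgrid(x, x, indexing="ij")
R2 = X**2 + Y**2

phase = torch.nn.Parameter(torch.zeros(N, N))  # learnable phase profile
opt = torch.optim.Adam([phase], lr=1e-2)

target = torch.zeros(N, N)
target[N // 2, N // 2] = 1.0                   # ideal point focus

for step in range(200):
    # sample a wavelength continuously over the visible band (450-700 nm)
    lam = 450e-9 + torch.rand(()) * 250e-9
    # single-FFT Fresnel propagation of a unit plane wave through the mask
    field = torch.exp(1j * phase) * torch.exp(1j * math.pi * R2 / (lam * f))
    spec = torch.fft.fftshift(torch.fft.fft2(field))
    psf = spec.real**2 + spec.imag**2          # focal-plane intensity
    loss = torch.nn.functional.mse_loss(psf / psf.sum(), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the wavelength is resampled at every iteration rather than fixed to a discrete set, the optimized profile is pushed toward a response that holds over the whole band, which is the point of the continuous spectral sampling.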
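The probabilistic reconstruction can likewise be sketched. Below is a hedged, diffusion-posterior-style sampler in PyTorch: a pretrained denoiser stands in for the implicit diffusion prior, and a data-consistency gradient against a differentiable forward model ties samples to the flat-camera measurement. `denoiser`, `forward_model`, the linear noise schedule, and the guidance weight are placeholders, not the paper's networks or training procedure.

```python
import torch

def reconstruct(measurement, denoiser, forward_model, steps=50, step_size=1.0):
    """Posterior-style sampling: `denoiser(x, s)` supplies the implicit image
    prior, while a data-consistency gradient ties the sample to `measurement`.
    Both callables are differentiable placeholders."""
    x = torch.randn_like(measurement)              # start from Gaussian noise
    sigmas = torch.linspace(1.0, 0.0, steps + 1)   # simple linear noise schedule
    for i in range(steps):
        s, s_next = sigmas[i], sigmas[i + 1]
        x = x.detach().requires_grad_(True)
        x0_hat = denoiser(x, s)                    # prior's clean-image estimate
        residual = forward_model(x0_hat) - measurement
        grad = torch.autograd.grad(residual.pow(2).sum(), x)[0]
        with torch.no_grad():
            eps = (x - x0_hat) / max(s.item(), 1e-8)  # implied noise direction
            x = x0_hat + s_next * eps                 # DDIM-style denoising step
            x = x - step_size * grad                  # measurement guidance
    return x.detach()
```

For a smoke test, `denoiser = lambda x, s: x / (1 + s**2)` and `forward_model = lambda x: x` run the loop end to end, though neither is a meaningful prior or camera model.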
Event Type: Technical Papers
Time: Thursday, 14 December 2023, 3:55pm - 4:10pm
Location: Meeting Room C4.8, Level 4 (Convention Centre)