dc.contributor.author | Alexander, Emma | |
dc.date.accessioned | 2019-08-08T09:11:44Z | |
dc.date.created | 2019-03 | |
dc.date.issued | 2019-01-25 | |
dc.date.submitted | 2019 | |
dc.identifier.citation | Alexander, Emma. 2019. A Theory of Depth From Differential Defocus. Doctoral dissertation, Harvard University, Graduate School of Arts & Sciences. | |
dc.identifier.uri | http://nrs.harvard.edu/urn-3:HUL.InstRepos:41121273 | * |
dc.description.abstract | This document describes a class of computationally efficient visual depth sensors. Inspired by the visual system of the jumping spider, these sensors are thin-lens cameras that observe small changes in optical defocus. Differential defocus changes can be generated by small changes in camera parameters such as aperture size, lens or photosensor location, optical power, camera position, or a combination of these parameters.
The defocus brightness constancy constraint describes the resulting differential change in image values as a weighted sum of the spatial derivatives of the blurred image, where the weights hold information of interest and no knowledge of the underlying scene's texture or geometry is required. This equation is a linear constraint on locally computed image values, so that solving it provides a highly efficient method for recovering depth and other scene information such as velocity.
This dissertation introduces the defocus brightness constancy constraint and describes its usefulness with regard to scene geometry and image content, both analytically and in practice. It also proves the constraint's uniqueness as the only texture-independent linear constraint on local image quantities for differential defocus in a coded aperture camera. The "aperture code" required for the constraint to hold exactly is Gaussian blur, and the method is shown in practice to be robust to nonidealities including non-Gaussian blur, regions with low signal-to-noise ratio, and optically complicated scenes involving reflective and transparent objects. This robustness is demonstrated with a pair of depth from differential defocus sensor prototypes. The first is a standard unactuated camera that observes a moving scene and provides patch-wise depth and velocity measurements. The second contains a lens with electronically adjustable optical power, and produces per-pixel depth and confidence measurements at 100 frames per second on a laptop GPU. | |
dc.description.sponsorship | Engineering and Applied Sciences - Computer Science | |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dash.license | LAA | |
dc.subject | computer vision | |
dc.subject | depth sensing | |
dc.subject | depth from defocus | |
dc.subject | optics | |
dc.subject | computational photography | |
dc.subject | jumping spider | |
dc.title | A Theory of Depth From Differential Defocus | |
dc.type | Thesis or Dissertation | |
dash.depositing.author | Alexander, Emma | |
dc.date.available | 2019-08-08T09:11:44Z | |
thesis.degree.date | 2019 | |
thesis.degree.grantor | Graduate School of Arts & Sciences | |
thesis.degree.level | Doctoral | |
thesis.degree.name | Doctor of Philosophy | |
dc.contributor.committeeMember | Zickler, Todd | |
dc.contributor.committeeMember | Gortler, Steven J. | |
dc.contributor.committeeMember | Horn, Berthold | |
dc.type.material | text | |
thesis.degree.department | Engineering and Applied Sciences - Computer Science | |
dash.identifier.vireo | | |
dash.author.email | alexanderemmab@gmail.com | |