Show simple item record

dc.contributor.author	Alexander, Emma
dc.date.accessioned	2019-08-08T09:11:44Z
dc.date.created	2019-03
dc.date.issued	2019-01-25
dc.date.submitted	2019
dc.identifier.citation	Alexander, Emma. 2019. A Theory of Depth From Differential Defocus. Doctoral dissertation, Harvard University, Graduate School of Arts & Sciences.
dc.identifier.uri	http://nrs.harvard.edu/urn-3:HUL.InstRepos:41121273
dc.description.abstract	This document describes a class of computationally efficient visual depth sensors. Inspired by the visual system of the jumping spider, these sensors are thin lens cameras that observe small changes in optical defocus. Differential defocus changes can be generated by small changes in camera parameters such as aperture size, lens or photosensor location, optical power, camera position, or a combination of these parameters. The defocus brightness constancy constraint describes the resulting differential change in image values as a weighted sum of the spatial derivatives of the blurred image, where the weights hold information of interest and no knowledge of the underlying scene's texture or geometry is required. This equation is a linear constraint on locally computed image values, so that solving it provides a highly efficient method for recovering depth and other scene information such as velocity. This dissertation introduces the defocus brightness constancy constraint and describes its usefulness with regard to scene geometry and image content, both analytically and in practice. It also proves the constraint's uniqueness as the only texture-independent linear constraint on local image quantities for differential defocus in a coded aperture camera. The "aperture code" required for the constraint to hold exactly is Gaussian blur, and the method is shown in practice to be robust to nonidealities including non-Gaussian blur, regions with low signal-to-noise ratio, and optically complicated scenes involving reflective and transparent objects. This robustness is demonstrated with a pair of depth from differential defocus sensor prototypes. The first is a standard unactuated camera that observes a moving scene and provides patch-wise depth and velocity measurements. The second contains a lens with electronically adjustable optical power, and produces per-pixel depth and confidence measurements at 100 frames per second on a laptop GPU.
dc.description.sponsorship	Engineering and Applied Sciences - Computer Science
dc.format.mimetype	application/pdf
dc.language.iso	en
dash.license	LAA
dc.subject	computer vision
dc.subject	depth sensing
dc.subject	depth from defocus
dc.subject	optics
dc.subject	computational photography
dc.subject	jumping spider
dc.title	A Theory of Depth From Differential Defocus
dc.type	Thesis or Dissertation
dash.depositing.author	Alexander, Emma
dc.date.available	2019-08-08T09:11:44Z
thesis.degree.date	2019
thesis.degree.grantor	Graduate School of Arts & Sciences
thesis.degree.level	Doctoral
thesis.degree.name	Doctor of Philosophy
dc.contributor.committeeMember	Zickler, Todd
dc.contributor.committeeMember	Gortler, Steven J.
dc.contributor.committeeMember	Horn, Berthold
dc.type.material	text
thesis.degree.department	Engineering and Applied Sciences - Computer Science
dash.identifier.vireo
dash.author.email	alexanderemmab@gmail.com

