Publication:
Reachable environments in the mind and brain: insights into the visual representation of near-scale spaces

Date

2021-05-12

The Harvard community has made this article openly available.
Citation

Josephs, Emilie Louise. 2021. Reachable environments in the mind and brain: insights into the visual representation of near-scale spaces. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.

Abstract

Much of our visual experience consists of rich, close-scale environments: imagine the view of your desk as you type an email, or of the kitchen counter as you prepare a meal. Vision science has identified mechanisms for processing views of individual objects and views of navigable-scale environments (i.e., "scenes"), but it is still unknown what mechanisms underlie our understanding of reachable-scale spaces (hereafter "reachspaces"). Across three papers, I explore how reachspaces are represented in the mind and brain, and provide evidence that they may require additional mechanisms distinct from object and scene processing.

In the first paper (Josephs & Konkle, 2019), I tested whether views of reachspaces differ systematically from views of scenes and objects in their visual statistics. Using computational measures, I found that reachspaces span a set of visual features distinct from those of scenes and objects. With behavioral experiments, I confirmed that human observers are sensitive to these differences, and that even across a wide variety of categories (e.g., office, bathroom, kitchen), reachspaces are more perceptually similar to each other than to views at other scales.

In the second paper, I explored the organization of our knowledge of the reachable world. I collected over 1 million similarity judgments on reachspace images and modeled them to discover the latent dimensions that shape these judgments. Overall, I found that reachspace similarity is well predicted by the function or purpose of the space, similar to previous findings for both objects and scenes. I next examined clusters in the similarity structure of this image set, and found evidence for conceptual divisions between five kinds of reachspaces: those related to eating, electronics, storage, hobbies, and chores. Finally, I found that a wide variety of dimensions contribute to these judgments, which only partially overlap with dimensions previously identified for scenes and objects.
In the third paper (Josephs & Konkle, 2020), I explored how reachspaces activate visual cortex, and whether their activation profiles differ from those of scenes and objects. Using functional neuroimaging, I found that reachspaces elicit preferential activity in ventral occipital and dorsal parietal regions, distinct from the regions that prefer scenes and objects. Further experiments found that these reachspace-preferring regions are strongly responsive to views of multiple objects. Altogether, this work has begun to identify the representations that support reachspace processing, and raises the possibility that it relies on different mechanisms than the processing of scenes or single objects.

Keywords

cognitive neuroscience, cortical organization, objects, reachspaces, scenes, vision, Cognitive psychology, Neurosciences, Psychology

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.
