Show simple item record

dc.contributor.author: Cohen, Michael Sharpe
dc.contributor.author: Alvarez, George Angelo
dc.contributor.author: Nakayama, Ken
dc.contributor.author: Konkle, Talia A
dc.date.accessioned: 2017-09-29T20:47:04Z
dc.date.issued: 2016
dc.identifier.citation: Cohen, Michael A., George A. Alvarez, Ken Nakayama, and Talia Konkle. 2016. “Visual Search for Object Categories Is Predicted by the Representational Architecture of High-Level Visual Cortex.” Journal of Neurophysiology 117 (1) (November 2): 388–402. doi:10.1152/jn.00569.2016.
dc.identifier.issn: 0022-3077
dc.identifier.uri: http://nrs.harvard.edu/urn-3:HUL.InstRepos:33973830
dc.description.abstract: Visual search is a ubiquitous visual behavior, and efficient search is essential for survival. Different cognitive models have explained the speed and accuracy of search based either on the dynamics of attention or on the similarity of item representations. Here, we examined the extent to which performance on a visual search task can be predicted from the stable representational architecture of the visual system, independent of attentional dynamics. Participants performed a visual search task with 28 conditions reflecting different pairs of categories (e.g., searching for a face amongst cars, a body amongst hammers, etc.). The time it took participants to find the target item varied as a function of category combination. In a separate group of participants, we measured the neural responses to these object categories when items were presented in isolation. Using representational similarity analysis, we then examined whether the similarity of neural responses across different subdivisions of the visual system had the requisite structure needed to predict visual search performance. Overall, we found strong brain/behavior correlations across most of the higher-level visual system, including both the ventral and dorsal pathways, when considering both macro-scale sectors and smaller meso-scale regions. These results suggest that visual search for real-world object categories is well predicted by the stable, task-independent architecture of the visual system.
dc.description.sponsorship: Psychology
dc.language.iso: en_US
dc.publisher: American Physiological Society
dc.relation.isversionof: doi:10.1152/jn.00569.2016
dash.license: OAP
dc.title: Visual search for object categories is predicted by the representational architecture of high-level visual cortex
dc.type: Journal Article
dc.description.version: Accepted Manuscript
dc.relation.journal: Journal of Neurophysiology
dash.depositing.author: Alvarez, George Angelo
dc.date.available: 2017-09-29T20:47:04Z
dc.identifier.doi: 10.1152/jn.00569.2016
workflow.legacycomments: FAR2016 file.contents -- are line numbers ok or should we request a better MS version?
dash.contributor.affiliated: Cohen, Michael Sharpe
dash.contributor.affiliated: Konkle, Talia
dash.contributor.affiliated: Nakayama, Ken
dash.contributor.affiliated: Alvarez, George

