Adaptation and Evidence Accumulation in Neural Codes
Citation
Rast, Luke. 2021. Adaptation and Evidence Accumulation in Neural Codes. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.
Abstract
Adaptation and evidence accumulation are two important processes that must be performed by neural codes. They represent means of responding to the changing environment on different timescales, and can be modeled well by efficient coding and Bayesian inference, respectively. In this thesis, we study in turn models of how each of these processes might be implemented in neural codes. We start with efficient codes, and develop a general efficient coding model that is flexible in its choice of encoding, objective, and constraints. This allows us to identify the influence of each of these factors on the neural codes and their adaptation. With such a general model in hand, we then show that both the objective and the constraint function that describe an observed neural code can be fit based on the adaptation properties of the code, and demonstrate how this could be accomplished by developing a novel experimental scheme. Such a method allows the efficient coding idea to be turned around, finding optimization problems that account for observed properties of neural codes. We then turn to the question of evidence accumulation and derive previously known filtering solutions for a fixed exponential family representation. We propose such a model for heading direction representations in the brain and propose a scheme to learn them in artificial neural networks.
Terms of Use
This article is made available under the terms and conditions applicable to Other Posted Material, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#LAA
Citable link to this page
https://nrs.harvard.edu/URN-3:HUL.INSTREPOS:37370253
Collections
- FAS Theses and Dissertations