Adaptation and Evidence Accumulation in Neural Codes
Citation: Rast, Luke. 2021. Adaptation and Evidence Accumulation in Neural Codes. Doctoral dissertation, Harvard University Graduate School of Arts and Sciences.
Abstract: Adaptation and evidence accumulation are two important processes that neural codes must perform. They represent means of responding to a changing environment on different timescales, and are well modeled by efficient coding and Bayesian inference, respectively. In this thesis, we study in turn models of how each of these processes might be implemented in neural codes. We begin with efficient codes, developing a general efficient coding model that is flexible in its choice of encoding, objective, and constraints. This flexibility allows us to identify the influence of each of these factors on neural codes and their adaptation. With such a general model in hand, we then show that both the objective and the constraint function describing an observed neural code can be fit from the adaptation properties of that code, and we demonstrate how this could be accomplished via a novel experimental scheme. This method turns the efficient coding idea around: rather than predicting codes from a given optimization problem, it finds optimization problems that account for observed properties of neural codes. We then turn to the question of evidence accumulation and derive previously known filtering solutions for a fixed exponential family representation. We propose such a model for heading-direction representations in the brain and describe a scheme for learning them in artificial neural networks.
Citable link: https://nrs.harvard.edu/URN-3:HUL.INSTREPOS:37370253
Collection: FAS Theses and Dissertations