Show simple item record

dc.contributor.author: Stothers, Duncan Bayard
dc.date.accessioned: 2020-08-28T09:33:08Z
dc.date.created: 2019-05
dc.date.issued: 2019-08-23
dc.date.submitted: 2019
dc.identifier.citation: Stothers, Duncan Bayard. 2019. Turing’s Child Machine: A Deep Learning Model of Neural Development. Bachelor's thesis, Harvard College.
dc.identifier.uri: https://nrs.harvard.edu/URN-3:HUL.INSTREPOS:37364616
dc.description.abstract: The connection between general intelligence and development was first raised by Turing in the 1950s. Noting the incredible complexity of engineering the adult mind, he proposed instead building a child machine with a simple mind that develops into a complex adult one. The goal of building an intelligent machine, then, is refocused around engineering the neurological stages of development. It was later discovered that during development the child’s brain undergoes pervasive network expansion, increasing by at least an order of magnitude in size, followed by activity-driven pruning, a process whose computational role is still unknown. In the parallel world of artificial intelligence research, hand-designing deep neural networks involves architecture decisions that are often guided by intuition and trial and error. Architectures typically stay fixed during parameter learning, a stark contrast to the extreme architecture modifications that take place during development in a child’s brain. Furthermore, it is widely understood that there is a ‘small network design problem’ when hand-designing deep networks: building a smaller deep architecture that generalizes as well as a big one requires much more effort, as well as more complex layers and connections. Here we model biological development using both densely connected and convolutional deep neural networks, as a means to further our understanding of both biological (neurological) intelligence and artificial (computational) intelligence. Empirical results suggest that the computational role of synaptic overgrowth and pruning in biology is that of an unsupervised architecture search process that finds exponentially smaller architectures that generalize well. The resulting ‘adult’ convolutional networks that develop this way also show similarities to hand-designed networks.
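The abstract describes synaptic overgrowth followed by activity-driven pruning as an unsupervised architecture search. The following is a minimal NumPy sketch of that idea only, not the thesis's actual experimental setup: a deliberately overgrown hidden layer is trained on a toy regression task and then pruned down to its most active units. The task, the pruning criterion (mean absolute activation), and the layer widths are illustrative assumptions.

# Minimal sketch of "overgrow, then prune by activity" as an unsupervised
# architecture search over hidden-layer width. All sizes and the pruning
# criterion are assumptions for illustration, not taken from the thesis.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x), which a small network can fit.
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X)

# "Child" network: deliberately overgrown hidden layer.
overgrown_width = 256
W1 = rng.normal(0, 0.5, size=(1, overgrown_width))
b1 = np.zeros(overgrown_width)
W2 = rng.normal(0, 0.5, size=(overgrown_width, 1))
b2 = np.zeros(1)

def forward(X, W1, b1, W2, b2):
    h = np.maximum(0.0, X @ W1 + b1)   # ReLU hidden layer
    return h, h @ W2 + b2

# Brief training with plain gradient descent on mean squared error.
lr = 1e-2
for step in range(2000):
    h, pred = forward(X, W1, b1, W2, b2)
    err = pred - y
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)        # backprop through the ReLU
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

# Activity-driven pruning: keep only the most active hidden units,
# yielding a much smaller "adult" architecture.
h, _ = forward(X, W1, b1, W2, b2)
activity = np.abs(h).mean(axis=0)      # per-unit mean absolute activation
adult_width = 16                       # hypothetical target size
keep = np.argsort(activity)[-adult_width:]

W1_adult, b1_adult = W1[:, keep], b1[keep]
W2_adult, b2_adult = W2[keep, :], b2

_, pred_adult = forward(X, W1_adult, b1_adult, W2_adult, b2_adult)
print("overgrown width:", overgrown_width, "-> adult width:", adult_width)
print("adult-network MSE:", float(np.mean((pred_adult - y) ** 2)))

In this sketch the pruned network keeps only the units that were most active on the data, so the final architecture is an order of magnitude smaller than the overgrown one while still fitting the toy task; this mirrors the overgrowth-then-pruning process the abstract attributes to development, but the real experiments use densely connected and convolutional networks on standard benchmarks.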
dc.description.sponsorship: Computer Science
dc.format.mimetype: application/pdf
dc.language.iso: en
dash.license: LAA
dc.title: Turing’s Child Machine: A Deep Learning Model of Neural Development
dc.type: Thesis or Dissertation
dash.depositing.author: Stothers, Duncan Bayard
dc.date.available: 2020-08-28T09:33:08Z
thesis.degree.date: 2019
thesis.degree.grantor: Harvard College
thesis.degree.level: Undergraduate
thesis.degree.name: AB
dc.type.material: text
thesis.degree.department: Computer Science
thesis.degree.discipline-joint: Mind Body Behavior
dash.identifier.vireo:
dc.identifier.orcid: 0000-0001-6873-851X
dash.author.email: duncanstothers3699@gmail.com

