Modulo: Accelerating Development of Neural Module Networks for Generalizable Natural Language Query Systems
Dhara, Raghu V.
Citation: Dhara, Raghu V. 2018. Modulo: Accelerating Development of Neural Module Networks for Generalizable Natural Language Query Systems. Bachelor's thesis, Harvard College.
Abstract: Neural module networks are shallow, recursive architectures capable of complex compositional reasoning. They are composed of smaller neural networks, each specialized to answer one specific part of an inference problem. With the assistance of a supplementary assembly network, these modules can be joined together on the fly into a computational graph tailored to a given query, producing an output response. This framework is drastically different from the static, deep, and inscrutable state-of-the-art neural architectures commonly employed across problem modalities.
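The dynamic composition described above can be illustrated with a minimal sketch. This is a hypothetical toy, not the `modulo` API: each "module" is a small function specialized for one sub-task, and a layout (which an assembly network would produce from the query in practice) wires modules into a query-specific computation graph.

```python
def find(scene, name):
    """Attention-like module: select objects matching a type name."""
    return [obj for obj in scene if obj["type"] == name]

def filter_color(objs, color):
    """Refinement module: narrow a selection by color."""
    return [obj for obj in objs if obj["color"] == color]

def count(objs):
    """Answer module: reduce a selection to a number."""
    return len(objs)

# Module inventory the assembler can draw from.
MODULES = {"find": find, "filter_color": filter_color, "count": count}

def execute(layout, scene):
    """Assemble and run the computation graph described by a nested layout.

    A layout is (module_name, argument, child_layout_or_None);
    a leaf module reads the scene directly.
    """
    name, arg, child = layout
    inputs = scene if child is None else execute(child, scene)
    fn = MODULES[name]
    return fn(inputs) if arg is None else fn(inputs, arg)

scene = [
    {"type": "cube", "color": "red"},
    {"type": "cube", "color": "blue"},
    {"type": "sphere", "color": "red"},
]
# "How many red cubes are there?" assembles to count(filter_color(find, red))
layout = ("count", None, ("filter_color", "red", ("find", "cube", None)))
print(execute(layout, scene))  # -> 1
```

In an actual neural module network the modules are differentiable sub-networks operating on attention maps rather than Python lists, but the on-the-fly graph assembly follows the same pattern.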
In this work, we formalize modular logic and inference. We then develop a neural module framework capable of responding to natural language queries, particularly regarding images. We propose and evaluate a novel algorithm for efficient model training despite the dynamic and unique computational graphs produced by these modular networks. Lastly, we provide an open source implementation of our work, called modulo, in order to encourage future development in this area. We believe that neural module networks are a promising lead in the race towards artificial generalizable inference capabilities, and that modulo is an ideal platform for developing, training, and evaluating such models.
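Training with a unique computational graph per example is expensive; one common way to recover efficiency (sketched here as an illustration, not necessarily the algorithm proposed in the thesis) is to bucket training examples whose assembled layouts are identical, so each distinct graph is built and executed once per batch rather than once per example.

```python
from collections import defaultdict

def bucket_by_layout(examples):
    """Group (layout, input) pairs so each distinct layout runs as one batch.

    `layout` is any hashable description of an assembled module graph;
    `input` is the example to feed through that graph.
    """
    buckets = defaultdict(list)
    for layout, inputs in examples:
        buckets[layout].append(inputs)
    return dict(buckets)

# Hypothetical examples: two queries share a layout, one does not.
examples = [
    ("count(find[cube])", "img0"),
    ("exists(find[sphere])", "img1"),
    ("count(find[cube])", "img2"),
]
buckets = bucket_by_layout(examples)
print(len(buckets))                   # -> 2 distinct graphs to execute
print(buckets["count(find[cube])"])   # -> ['img0', 'img2']
```

Under this scheme, gradient updates aggregate across buckets, amortizing graph construction over all examples that share a layout.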
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:39011597
Collections: FAS Theses and Dissertations