Publication:
Modulo: Accelerating Development of Neural Module Networks for Generalizable Natural Language Query Systems

Date

2018-06-29

Published Version

The Harvard community has made this article openly available.

Citation

Dhara, Raghu V. 2018. Modulo: Accelerating Development of Neural Module Networks for Generalizable Natural Language Query Systems. Bachelor's thesis, Harvard College.

Research Data

Abstract

Neural module networks are shallow, recursive architectures capable of complex compositional reasoning. They are composed of smaller neural networks, each specialized to answer one specific part of an inference problem. With the assistance of a supplementary assembly network, these modules can be joined on the fly into a computational graph tailored to a given query, producing an output response. This framework differs drastically from the static, deep, and inscrutable state-of-the-art neural architectures commonly employed across problem modalities. In this work, we formalize modular logic and inference. We then develop a neural module framework capable of responding to natural language queries, particularly queries about images. We propose and evaluate a novel algorithm for efficient model training despite the dynamic, query-specific computational graphs these modular networks produce. Lastly, we provide an open-source implementation of our work, called modulo, to encourage future development in this area. We believe that neural module networks are a promising lead in the race toward generalizable artificial inference capabilities, and that modulo is an ideal platform for developing, training, and evaluating such models.
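The assembly idea the abstract describes — small specialized modules wired on the fly into a per-query computational graph — can be illustrated with a minimal sketch. This is not the modulo implementation; the module names `find` and `count`, the dictionary scene representation, and the tuple-based layout are assumptions made for illustration. In a real neural module network each module is a small neural network and the layout is predicted by the assembly network.

```python
# Illustrative sketch of per-query module assembly (hypothetical names,
# not the modulo API). Each "module" handles one part of the inference
# problem; `execute` instantiates the layout into a computational graph.

def find(scene, name):
    # Attention-style module: select scene objects matching `name`.
    return [obj for obj in scene if obj["name"] == name]

def count(scene, objs):
    # Measurement module: reduce a selection to a number.
    return len(objs)

MODULES = {"find": find, "count": count}

def execute(layout, scene):
    """Recursively run the graph described by `layout`.

    `layout` is ("module_name", arg, ...); tuple arguments are
    sub-graphs evaluated recursively, anything else is a constant.
    """
    op, *args = layout
    args = [execute(a, scene) if isinstance(a, tuple) else a
            for a in args]
    return MODULES[op](scene, *args)

# "How many cats are in the scene?" -> count(find("cat"))
scene = [{"name": "cat"}, {"name": "dog"}, {"name": "cat"}]
answer = execute(("count", ("find", "cat")), scene)
```

A different query simply yields a different layout tuple, so the graph is rebuilt per query while the modules themselves are reused — the property that distinguishes this framework from a single static architecture.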

Keywords

Computer Science

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.
