Publication:
Scaling graph neural networks to larger graphs


Date

2022-05-23

Published Version


The Harvard community has made this article openly available.


Citation

Zhang, William. 2022. Scaling graph neural networks to larger graphs. Bachelor's thesis, Harvard College.

Research Data

Abstract

Graph-structured data appears abundantly in both the social and natural sciences. However, the common supervised-learning techniques in machine learning are not readily applicable to graph data because they require their inputs to be structured as feature vectors. The current paradigm for learning on graph data is the graph neural network (GNN). Computation in GNNs follows a pattern of message passing of vectors between neighboring nodes, followed by message aggregation. This thesis first motivates and then provides an overview of GNNs. It then discusses the connections between GNNs and Transformers, another recent machine learning model that has taken over the state of the art in many applications. We exploit this connection to develop a procedure that trades off model performance for training time and execution speed. By clustering nodes together, we can obtain approximations of the representation of each node, and we can leverage this approximation to train and predict with the models much faster. We evaluate this trade-off empirically. Additionally, we open-source our code for reproducibility of the experiments.
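The message-passing pattern the abstract describes can be sketched in a few lines: each node averages the feature vectors of its neighbors (the aggregation step) and then applies a learned linear transform. This is a minimal illustrative sketch in NumPy, not the thesis's implementation; all names (`message_passing`, `adj`, `feats`) are hypothetical.

```python
import numpy as np

def message_passing(adj, features, weight):
    """One round of mean-aggregation message passing: each node averages
    its neighbors' feature vectors, then applies a learned linear map
    followed by a ReLU nonlinearity. Illustrative only."""
    deg = adj.sum(axis=1, keepdims=True)      # node degrees
    deg[deg == 0] = 1.0                       # avoid division by zero for isolated nodes
    aggregated = (adj @ features) / deg       # mean over neighbors
    return np.maximum(aggregated @ weight, 0.0)

# Tiny example: a path graph on 3 nodes with one-hot node features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3)
rng = np.random.default_rng(0)
w = rng.standard_normal((3, 4))
out = message_passing(adj, feats, w)
print(out.shape)  # each node now has a 4-dimensional representation
```

Stacking several such rounds lets information propagate across multi-hop neighborhoods; the clustering procedure the abstract mentions would coarsen `adj` so that groups of nodes share one approximate representation, reducing the cost of each round.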

Keywords

Computer science

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth at Terms of Service
