Publication:

Synaptic Failure is a Flat Minima Optimizer

Date

2025-03-17

The Harvard community has made this article openly available.

Citation

Singh, Deepak. 2023. Synaptic Failure is a Flat Minima Optimizer. Bachelor's thesis, Harvard University Engineering and Applied Sciences.

Abstract

Synaptic failure is a well-known phenomenon in which weaker synapses fail more frequently. Whether this serves any purpose is still unknown. In this work, we modify Dropout to implement synaptic failure in artificial neural networks, through a novel activation function we call NormOut. NormOut sets a neuron's probability of successfully firing equal to the ratio $p$ of its activation to the maximum activation in some set of neurons. We propose variants inspired by lateral inhibition and firing thresholds, and show that they have markedly different effects on activation dynamics. We find that NormOut improves the performance of a baseline VGG-16 on CIFAR-10, with one variant outperforming Dropout, achieving both better test accuracy and a significantly flatter minimum. In addition to the effect on overfitting, we explore NormOut's impact on adversarial robustness against a suite of white- and black-box attacks. Intriguingly, we find that some variants of NormOut produce extreme gradient masking without obfuscation. Rather than masking through flattening, we find that these variants actually induce high curvature in the loss landscape, suggesting an as-yet-unknown form of gradient masking. Overall, we show that simply modelling synaptic failure in two layers has a significant impact on the topology of the loss landscape, with the best implementations of synaptic failure optimizing strongly for flat minima. We claim this as evidence that synaptic failure is a feature, and not a bug, of the brain.
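The core NormOut rule described above — each neuron fires with probability equal to its activation divided by the maximum activation in the layer — can be sketched as follows. This is a minimal illustration based only on the abstract's description (the function name `normout` and the single-layer framing are assumptions; the thesis's variants with lateral inhibition and firing thresholds are not implemented here):

```python
import numpy as np

def normout(activations, rng=None):
    """Sketch of the basic NormOut rule: each neuron fires with
    probability p_i = a_i / max(a). Assumes nonnegative inputs
    (e.g. post-ReLU activations)."""
    rng = np.random.default_rng() if rng is None else rng
    a = np.asarray(activations, dtype=float)
    m = a.max()
    if m <= 0:
        return np.zeros_like(a)  # no neuron can fire if all activations are zero
    p = a / m                    # firing probabilities in [0, 1]
    mask = rng.random(a.shape) < p
    return a * mask              # failed synapses transmit nothing

# The maximally active neuron has p = 1 and always survives;
# a zero activation has p = 0 and never fires.
out = normout([0.0, 1.0, 4.0], rng=np.random.default_rng(0))
```

Note that, unlike standard Dropout's fixed drop rate, the drop probability here is data-dependent: weaker activations fail more often, mirroring the biological observation the abstract starts from.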

Keywords

dropout, flatness, normalization, synaptic failure, Artificial intelligence, Neurosciences

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the Terms of Service.
