Stable and Efficient Representation Learning with Nonnegativity Constraints
Date
2014
Authors
Lin, T. H.
Kung, H. T.
Publisher
Journal of Machine Learning Research
Citation
Lin, T. H., and H. T. Kung. 2014. "Stable and Efficient Representation Learning with Nonnegativity Constraints." In Proceedings of the 31st International Conference on Machine Learning (ICML 2014), Beijing, China, June 22-24, 2014. Journal of Machine Learning Research: W&CP 32: 1323-1331.
Abstract
Orthogonal matching pursuit (OMP) is an efficient approximation algorithm for computing sparse representations. However, prior research has shown that the representations computed by OMP may be of inferior quality, as they deliver suboptimal classification accuracy on several image datasets. We have found that this problem is caused by OMP's relatively weak stability under data variations, which leads to unreliability in supervised classifier training. We show that a nonnegative variant of OMP (NOMP), obtained by imposing a simple nonnegativity constraint, mitigates OMP's stability issue and is resistant to noise overfitting. In this work, we provide extensive analysis and experimental results to examine and validate the stability advantage of NOMP. In our experiments, we use a multi-layer deep architecture for representation learning, where we use K-means for feature learning and NOMP for representation encoding. The resulting learning framework is not only efficient and scalable to large feature dictionaries, but is also robust against input noise. This framework achieves state-of-the-art accuracy on the STL-10 dataset.
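To make the encoding step concrete, the following is a minimal sketch of a nonnegative OMP encoder, assuming NumPy and SciPy; the function name and structure are illustrative rather than the authors' implementation. In contrast to standard OMP, it greedily selects the dictionary atom most positively correlated with the residual and refits the active coefficients with nonnegative least squares:

import numpy as np
from scipy.optimize import nnls

def nomp_encode(D, x, k):
    # Hypothetical NOMP encoder (illustrative, not the paper's code).
    # D: (n, m) dictionary with unit-norm columns (e.g., K-means centroids)
    # x: (n,) input vector; k: maximum number of nonzero coefficients
    residual = x.astype(float).copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        # Unlike OMP, consider only positive correlations with the residual.
        corr = D.T @ residual
        j = int(np.argmax(corr))
        if corr[j] <= 0:
            break  # no atom reduces the residual with a nonnegative weight
        if j not in support:
            support.append(j)
        # Refit all selected coefficients with nonnegative least squares.
        c, _ = nnls(D[:, support], x)
        coeffs[:] = 0.0
        coeffs[support] = c
        residual = x - D[:, support] @ c
    return coeffs

# Example: encode one input against a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
code = nomp_encode(D, rng.standard_normal(64), k=10)

The nonnegative refit is the only change relative to OMP's least-squares step, which is why the variant retains OMP's greedy efficiency.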
Terms of Use
This article is made available under the terms and conditions applicable to Open Access Policy Articles (OAP), as set forth in the Terms of Service.