Deep Sparse-coded Network (DSN)
Citation: Gwon, Youngjune, Miriam Cha, and H. T. Kung. 2016. Deep Sparse-coded Network (DSN). In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, December 4-8, 2016. doi: 10.1109/ICPR.2016.7900029
Abstract: We introduce the Deep Sparse-coded Network (DSN), a deep architecture based on sparse coding and dictionary learning. The key advantage of our approach is two-fold. First, by interlacing max pooling with sparse coding layers, we achieve nonlinear activation analogous to that of neural networks while suffering less from diminished gradients. Second, we use a novel backpropagation algorithm to fine-tune the DSN beyond the layer-by-layer pretraining with sparse coding and dictionary learning. We build an experimental 4-layer DSN with ℓ1-regularized LARS and greedy-ℓ0 OMP and demonstrate superior performance over a deep stacked autoencoder on CIFAR-10.
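The building block the abstract describes — a sparse coding layer (here via greedy ℓ0 Orthogonal Matching Pursuit) followed by max pooling over the sparse codes — can be sketched minimally in NumPy. This is an illustrative sketch under stated assumptions, not the paper's implementation: the dictionary is random rather than learned, and pooling keeps the signed coefficient with the largest magnitude in each non-overlapping group (one common convention).

```python
import numpy as np

def omp(D, x, k):
    """Greedy l0 OMP: approximate x with D @ z, allowing at most k nonzero
    entries in z. D has unit-norm columns (atoms)."""
    residual = x.copy()
    support = []
    z = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit of the coefficients on the current support.
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coeffs
    z[support] = coeffs
    return z

def max_pool(z, pool_size):
    """Pool sparse codes in non-overlapping groups of pool_size, keeping the
    signed value of the largest-magnitude coefficient in each group
    (a modeling assumption; the paper's exact pooling may differ)."""
    groups = z.reshape(-1, pool_size)
    winners = np.argmax(np.abs(groups), axis=1)
    return groups[np.arange(groups.shape[0]), winners]

# Toy forward pass through one sparse-coding + pooling layer.
rng = np.random.default_rng(0)
D = rng.normal(size=(16, 32))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x = 2.0 * D[:, 3] - 1.5 * D[:, 10]     # a 2-sparse signal
z = omp(D, x, k=2)                      # sparse code (<= 2 nonzeros)
h = max_pool(z, pool_size=4)            # pooled activation, length 8
```

Stacking such layers and learning each dictionary in turn gives the layer-by-layer pretraining the abstract refers to; the paper's contribution is then fine-tuning the whole stack with backpropagation through the pooling nonlinearity.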
Citable link: http://nrs.harvard.edu/urn-3:HUL.InstRepos:34903188
Collection: FAS Scholarly Articles