Deep Sparse-coded Network (DSN)

Title: Deep Sparse-coded Network (DSN)
Author: Cha, Miriam; Gwon, Youngjune Lee (ORCID: 0000-0002-2292-7320); Kung, H. T.

Note: Order does not necessarily reflect citation order of authors.

Citation: Gwon, Youngjune, Miriam Cha, and H. T. Kung. 2016. Deep Sparse-coded Network (DSN). In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, December 4-8, 2016. doi: 10.1109/ICPR.2016.7900029
Abstract: We introduce the Deep Sparse-coded Network (DSN), a deep architecture based on sparse coding and dictionary learning. The key advantage of our approach is two-fold. By interlacing max pooling with sparse coding layers, we achieve nonlinear activation analogous to neural networks while suffering less from diminished gradients. We use a novel backpropagation algorithm to fine-tune our DSN beyond the pretraining done by layer-by-layer sparse coding and dictionary learning. We build an experimental 4-layer DSN with ℓ1-regularized LARS and greedy-ℓ0 OMP and demonstrate superior performance over a deep stacked autoencoder on CIFAR-10.
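The abstract's core building block, a sparse coding layer followed by max pooling as the nonlinearity, can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: it encodes inputs against a fixed random dictionary with a toy greedy-ℓ0 OMP (one of the two encoders the paper mentions) and then max-pools adjacent sparse codes; the function names, shapes, and the signed-magnitude pooling rule are all assumptions for illustration.

```python
import numpy as np

def omp(D, x, k):
    """Toy greedy-l0 Orthogonal Matching Pursuit: approximate x with k atoms of D."""
    residual = x.copy()
    support = []
    code = np.zeros(D.shape[1])
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit of x on the selected support
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        code[:] = 0.0
        code[support] = coef
        residual = x - D @ code
    return code

def sparse_layer(D, X, k, pool=2):
    """Encode each column of X with OMP, then max-pool codes in groups of
    `pool` columns -- the pooling acts as the nonlinear activation between
    sparse coding layers (an assumed, simplified reading of the design)."""
    Z = np.stack([omp(D, X[:, i], k) for i in range(X.shape[1])], axis=1)
    n_groups = Z.shape[1] // pool
    pooled = np.zeros((Z.shape[0], n_groups))
    for g in range(n_groups):
        block = Z[:, g * pool:(g + 1) * pool]
        # signed max by magnitude, so negative coefficients survive pooling
        idx = np.argmax(np.abs(block), axis=1)
        pooled[:, g] = block[np.arange(Z.shape[0]), idx]
    return pooled

rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)      # unit-norm dictionary atoms
X = rng.standard_normal((16, 4))    # four 16-dimensional input vectors
H = sparse_layer(D, X, k=3, pool=2)
print(H.shape)                      # pooled sparse codes feed the next layer
```

In a deep stack, the pooled codes `H` would be encoded again by the next layer's dictionary; the paper's contribution is backpropagating through such a stack to fine-tune it end to end.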
Published Version: doi:10.1109/ICPR.2016.7900029
Terms of Use: This article is made available under the terms and conditions applicable to Other Posted Material, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#LAA
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:34903188