dc.contributor.author | Dahl, George E. | |
dc.contributor.author | Adams, Ryan Prescott | |
dc.contributor.author | Larochelle, Hugo | |
dc.date.accessioned | 2013-12-13T15:29:56Z | |
dc.date.issued | 2012 | |
dc.identifier | Quick submit: 2013-08-08T12:15:38-04:00 | |
dc.identifier.citation | Dahl, George E., Ryan Prescott Adams, and Hugo Larochelle. 2012. Training restricted Boltzmann machines on word observations. In Proceedings of the 29th International Conference on Machine Learning, Edinburgh, Scotland, June 26 – July 1, 2012, ed. John Langford and Joelle Pineau, 679-686. Edinburgh: International Machine Learning Society. | en_US |
dc.identifier.isbn | 9781450312851 | en_US |
dc.identifier.uri | http://nrs.harvard.edu/urn-3:HUL.InstRepos:11375693 | |
dc.description.abstract | The restricted Boltzmann machine (RBM) is a flexible tool for modeling complex data; however, there have been significant computational difficulties in using RBMs to model high-dimensional multinomial observations. In natural language processing applications, words are naturally modeled by K-ary discrete distributions, where K is determined by the vocabulary size and can easily be in the hundreds of thousands. The conventional approach to training RBMs on word observations is limited because it requires sampling the states of K-way softmax visible units during block Gibbs updates, an operation that takes time linear in K. In this work, we address this issue by employing a more general class of Markov chain Monte Carlo operators on the visible units, yielding updates with computational complexity independent of K. We demonstrate the success of our approach by training RBMs on hundreds of millions of word n-grams using larger vocabularies than previously feasible and using the learned features to improve performance on chunking and sentiment classification tasks, achieving state-of-the-art results on the latter. | en_US
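The abstract above describes replacing block Gibbs updates of K-way softmax visible units (cost linear in K) with MCMC operators whose per-update cost is independent of K. Below is a minimal sketch of that idea using a Metropolis-Hastings step with a fixed proposal distribution; the variable names (W, b, h) and the uniform proposal are illustrative assumptions, not the paper's exact operator.

```python
# Hedged sketch (not the authors' code): one Metropolis-Hastings update for a
# K-ary softmax visible unit in an RBM, showing why the per-update cost can be
# independent of the vocabulary size K.
import numpy as np

def mh_update_word(k_current, h, W, b, rng):
    """One Metropolis-Hastings step for a single word observation.

    k_current : int, current word index (0 <= k < K)
    h         : (H,) binary hidden state
    W         : (K, H) weights connecting each word to the hidden units
    b         : (K,)   visible biases
    """
    K = W.shape[0]

    # Propose a new word from a fixed distribution that does not depend on the
    # current state. A uniform proposal costs O(1); a unigram proposal drawn
    # with the alias method would also be O(1) after preprocessing.
    k_proposed = rng.integers(K)

    # Unnormalized log-probabilities of the two words given the hidden state.
    # Each costs O(H), independent of K -- unlike block Gibbs, which must
    # evaluate and normalize over all K words.
    log_p_current = b[k_current] + h @ W[k_current]
    log_p_proposed = b[k_proposed] + h @ W[k_proposed]

    # The uniform proposal is symmetric, so the acceptance ratio reduces to
    # the ratio of unnormalized probabilities.
    if np.log(rng.random()) < log_p_proposed - log_p_current:
        return k_proposed
    return k_current

# Toy usage: a vocabulary of 100k words, 128 hidden units.
rng = np.random.default_rng(0)
K, H = 100_000, 128
W = rng.normal(scale=0.01, size=(K, H))
b = np.zeros(K)
h = rng.integers(0, 2, size=H)
word = 42
for _ in range(10):   # a short chain of MH steps for this visible unit
    word = mh_update_word(word, h, W, b, rng)
```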
dc.description.sponsorship | Engineering and Applied Sciences | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | International Machine Learning Society | en_US |
dc.relation.isversionof | http://icml.cc/2012/papers/364.pdf | en_US |
dc.relation.hasversion | http://arxiv.org/pdf/1202.5695v2.pdf | en_US |
dash.license | OAP | |
dc.subject | learning | en_US |
dc.subject | machine learning | en_US |
dc.title | Training Restricted Boltzmann Machines on Word Observations | en_US |
dc.type | Conference Paper | en_US |
dc.date.updated | 2013-08-08T16:16:08Z | |
dc.description.version | Author's Original | en_US |
dc.rights.holder | George E. Dahl; Ryan Prescott Adams; Hugo Larochelle | |
dash.depositing.author | Adams, Ryan Prescott | |
dc.date.available | 2013-12-13T15:29:56Z | |
dc.relation.book | Proceedings of the 29th International Conference on Machine Learning | en_US |
workflow.legacycomments | FLAG2 I'm not sure about posting the publisher's version. It's possible, even likely, though, that the "publisher's version" is identical to the manuscript in this case. This looks to me like it was built with a LaTeX template beforehand and was not altered afterwards. No page numbers, etc. If this is actually a manuscript, we can post OAP. Committed 12/13/13 by eek: per the CS convention of using LaTeX and the statement on the first page of "Appearing in...", making a call to deposit this OAP. | en_US
dash.contributor.affiliated | Adams, Ryan Prescott | |