
    • Antecedent Prediction Without a Pipeline 

      Wiseman, Sam Joshua; Rush, Alexander Matthew; Shieber, Stuart Merrill (Association for Computational Linguistics, 2016)
      We consider several antecedent prediction models that use no pipelined features generated by upstream systems. Models trained in this way are interesting because they allow for side-stepping the intricacies of upstream ...
    • Challenges in Data-to-Document Generation 

      Wiseman, Sam Joshua; Shieber, Stuart Merrill; Rush, Alexander Matthew (Association for Computational Linguistics, 2017)
      Recent neural models have shown significant progress on the problem of generating short descriptive texts conditioned on a small number of database records. In this work, we suggest a slightly more difficult data-to-text ...
    • Induction of Probabilistic Synchronous Tree-Insertion Grammars 

      Nesson, Rebecca; Shieber, Stuart Merrill; Rush, Alexander Matthew (2005)
      Increasingly, researchers developing statistical machine translation systems have moved to incorporate syntactic structure in the models that they induce. These researchers are motivated by the intuition that the limitations ...
    • Learning Anaphoricity and Antecedent Ranking Features for Coreference Resolution 

      Wiseman, Sam Joshua; Rush, Alexander Matthew; Shieber, Stuart Merrill; Weston, Jason (Association for Computational Linguistics, 2015)
      We introduce a simple, non-linear mention-ranking model for coreference resolution that attempts to learn distinct feature representations for anaphoricity detection and antecedent ranking, which we encourage by pre-training ...
    • Sentence-Level Grammatical Error Identification as Sequence-to-Sequence Correction

      Schmaltz, Allen Richard; Kim, Yoon; Rush, Alexander Matthew; Shieber, Stuart Merrill (Association for Computational Linguistics, 2016)
      We demonstrate that an attention-based encoder-decoder model can be used for sentence-level grammatical error identification for the Automated Evaluation of Scientific Writing (AESW) Shared Task 2016. The attention-based ...
    • Word Ordering Without Syntax 

      Schmaltz, Allen Richard; Rush, Alexander Matthew; Shieber, Stuart Merrill (Association for Computational Linguistics, 2016)
      Recent work on word ordering has argued that syntactic structure is important, or even required, for effectively recovering the order of a sentence. We find that, in fact, an n-gram language model with a simple heuristic ...
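The last item above recovers the order of a sentence from a bag of words using an n-gram language model with search. As a rough, hedged illustration of that general setup (not the authors' implementation), the sketch below orders a bag of words by beam search under a toy add-one-smoothed bigram model; the corpus, beam size, and example sentence are invented for illustration.

from collections import defaultdict
from math import log

# Toy bigram model estimated from a tiny invented corpus (add-one smoothing).
corpus = [
    "<s> the dog chased the cat </s>",
    "<s> the cat saw the dog </s>",
]
unigrams, bigrams = defaultdict(int), defaultdict(int)
vocab = set()
for line in corpus:
    toks = line.split()
    vocab.update(toks)
    for a, b in zip(toks, toks[1:]):
        unigrams[a] += 1
        bigrams[(a, b)] += 1

def bigram_logprob(prev, word):
    # Add-one smoothed log P(word | prev).
    return log((bigrams[(prev, word)] + 1) / (unigrams[prev] + len(vocab)))

def order_words(bag, beam_size=8):
    # Beam search over partial orderings; a hypothesis is (score, chosen, remaining).
    beam = [(0.0, ["<s>"], list(bag))]
    for _ in range(len(bag)):
        candidates = []
        for score, chosen, remaining in beam:
            for i, w in enumerate(remaining):
                candidates.append((score + bigram_logprob(chosen[-1], w),
                                   chosen + [w],
                                   remaining[:i] + remaining[i + 1:]))
        beam = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_size]
    # Close each complete hypothesis with </s> and return the best ordering.
    best = max(beam, key=lambda c: c[0] + bigram_logprob(c[1][-1], "</s>"))
    return best[1][1:]

print(order_words(["cat", "the", "chased", "dog", "the"]))
# -> ['the', 'dog', 'chased', 'the', 'cat'] under this toy model

In practice the paper's setting involves far stronger language models and additional scoring heuristics; the sketch only shows how an ordering can be searched for without explicit syntactic structure.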