CS671: Homework 3 - Paper Review

Assignment description can be found here.

Paper Description

ID: 26
Title: Parsing with Compositional Vector Grammars [PDF]
Authors: Richard Socher, John Bauer, Christopher D. Manning, and Andrew Y. Ng
Presentation: [slides]
Abstract:
Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation does not capture the
full syntactic or semantic richness of linguistic phrases, and attempts to improve on this by lexicalizing phrases or splitting categories only partly
address the problem at the cost of huge feature spaces and sparseness. Instead, we introduce a Compositional Vector Grammar (CVG), which
combines PCFGs with a syntactically untied recursive neural network that learns syntactico-semantic, compositional vector representations. The CVG
improves the PCFG of the Stanford Parser by 3.8% to obtain an F1 score of 90.4%. It is fast to train and, implemented approximately as an efficient
reranker, is about 20% faster than the current Stanford factored parser. The CVG learns a soft notion of head words and improves performance
on the types of ambiguities that require semantic information such as PP attachments.
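The core composition step the abstract alludes to can be sketched as follows: each parent node's vector is computed from its two children's vectors with a weight matrix chosen by the children's syntactic categories ("syntactically untied"). This is a minimal illustrative sketch, not the paper's implementation; the dimension, category pairs, and random weights below are hypothetical placeholders.

```python
import numpy as np

np.random.seed(0)
d = 4  # illustrative embedding dimension (hypothetical; the paper uses larger vectors)

# Syntactically untied weights: one composition matrix per child-category pair (A, B)
W = {
    ("DT", "NN"): np.random.randn(d, 2 * d) * 0.1,
    ("NP", "VP"): np.random.randn(d, 2 * d) * 0.1,
}

def compose(cat_a, vec_a, cat_b, vec_b):
    """Parent vector for children (cat_a, vec_a) and (cat_b, vec_b):
    p = tanh(W_{A,B} [a; b]), the untied recursive composition step."""
    return np.tanh(W[(cat_a, cat_b)] @ np.concatenate([vec_a, vec_b]))

# Compose "the" (DT) with "dog" (NN) into an NP vector of the same dimension
the_vec, dog_vec = np.random.randn(d), np.random.randn(d)
np_vec = compose("DT", the_vec, "NN", dog_vec)
print(np_vec.shape)  # (4,)
```

Because each category pair gets its own matrix, the model can learn, for example, that the noun child dominates a DT-NN composition, which is the "soft notion of head words" the abstract mentions.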

Class Presentation

ID: 17
Title: An Exploration of Embeddings for Generalized Phrases [PDF]
Authors: Wenpeng Yin and Hinrich Schütze
Presentation: [slides] [source]
Abstract:
Deep learning embeddings have been successfully used for many natural language processing problems. Embeddings are mostly computed for word 
forms although lots of recent papers have extended this to other linguistic units like morphemes and word sequences. In this paper, we define the 
concept of generalized phrase that includes conventional linguistic phrases as well as skip-bigrams. We compute embeddings for generalized phrases and 
show in experimental evaluations on coreference resolution and paraphrase identification that such embeddings perform better than word form embeddings.
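A generalized phrase, as defined in the abstract, covers conventional phrases as well as skip-bigrams: ordered word pairs that may have intervening words between them. A minimal sketch of skip-bigram extraction (the function name and `max_gap` parameter are my own, for illustration):

```python
def skip_bigrams(tokens, max_gap=2):
    """Ordered token pairs with at most max_gap intervening words;
    conventional bigrams are the zero-gap special case."""
    pairs = []
    for i, w1 in enumerate(tokens):
        # second element may sit up to max_gap positions past the adjacent word
        for j in range(i + 1, min(i + 2 + max_gap, len(tokens))):
            pairs.append((w1, tokens[j]))
    return pairs

print(skip_bigrams(["the", "cat", "sat"], max_gap=1))
# [('the', 'cat'), ('the', 'sat'), ('cat', 'sat')]
```

Embeddings for such units could then be trained the same way as word embeddings, by treating each extracted pair as a single vocabulary item.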

All Presentations

9th September
  1. [slides] [paper] Multimodal Neural Language Models
  2. [slides] [paper] Embeddings for Generalized Phrases
  3. [slides] [paper] Adjective Meanings with a Tensor-Based Skip-Gram Model
  4. [slides] [paper] Compositional Perceptually Grounded Word Learning

11th September
  1. [slides] [paper] Abstractive Multi-Document Summarization
  2. [slides] [paper] Modeling Argument Strength in Student Essays
  3. [slides] [paper] Event-Driven Headline Generation
  4. [slides] [paper] Visualizing and Understanding Neural Models in NLP

Hrishikesh Terdalkar

CSE - 14111265
{ hrishirt } @ iitk