Trip blog - CMU & NAACL 2016

Pittsburgh & San Diego, USA
June 08 - June 17, 2016

Sharmistha's account of Carnegie Mellon University (CMU) and NAACL 2016 Conference visit

Carnegie Mellon University Visit

I visited the NELL research group at CMU and interacted with its members. We had interesting discussions about the group's current research directions. In particular, we spoke about ProPPR, structure learning, micro-reading, and applications of knowledge bases. An interesting direction of work appears to be the automatic inference of ontologies in new domains, with applications in question answering.

I would like to thank Derry Wijaya for hosting me at CMU.

Conference Introduction

The North American Chapter of the Association for Computational Linguistics (NAACL) hosts a well-known conference in the field of NLP. The conference featured interesting discussions around current NLP challenges, and deep learning was a recurring theme in this year's presentations. The six days of the conference were spread across tutorials, the main conference, and workshops. I presented my work on domain adaptation as well as our SEMEVAL iSTS challenge paper.


There were interesting keynotes by Regina Barzilay and Ehud Reiter. Prof. Barzilay spoke on the topic "How Can NLP Help Cure Cancer?", arguing for reliable information extraction from medical text and for the construction of interpretable models to assist doctors. Prof. Reiter spoke about evaluating natural language generation systems.


There were a number of interesting workshops hosted at NAACL. I attended the SEMEVAL and AKBC workshops.

AKBC had the largest number of keynotes, with lots of information packed in. Most of the discussions revolved around the general theme of knowledge bases and their construction. Prof. Andrew McCallum spoke about universal schema, and Prof. Christopher Manning presented work on reading comprehension. Prof. Benjamin Van Durme mentioned a DARPA project to construct knowledge bases on the fly, as well as Universal Decompositional Semantics.

Interesting Papers and Demonstrations

NAACL had a number of very interesting papers. Apart from the best papers, the following fascinated me:

1. Recurrent Neural Network Grammars
Authors: Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, Noah A. Smith
The basic idea of the paper is that language is hierarchical in nature; modelling this structure explicitly with recurrent networks leads to better parsing and language modelling.

2. Learning Composition Models for Phrase Embeddings [TACL]
Authors: Mo Yu and Mark Dredze
Proposes a novel task-specific learning objective for composing word embeddings into phrase embeddings.

3. System Demonstrations: Illinois Math Solver: Math Reasoning on the Web
Authors: Subhro Roy and Dan Roth
Answers queries like "I bought 6 apples and ate 3, how many do I have left?"


I thank my XRCI collaborators for the NAACL work and Lavanya Sita Tekumalla for the SEMEVAL project. It was fun presenting the work at the conference. I would also like to thank Prof. Partha Pratim Talukdar and GARP (Government of India funds) for funding this trip to NAACL 2016.