Stack-LSTM Joint Syntactic and Semantic Parser

Transition-based system that uses LSTMs to infer semantic roles and syntactic trees, in collaboration with Swabha Swayamdipta, Chris Dyer and Noah Smith. The system will be presented at the CoNLL 2016 conference in this paper.

Stack-LSTM Named Entity Recognizer

Transition-based system that uses LSTMs to infer named entities in plain text, in collaboration with Guillaume Lample, Sandeep Subramanian, Kazuya Kawakami and Chris Dyer. The system will be presented at the NAACL 2016 conference in this paper.

Language Universal Transition-based parser

This system uses LSTMs to learn parser state representations and multilingual word embeddings, in collaboration with Waleed Ammar, George Mulcaire, Chris Dyer and Noah Smith. The system is described in this TACL 2016 article and will be presented at the ACL 2016 conference.

Recurrent Neural Network Grammars

This system uses LSTMs for language modeling and phrase-structure parsing with a generative model, in collaboration with Chris Dyer, Adhiguna Kuncoro and Noah Smith. The system will be presented at the NAACL 2016 conference in this paper.

LSTM Transition-based Parser

Arc-standard transition-based parser that uses LSTMs to learn parser state representations, in collaboration with Chris Dyer, Wang Ling, Austin Matthews and Noah Smith. The LSTM parser will be presented at the ACL 2015 conference in this paper.
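For readers unfamiliar with arc-standard parsing, here is a minimal sketch of the transition system (SHIFT, LEFT-ARC, RIGHT-ARC) that the parser builds on; it illustrates only the transitions, not the stack-LSTM model, and the `oracle` argument is a hypothetical stand-in for the learned classifier:

```python
# Minimal, illustrative arc-standard transition system; not the stack-LSTM
# model itself. Tokens are indices; arcs are (head, dependent) pairs.

def parse(words, oracle):
    """Apply transitions chosen by `oracle`, a hypothetical stand-in for
    the learned classifier, mapping (stack, buffer) to an action name."""
    stack, buffer, arcs = [], list(range(len(words))), []
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == 'SHIFT':             # move the next token onto the stack
            stack.append(buffer.pop(0))
        elif action == 'LEFT':            # head = stack top, dep = second item
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == 'RIGHT':           # head = second item, dep = stack top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Example derivation for "the cat sleeps" (cat -> the, sleeps -> cat):
actions = iter(['SHIFT', 'SHIFT', 'LEFT', 'SHIFT', 'LEFT'])
print(parse(['the', 'cat', 'sleeps'], lambda s, b: next(actions)))
# [(1, 0), (2, 1)]
```

In the actual parser the oracle is replaced by a classifier over stack-LSTM representations of the stack, the buffer and the action history.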

Deep-Syntactic Parser

Dependency parser that delivers deep syntactic structures (in the Meaning-Text Theory sense) as output, in collaboration with Bernd Bohnet, Simon Mille and Leo Wanner. The deep parser was presented at the COLING 2014 conference in this paper. You can try an online version of the parser at http://dparse.multisensor.taln.upf.edu/main, which will be presented as a demo paper at the NAACL-HLT 2015 conference.

Data-driven sentence generator

Data-driven sentence generation that copes with the projection between non-isomorphic structures, i.e., structures that differ in their topology and number of nodes: the abstract structure from which generation naturally starts often does not contain any functional nodes, while a surface-syntactic structure or a chain of tokens in a linearized tree contains all of them. In collaboration with Bernd Bohnet, Simon Mille and Leo Wanner. The system was presented at the NAACL-HLT 2015 conference in this paper.

MaltOptimizer

MaltOptimizer is a freely available tool, developed in collaboration with Joakim Nivre, that facilitates parser optimization using the open-source system MaltParser (MaltOptimizer website). It was presented at LREC 2012 (paper) and EACL 2012 (paper). A version for morphologically rich languages is presented in this paper. There is also an article in the Natural Language Engineering journal.

Dependency Parser

This is a simple implementation of Nivre's arc-eager parsing algorithm (see this paper) using LibSVM as the machine learner. It is only ~500 lines of Java code. It uses some of the features found useful in MaltOptimizer and the root position proposed in this article (find the parser here).
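To show what the arc-eager transition system looks like, here is an illustrative sketch of the algorithm (not the Java/LibSVM implementation itself); unlike arc-standard, it attaches dependents eagerly and adds a REDUCE action, and `oracle` is a hypothetical stand-in for the trained classifier:

```python
# Illustrative sketch of Nivre's arc-eager transition system; not the
# Java/LibSVM implementation described above. Arcs are (head, dep) pairs.

def parse_arc_eager(words, oracle):
    stack, buffer, arcs = [], list(range(len(words))), []
    has_head = set()                      # tokens already attached to a head
    while buffer:
        action = oracle(stack, buffer)
        if action == 'SHIFT':             # push the next token onto the stack
            stack.append(buffer.pop(0))
        elif action == 'LEFT':            # head = buffer front, pop stack top
            dep = stack.pop()
            arcs.append((buffer[0], dep))
            has_head.add(dep)
        elif action == 'RIGHT':           # head = stack top, push buffer front
            dep = buffer.pop(0)
            arcs.append((stack[-1], dep))
            has_head.add(dep)
            stack.append(dep)
        elif action == 'REDUCE':          # pop a token that already has a head
            assert stack[-1] in has_head
            stack.pop()
    return arcs

# "the cat sleeps": arcs are added as soon as head and dependent are adjacent.
actions = iter(['SHIFT', 'LEFT', 'SHIFT', 'LEFT', 'SHIFT'])
print(parse_arc_eager(['the', 'cat', 'sleeps'], lambda s, b: next(actions)))
# [(1, 0), (2, 1)]
```

In the real parser the action at each step is predicted by LibSVM from features of the current stack and buffer configuration.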

MaltDiver

MaltDiver is a tool, developed in collaboration with Roberto Carlini, that visualizes the transitions performed by the transition-based parsers included in MaltParser and shows how the parsers interact with the sentences and the internal data structures (MaltDiver website).



I'm also collaborating on the following projects:

Mate parser

Bernd Bohnet and I implemented a feature selection algorithm for his joint transition-based parser. You can check the feature templates and parser models here. The results of these experiments were presented at the COLING 2014 conference in this paper.

ZPar

Yue Zhang and I are currently implementing new tools and features for his ZPar parser.