Lingfei Wu - Graph-to-Sequence Learning in Natural Language Processing

Start:

3/7/2019 at 3:30PM

End:

3/7/2019 at 4:45PM

Location:

126 DeBartolo

Host:

Collin McMillan

Email: collin.mcmillan@nd.edu
Phone: 574-631-1881
Website: http://www.nd.edu/~cmc/
Office: 352 Fitzpatrick Hall

Affiliations

Associate Professor, College of Engineering
Research interests: software engineering, maintenance, repository mining, and search

The celebrated Seq2Seq technique and its numerous variants achieve excellent performance on many tasks such as neural machine translation, natural language generation, speech recognition, and drug discovery. Despite their flexibility and expressive power, a significant limitation of Seq2Seq models is that they can only be applied to problems whose inputs are represented as sequences. However, sequences are among the simplest structured data, and many important problems are best expressed with a more complex structure, such as a graph. On one hand, graph-structured data can encode complicated pairwise relationships for learning more informative representations; on the other hand, structural and semantic information can be exploited to augment the original sequence data by incorporating domain-specific knowledge. To cope with complex graph-structured inputs, we propose Graph2Seq, a novel attention-based neural network architecture for graph-to-sequence learning. Graph2Seq can be viewed as a generalized Seq2Seq model for graph inputs: a general end-to-end neural encoder-decoder architecture that encodes an input graph and decodes the target sequence. In this talk, I will first introduce our Graph2Seq model and then discuss how to apply it to different NLP tasks. In particular, I will illustrate the advantages of Graph2Seq over various Seq2Seq and Tree2Seq models in two of our recent works: "Exploiting Rich Syntactic Information for Semantic Parsing with Graph-to-Sequence Model" (EMNLP 2018) and "SQL-to-Text Generation with Graph-to-Sequence Model" (EMNLP 2018).
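To make the encoder-decoder idea concrete, below is a minimal PyTorch sketch of graph-to-sequence learning. It is not the Graph2Seq architecture from the talk: the encoder mixes each node's state with the mean of its neighbors' states (a simplified one-layer message-passing scheme), and the decoder is a single GRU cell with dot-product attention over the node embeddings. All class names, dimensions, and the toy graph are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GraphEncoder(nn.Module):
        # Hypothetical encoder: refines node features over several hops by
        # mixing each node's state with the mean of its neighbors' states.
        def __init__(self, in_dim, hid_dim, hops=2):
            super().__init__()
            self.hops = hops
            self.proj = nn.Linear(in_dim, hid_dim)
            self.mix = nn.Linear(2 * hid_dim, hid_dim)

        def forward(self, feats, adj):
            # feats: (num_nodes, in_dim); adj: (num_nodes, num_nodes) 0/1 matrix
            h = torch.relu(self.proj(feats))
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
            for _ in range(self.hops):
                neigh = (adj @ h) / deg      # mean over each node's neighbors
                h = torch.relu(self.mix(torch.cat([h, neigh], dim=-1)))
            return h                         # (num_nodes, hid_dim) node embeddings

    class AttnDecoder(nn.Module):
        # Hypothetical decoder: a GRU cell that attends over the node
        # embeddings at every generation step.
        def __init__(self, vocab_size, hid_dim):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hid_dim)
            self.gru = nn.GRUCell(2 * hid_dim, hid_dim)
            self.out = nn.Linear(hid_dim, vocab_size)

        def step(self, tok, state, nodes):
            # Dot-product attention: score every node against the decoder state.
            weights = F.softmax(nodes @ state, dim=0)    # (num_nodes,)
            ctx = weights @ nodes                        # attention context vector
            inp = torch.cat([self.embed(tok), ctx]).unsqueeze(0)
            state = self.gru(inp, state.unsqueeze(0)).squeeze(0)
            return self.out(state), state                # logits, new state

    # Toy usage: encode a 4-node graph, then greedily decode three tokens.
    feats = torch.randn(4, 16)                           # random node features
    adj = torch.tensor([[0., 1., 1., 0.],
                        [1., 0., 0., 1.],
                        [1., 0., 0., 1.],
                        [0., 1., 1., 0.]])
    encoder = GraphEncoder(16, 32)
    decoder = AttnDecoder(vocab_size=100, hid_dim=32)
    nodes = encoder(feats, adj)
    state, tok = nodes.mean(dim=0), torch.tensor(0)      # graph-level init; <s> = 0
    for _ in range(3):
        logits, state = decoder.step(tok, state, nodes)
        tok = logits.argmax()

Initializing the decoder state with the mean of the node embeddings is one simple choice for a graph-level summary; the actual Graph2Seq model and its attention mechanism differ in the details, which the talk covers.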

Seminar Speaker:

Lingfei Wu

IBM Watson

Lingfei Wu is a Research Staff Member in the IBM AI Foundations Labs, Reasoning group, at the IBM T. J. Watson Research Center. He earned his Ph.D. in computer science from the College of William and Mary in August 2016, under the supervision of Prof. Andreas Stathopoulos. His research interests lie at the intersection of Machine Learning (Deep Learning), Representation Learning, and Natural Language Processing, with a particular emphasis on the fast-growing subject of Graph Neural Networks and their extensions to new application domains and tasks. Lingfei has published more than 30 top-ranked conference and journal papers, including but not limited to SysML, NIPS, KDD, ICDM, AISTATS, NAACL, EMNLP, AAAI, ICASSP, SC, SIAM Journal on Scientific Computing, IEEE Transactions on Big Data, and Journal of Computational Physics. He is also a co-inventor on more than 13 filed US patents. Lingfei served as a Tutorial Chair of IEEE BigData'18. In addition, he has regularly served as a TPC member of many major AI/ML/DL/DM/NLP conferences, including but not limited to IJCAI, AAAI, NIPS, ICML, ICLR, KDD, and ACL.