March 31 | 9:30 a.m. MDT
Md Athikul Islam – Computer Science
Online only: Please pre-register to attend via Zoom using this link.
Title: Augmenting Graph Construction and Representation Learning for Graph-Based Natural Language Processing Models
Abstract: Graph-based Natural Language Processing (NLP) models have grown popular due to their expressive power and ability to handle complex data structures. Conventional machine learning (ML) and deep learning (DL) models may lose important information when applied to complex graph-structured data. During initial preprocessing, graph-based models typically transform the text into a graph for an NLP task. Although it may seem redundant, graph construction is actually an important step, as it leverages the power of various types of graphs, including static and dynamic graphs. Practitioners have employed several machine learning models, particularly Graph Neural Networks (GNNs), to capture the complex dependencies in these constructed graphs. GNNs derive their expressiveness from their different aggregation, message-passing, and transformation techniques. The combination of graph construction and representation learning approaches has outperformed several baseline models on various NLP tasks. Although graph-based models have been successful in NLP, there are still areas where improvements can be made.
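The two stages the abstract describes, constructing a graph from text and then aggregating information over its edges, can be sketched minimally. The sliding-window co-occurrence graph and the single mean-aggregation layer below are illustrative assumptions for exposition, not the specific models presented in the talk:

```python
from collections import defaultdict

def build_cooccurrence_graph(tokens, window=2):
    """Graph construction: nodes are word types; an undirected edge links
    two words that appear within `window` positions of each other."""
    adj = defaultdict(set)
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            if tokens[j] != w:
                adj[w].add(tokens[j])
                adj[tokens[j]].add(w)
    return adj

def message_pass(adj, features):
    """One GNN-style layer: each node's new feature is the mean of its
    neighbors' features together with its own (mean aggregation + self loop)."""
    new = {}
    for node, nbrs in adj.items():
        agg = [features[n] for n in nbrs] + [features[node]]
        new[node] = sum(agg) / len(agg)
    return new

tokens = "graphs help models learn from graphs".split()
adj = build_cooccurrence_graph(tokens)
feats = {w: float(len(w)) for w in adj}   # toy scalar node features
feats = message_pass(adj, feats)
```

In practice the scalar features would be word embeddings, the mean would be replaced by a learned aggregation (e.g. attention or a weighted sum followed by a nonlinearity), and several such layers would be stacked so information propagates beyond immediate neighbors.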