GraphFlow+: Exploiting Conversation Flow in Conversational Machine Comprehension with Graph Neural Networks
Graphical Abstract
Abstract
The conversational machine comprehension (MC) task aims to answer questions over a multi-turn conversation about a single passage. However, recent approaches do not effectively exploit information from the conversation history, so coreferences and ellipses in the current question cannot be resolved. In addition, these methods do not consider the rich semantic relationships between words when reasoning over the passage text. In this paper, we propose a novel model, GraphFlow+, which constructs a context graph for each conversation turn and uses a recurrent graph neural network (GNN) to model the temporal dependencies between the context graphs of consecutive turns. Specifically, we explore three different ways of constructing text graphs: a dynamic graph, a static graph, and a hybrid graph that combines the two. Our experiments on the CoQA, QuAC, and DoQA benchmarks show that GraphFlow+ outperforms state-of-the-art approaches.
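The core idea described above, per-turn graph reasoning whose node states are carried forward across turns, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the single-layer message-passing step, the GRU-style update gate, and all weight names (`W_g`, `W_z`, `U_z`) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gnn_layer(H, A, W_g):
    # One round of message passing: aggregate neighbor states via the
    # (row-normalized) adjacency matrix A, then apply a linear transform.
    return np.tanh(A @ H @ W_g)

def recurrent_graph_flow(adjacency_per_turn, X, W_g, W_z, U_z):
    # Process a sequence of per-turn context graphs. Each turn's node
    # states are initialized from the previous turn's output, so
    # information "flows" through the conversation history.
    H = X  # initial node (word) representations, shape (n_nodes, d)
    outputs = []
    for A in adjacency_per_turn:
        M = gnn_layer(H, A, W_g)                     # within-turn graph reasoning
        z = 1.0 / (1.0 + np.exp(-(H @ W_z + M @ U_z)))  # update gate (assumed GRU-style)
        H = z * H + (1.0 - z) * M                    # temporal fusion across turns
        outputs.append(H)
    return outputs
```

A usage sketch: with `n_nodes=5` passage words, hidden size `d=4`, and three conversation turns, `recurrent_graph_flow` returns one `(5, 4)` node-state matrix per turn, each conditioned on all earlier turns.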