BERT-based Semantic Query Graph Extraction for Knowledge Graph Question Answering

Authors
Liang, Zhicheng
Peng, Zixuan
Yang, Xuefeng
Zhao, Fubang
Liu, Yunfeng
McGuinness, Deborah L.
Issue Date
2021
Terms of Use
Attribution-NonCommercial-NoDerivs 3.0 United States
Full Citation
Zhicheng Liang, Zixuan Peng, Xuefeng Yang, Fubang Zhao, Yunfeng Liu, Deborah L. McGuinness (2021). BERT-based Semantic Query Graph Extraction for Knowledge Graph Question Answering. International Semantic Web Conference, October 2021.
Abstract
Answering complex questions involving multiple entities and relations remains a challenging Knowledge Graph Question Answering (KGQA) task. To extract a Semantic Query Graph (SQG), we propose a BERT-based decoder that jointly performs multiple tasks for SQG construction: entity detection, relation prediction, output variable selection, query type classification, and ordinal constraint detection. The outputs of our model can be seamlessly integrated with downstream components (e.g., entity linking) of a KGQA pipeline to construct a formal query. Our experiments show that the proposed BERT-based semantic query graph extractor outperforms traditional recurrent neural network-based extractors. Moreover, the KGQA pipeline built on our model outperforms baseline approaches on two benchmark datasets of complex questions (LC-QuAD, WebQSP).
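The joint multi-task setup described in the abstract can be illustrated with a minimal sketch: a shared encoder representation feeds several task-specific heads, where token-level tasks (e.g., entity detection) tag each question token and sentence-level tasks (e.g., query type classification) read a pooled [CLS]-style vector. This is a toy illustration with random weights standing in for a trained BERT encoder; the head names, label sets, and dimensions are assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 8  # toy hidden size standing in for BERT's 768

def linear_head(num_labels):
    # Hypothetical task head: a single linear projection over encoder states.
    W = rng.normal(size=(HIDDEN, num_labels))
    return lambda h: h @ W

# Toy stand-in for BERT output: one pooled [CLS] vector plus one
# contextual vector per question token.
tokens = ["which", "river", "flows", "through", "berlin", "?"]
cls_vec = rng.normal(size=HIDDEN)
token_vecs = rng.normal(size=(len(tokens), HIDDEN))

# Token-level heads tag every token; sentence-level heads classify
# the whole question from the pooled vector.
entity_head = linear_head(3)      # e.g., B/I/O entity tags (assumed label set)
query_type_head = linear_head(4)  # e.g., SELECT/ASK/COUNT/... (assumed)

entity_tags = entity_head(token_vecs).argmax(axis=-1)  # one tag per token
query_type = int(query_type_head(cls_vec).argmax())    # one label per question
```

In a real joint model the heads would share the fine-tuned encoder and be trained with a summed per-task loss, so that all five SQG subtasks are predicted in a single forward pass.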
Publisher
CEUR-WS