KR2020: Proceedings of the 17th International Conference on Principles of Knowledge Representation and Reasoning

Rhodes, Greece. September 12-18, 2020.

ISSN: 2334-1033
ISBN: 978-0-9992411-7-2

Copyright © 2020 International Joint Conferences on Artificial Intelligence Organization

Seq2KG: An End-to-End Neural Model for Domain Agnostic Knowledge Graph (not Text Graph) Construction from Text

  1. Michael Stewart (The University of Western Australia)
  2. Wei Liu (The University of Western Australia)

Keywords

  1. Knowledge graphs, virtual knowledge graphs, and open linked data - General

Abstract

Knowledge Graph Construction (KGC) from text unlocks the information held within unstructured text and is critical to a wide range of downstream applications. General approaches to KGC from text rely heavily on existing knowledge bases, yet most domains have no external knowledge base readily available. In many situations this leads to information loss, as a wealth of key information is held within "non-entities". Domain-specific approaches to KGC typically adopt unsupervised pipelines that use carefully crafted linguistic and statistical patterns to extract co-occurring noun phrases as triples, essentially constructing text graphs rather than true knowledge graphs. In this research, for the first time, in the same spirit as Collobert et al.'s seminal 2011 work "Natural language processing (almost) from scratch", we propose a Seq2KG model that attempts "knowledge graph construction (almost) from scratch". The end-to-end Sequence to Knowledge Graph (Seq2KG) neural model jointly learns to generate triples and to resolve entity types as a multi-label classification task. In addition, we develop a novel evaluation metric that takes both semantic and structural closeness into account when measuring the performance of triple extraction. We show that our end-to-end Seq2KG model performs on par with a state-of-the-art rule-based system that outperformed other neural models and won first prize in the first Knowledge Graph Contest in 2019. A new annotation scheme and three high-quality, manually annotated datasets are made available to help promote this direction of research.
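
The abstract only describes the triple evaluation metric at a high level, so the sketch below is a minimal, hypothetical illustration of how a score combining semantic and structural closeness between predicted and gold (head, relation, tail) triples might be computed. The function names (slot_similarity, triple_score, graph_score), the string-similarity stand-in for semantic closeness, the equal 50/50 weighting, and the greedy matching are all assumptions for illustration and are not the authors' actual metric.

from difflib import SequenceMatcher


def slot_similarity(a, b):
    # Character-level similarity between two triple slots; a simple
    # stand-in for the semantic-closeness component (assumption).
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def triple_score(pred, gold):
    # pred and gold are (head, relation, tail) string triples.
    # Semantic closeness: average slot-wise similarity (stand-in).
    # Structural closeness: fraction of slots that match exactly in position.
    sem = sum(slot_similarity(p, g) for p, g in zip(pred, gold)) / 3.0
    struct = sum(p.lower() == g.lower() for p, g in zip(pred, gold)) / 3.0
    return 0.5 * sem + 0.5 * struct  # equal weighting is an assumption


def graph_score(predicted, gold):
    # Greedily pair each predicted triple with its closest unmatched gold
    # triple and average over the gold set (a simplification).
    remaining = list(gold)
    total = 0.0
    for p in predicted:
        if not remaining:
            break
        best = max(remaining, key=lambda g: triple_score(p, g))
        total += triple_score(p, best)
        remaining.remove(best)
    return total / max(len(gold), 1)


# Example: a near-match is rewarded even though the surface forms differ.
print(graph_score([("acme corp", "acquired", "widgets ltd")],
                  [("Acme Corp", "acquired", "Widgets Ltd.")]))

The point of such a metric, as opposed to exact-match precision and recall, is that a predicted triple whose wording differs slightly from the gold triple still receives partial credit, which matters when triples are generated from open text rather than linked to a fixed knowledge base.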