DOI: https://doi.org/10.15368/theses.2021.18
Available at: https://digitalcommons.calpoly.edu/theses/2265
Date of Award
3-2021
Degree Name
MS in Computer Science
Department/Program
Computer Science
College
College of Engineering
Advisor
Franz Kurfess
Advisor Department
Computer Science
Advisor College
College of Engineering
Abstract
Knowledge graphs are a powerful concept in machine learning, as they hold structured information in the form of entities and the relations between them. Despite the valuable applications of such graphs, most knowledge bases remain incomplete. This missing information harms downstream applications such as information retrieval and opens a window for research in statistical relational learning tasks such as node classification and link prediction. This work proposes a deep learning framework based on existing relational graph convolutional (R-GCN) layers to learn from the highly multi-relational data characteristic of realistic knowledge graphs for node property classification tasks. We propose a deep and improved variant, Deep-RGCNs, with dense and residual skip connections between layers. These skip connections are known to be very successful in popular deep CNN architectures such as ResNet and DenseNet. In our experiments, we investigate and compare the performance of Deep-RGCN against different baselines on the multi-relational graph benchmark datasets AIFB and MUTAG, and show how the deep architecture boosts performance on the task of node property classification. We also study the training performance of Deep-RGCNs (with N layers) and discuss the vanishing gradient and over-smoothing problems common to deeper GCN architectures.
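The building block described in the abstract is the R-GCN layer, stacked with skip connections between layers. The sketch below shows one way such a residual stack could be assembled in PyTorch. It is a minimal illustration under stated assumptions, not the thesis implementation: the class names (RGCNLayer, DeepRGCN), the dense per-relation adjacency input, and the six-layer default are all hypothetical choices made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RGCNLayer(nn.Module):
    """Minimal relational graph convolution: one weight matrix per relation
    plus a self-loop transform (following Schlichtkrull et al., 2018)."""
    def __init__(self, in_dim, out_dim, num_rels):
        super().__init__()
        self.rel_weights = nn.Parameter(torch.empty(num_rels, in_dim, out_dim))
        self.self_weight = nn.Linear(in_dim, out_dim, bias=False)
        nn.init.xavier_uniform_(self.rel_weights)

    def forward(self, h, adjs):
        # h: (N, in_dim) node features; adjs: (num_rels, N, N) row-normalized
        # adjacency, one slice per relation (an assumption of this sketch).
        msgs = torch.einsum('rij,jd,rdo->io', adjs, h, self.rel_weights)
        return msgs + self.self_weight(h)

class DeepRGCN(nn.Module):
    """Hypothetical deep stack of R-GCN layers with residual skip connections."""
    def __init__(self, in_dim, hid_dim, num_classes, num_rels, num_layers=6):
        super().__init__()
        self.input_layer = RGCNLayer(in_dim, hid_dim, num_rels)
        self.hidden = nn.ModuleList(
            [RGCNLayer(hid_dim, hid_dim, num_rels) for _ in range(num_layers - 2)])
        self.output_layer = RGCNLayer(hid_dim, num_classes, num_rels)

    def forward(self, h, adjs):
        h = F.relu(self.input_layer(h, adjs))
        for layer in self.hidden:
            # Residual skip: add the layer input back, ResNet-style, so gradients
            # can bypass each relational convolution in a deep stack.
            h = F.relu(layer(h, adjs)) + h
        return self.output_layer(h, adjs)  # per-node class logits
```

For node property classification on benchmarks such as AIFB or MUTAG, the input features h would typically be featureless one-hot node embeddings and the output logits would feed a cross-entropy loss over the labeled nodes. A dense (DenseNet-style) variant would instead concatenate the outputs of earlier layers rather than summing them, at the cost of growing feature width.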
Included in
Artificial Intelligence and Robotics Commons, Categorical Data Analysis Commons, Data Science Commons