KR 2025: Proceedings of the 22nd International Conference on Principles of Knowledge Representation and Reasoning

Melbourne, Australia. November 11-17, 2025.

ISSN: 2334-1033
ISBN: 978-1-956792-08-9

Copyright © 2025 International Joint Conferences on Artificial Intelligence Organization

Halting Recurrent GNNs and the Graded mu-Calculus

  1. Jeroen Bollen (Data Science Institute, Hasselt University, Belgium)
  2. Jan Van den Bussche (Data Science Institute, Hasselt University, Belgium)
  3. Stijn Vansummeren (Data Science Institute, Hasselt University, Belgium)
  4. Jonni Virtema (School of Computer Science, University of Sheffield, UK)

Keywords

  1. Recurrent Graph Neural Networks
  2. Graded Bisimulation
  3. Modal Mu-Calculus

Abstract

Graph Neural Networks (GNNs) are a class of machine-learning models that operate on graph-structured data. Their expressive power is intimately related to logics that are invariant under graded bisimilarity. Current proposals for recurrent GNNs either assume that the graph size is given to the model or suffer from a lack of termination guarantees. In this paper, we propose a halting mechanism for recurrent GNNs. We prove that our halting model can express all node classifiers definable in the graded modal mu-calculus, even for the standard GNN variant that is oblivious to the graph size. To prove our main result, we develop a new approximate semantics for the graded mu-calculus, which we believe to be of independent interest. We leverage this new semantics to obtain a new model-checking algorithm, called the counting algorithm, which is likewise oblivious to the graph size. In a final step, we show that the counting algorithm can be implemented on a halting recurrent GNN.
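
To fix ideas, one common presentation of the syntax of the graded modal mu-calculus (in positive normal form; the paper's exact conventions may differ) is

\[
\varphi \;::=\; p \mid \neg p \mid \varphi \wedge \varphi \mid \varphi \vee \varphi \mid X \mid \langle k \rangle\,\varphi \mid [k]\,\varphi \mid \mu X.\,\varphi \mid \nu X.\,\varphi,
\]

where, under the usual graded reading, \(\langle k \rangle\,\varphi\) holds at a node with more than \(k\) successors satisfying \(\varphi\), and \([k]\,\varphi\) holds when all but at most \(k\) successors satisfy \(\varphi\). For instance, \(\mu X.\,(p \vee \langle 1 \rangle X)\) asserts, as a least fixpoint, that either \(p\) holds at the current node or at least two successors again satisfy the formula.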
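
For intuition only, below is a minimal sketch of a recurrent message-passing model with a per-node halting flag. This is not the authors' construction: the update rule, the halting test, and all names (step, run, halted) are illustrative placeholders.

from typing import Dict, List, Tuple

Graph = Dict[int, List[int]]  # node -> list of successor nodes

def step(state: float, neighbour_states: List[float]) -> Tuple[float, bool]:
    """One update round for a single node; returns (new state, halt flag).
    Placeholder rule: blend the node's state with the neighbour average
    and halt once the state change falls below a fixed threshold."""
    agg = sum(neighbour_states) / len(neighbour_states) if neighbour_states else 0.0
    new_state = 0.5 * state + 0.5 * agg
    return new_state, abs(new_state - state) < 1e-6

def run(graph: Graph, init: Dict[int, float], max_rounds: int = 10_000) -> Dict[int, float]:
    """Iterate message passing until every node has raised its halting flag."""
    state = dict(init)
    halted = {v: False for v in graph}
    for _ in range(max_rounds):
        if all(halted.values()):
            break  # global termination: every node has halted
        new_state = {}
        for v, succs in graph.items():
            if halted[v]:
                new_state[v] = state[v]  # halted nodes keep their state
            else:
                new_state[v], halted[v] = step(state[v], [state[u] for u in succs])
        state = new_state
    return state

# Example: a directed 3-cycle; states converge and every node halts.
g = {0: [1], 1: [2], 2: [0]}
print(run(g, {0: 1.0, 1: 0.0, 2: 0.0}))

Note that the loop never consults the number of nodes; termination comes solely from the per-node flags, which loosely mirrors the size-obliviousness the abstract refers to.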