Foundations of Graph Neural Networks
Egor Kostylev, University of Oslo

Graph Neural Networks (GNNs) are a recent family of neural architectures that are naturally suited to learning functions on graphs. They are now used in a wide range of applications. It has been observed that GNNs share many similarities with classical computer science (CS) formalisms, such as the Weisfeiler-Leman graph isomorphism test, bisimulation, and logic. Most notably, both GNNs and these formalisms deal with functions on graphs and graph-like structures. This observation opens up an opportunity to compare GNN architectures with these formalisms in terms of different kinds of expressibility, thus positioning these architectures within the well-established landscape of theoretical CS. This, in turn, helps us better understand the fundamental capabilities and limitations of various GNN architectures, enabling more informed choices about which architecture to use—if any at all.
In these lectures, I will give an introduction to the state-of-the-art foundations of GNNs—specifically, our current understanding of their expressibility in terms of the classical formalisms, considering several notions of expressive power. We will begin with a gentle introduction to GNNs as mathematical objects, including their classification and basic properties. We will see that they are actually a family of architectures of a hybrid nature: they have both neural—that is, numeric—and structural—that is, symbolic—components, with an interplay between them. Then, we will turn our attention to the different notions of expressive power. We will start with the historically first and, in some sense, weakest notion—non-uniform distinguishing power—where the yardstick is the Weisfeiler-Leman graph isomorphism test. We will then move to a stronger notion of uniform expressive power, comparing GNNs with a special type of bisimulation and with various logics, such as Graded Modal Logic and its extensions. Finally, we will look at other types of expressivity, including approximation power and expressivity over restricted classes of graphs. This is an active area of research, and the overall picture is far from complete. Thus, towards the end, we will discuss promising directions and open questions related to the foundations of GNNs.
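To make the hybrid aggregate-and-combine structure concrete, here is a minimal, illustrative Python sketch (not material from the lectures; all function names are hypothetical) that places 1-WL colour refinement next to a single message-passing GNN layer, so the structural parallel (aggregate over neighbours, then combine with the node's own state) is visible directly in code.

from collections import Counter

def wl_refine(adj, rounds):
    # One-dimensional Weisfeiler-Leman (colour refinement) on a graph
    # given as an adjacency list {node: [neighbours]}. If two graphs end
    # up with different colour multisets, they are certainly
    # non-isomorphic; the converse does not hold.
    colour = {v: 0 for v in adj}  # uniform initial colouring
    for _ in range(rounds):
        # New colour = old colour plus the multiset of neighbour
        # colours, compressed back into small integer ids.
        sig = {v: (colour[v],
                   tuple(sorted(Counter(colour[u] for u in adj[v]).items())))
               for v in adj}
        ids = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colour = {v: ids[sig[v]] for v in adj}
    return colour

def gnn_layer(feat, adj, w_self, w_neigh):
    # One message-passing layer with sum aggregation: the numeric
    # counterpart of the WL update above. feat maps each node to a
    # feature vector; w_self and w_neigh are weight matrices given as
    # lists of rows.
    def matvec(w, x):
        return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]
    out = {}
    for v in adj:
        agg = [0.0] * len(feat[v])
        for u in adj[v]:                      # aggregate neighbour messages
            agg = [a + f for a, f in zip(agg, feat[u])]
        comb = [s + n for s, n in zip(matvec(w_self, feat[v]),
                                      matvec(w_neigh, agg))]
        out[v] = [max(0.0, x) for x in comb]  # combine + ReLU
    return out

Replacing the discrete relabelling of WL with learned linear maps and a nonlinearity is, informally, why the distinguishing power of such layers is bounded by 1-WL, the yardstick mentioned above.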
About the lecturer: Prof. Egor V. Kostylev (Department of Informatics, University of Oslo, Norway) obtained his Specialist (2004) and PhD (2009) degrees from the Faculty of Computational Mathematics and Cybernetics of Lomonosov Moscow State University, in the area of Discrete Mathematics and Mathematical Cybernetics. Before starting his current position, he was a researcher at the Laboratory for Foundations of Computer Science, School of Informatics, University of Edinburgh, UK (2010-2013), and a Departmental Lecturer at the Department of Computer Science, University of Oxford, UK (2013-2020). Since 2025, he has also been a research theme leader at the Norwegian Centre for Knowledge-driven Machine Learning (Integreat). His research interests include the theory of AI, in particular knowledge representation and reasoning (including knowledge graphs and ontologies), and its connections to neural networks (including graph NNs), databases, and semantic technologies. He has authored more than 80 scientific papers published in top journals and conferences in the field, two of which received best paper awards (ICDT’16 and IJCAI’17).
Location and schedule: