Open access
Date: 2024
Type: Doctoral Thesis
ETH Bibliography: yes
Abstract
This thesis studies improvements for message-passing Graph Neural Networks (GNNs) from two angles. First, we tackle the problem of understanding how GNNs reason. We quantify the extent to which GNN architectures achieve the combined reasoning over features and edges that GNNs uniquely offer. We then present two methods to explain GNN predictions. The first explains graphs by contrasting them with similar examples. The second uses decision trees to build recipes that identify the important inputs and describe how the GNN uses them. We close the understanding chapter by examining current explanation datasets. We discover several problems with these datasets that can lead to wrong conclusions, and instead propose three new datasets to evaluate explanation methods.
For the second angle, we investigate the limits of message passing. We identify that the message aggregation step can lose substantial information, contributing to problems identified in past research, such as oversmoothing, underreaching, and the $1$-WL expressiveness limit. We propose a new architecture that avoids aggregation entirely and processes individual messages asynchronously. We also identify a weakness of neural architectures in learning comparisons between numbers and propose a new architecture well-aligned with learning comparisons. For example, this architecture can be used in a GNN setting to learn shortest paths.
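A minimal sketch (not from the thesis, a generic illustration) of why aggregation can lose information: standard GNN aggregators such as sum or mean are not injective on the multiset of neighbor features, so distinct neighborhoods can collapse to the same aggregated message — the intuition behind the $1$-WL expressiveness limit mentioned above.

```python
def aggregate_sum(neighbor_feats):
    """Sum aggregation over a multiset of scalar neighbor features."""
    return sum(neighbor_feats)

# Two different neighborhoods (hypothetical scalar features)...
a = [1.0, 3.0]   # neighbors with features 1 and 3
b = [2.0, 2.0]   # neighbors with features 2 and 2

# ...produce identical messages, so the receiving node cannot
# distinguish them after aggregation.
print(aggregate_sum(a) == aggregate_sum(b))  # True
```

Processing messages individually, rather than through such a many-to-one reduction, is one way to avoid this collapse.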
Permanent link: https://doi.org/10.3929/ethz-b-000671419
Publication status: published
External links: Search print copy at ETH Library
Publisher: ETH Zurich
Subject: Graph Neural Networks (GNNs)
Organisational unit: 03604 - Wattenhofer, Roger / Wattenhofer, Roger