Rethinking the Expressive Power of GNNs via Graph Biconnectivity
ICLR 2023
Abstract
Designing expressive Graph Neural Networks (GNNs) is a central topic in
learning graph-structured data. While numerous approaches have been proposed to
improve GNNs in terms of the Weisfeiler-Lehman (WL) test, there is still a
general lack of deep understanding of what additional power they can
systematically and provably gain. In this paper, we take a fundamentally
different perspective to study the expressive power of GNNs beyond the WL test.
Specifically, we introduce a novel class of expressivity metrics via graph
biconnectivity and highlight their importance in both theory and practice. As
biconnectivity can be easily computed by simple algorithms with linear
computational cost, it is natural to expect that popular GNNs can learn
it easily as well. However, after a thorough review of prior GNN architectures,
we surprisingly find that most of them are not expressive for any of these
metrics. The only exception is the ESAN framework, for which we give a
theoretical justification of its power. We proceed to introduce a principled
and more efficient approach, called the Generalized Distance Weisfeiler-Lehman
(GD-WL), which is provably expressive for all biconnectivity metrics.
Practically, we show GD-WL can be implemented by a Transformer-like
architecture that preserves expressiveness and enjoys full parallelizability. A
set of experiments on both synthetic and real datasets demonstrates that our
approach can consistently outperform prior GNN architectures.
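
The abstract's claim that biconnectivity can be computed "by simple algorithms with linear computational cost" refers to the classical fact that cut vertices and bridges (which determine the vertex- and edge-biconnected components) fall out of a single depth-first search in O(|V| + |E|) time via Tarjan-style low-link values. The sketch below is purely illustrative and not code from the paper; all function and variable names are our own, and it assumes a simple undirected graph.

```python
from collections import defaultdict

def cut_vertices_and_bridges(n, edges):
    """Find the cut vertices and bridges of a simple undirected graph with
    nodes 0..n-1 in O(n + len(edges)) time, using one DFS pass that tracks
    Tarjan-style low-link values (the lowest discovery time reachable from
    a subtree through at most one back edge)."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    disc = [0] * n          # discovery times; 0 means "not yet visited"
    low = [0] * n           # low-link values
    cut, bridges = set(), []
    timer = 1

    for root in range(n):   # handle every connected component
        if disc[root]:
            continue
        disc[root] = low[root] = timer
        timer += 1
        root_children = 0
        # Iterative DFS: each frame keeps a resumable neighbor iterator.
        stack = [(root, -1, iter(adj[root]))]
        while stack:
            v, parent, neighbors = stack[-1]
            pushed = False
            for u in neighbors:
                if u == parent:
                    continue                       # skip the tree edge we came in on
                if disc[u]:
                    low[v] = min(low[v], disc[u])  # back edge to an ancestor
                else:
                    disc[u] = low[u] = timer
                    timer += 1
                    stack.append((u, v, iter(adj[u])))
                    pushed = True
                    break
            if pushed:
                continue
            stack.pop()                            # v's subtree is fully explored
            if parent != -1:
                low[parent] = min(low[parent], low[v])
                if parent == root:
                    root_children += 1
                elif low[v] >= disc[parent]:
                    cut.add(parent)                # removing parent disconnects v's subtree
                if low[v] > disc[parent]:
                    bridges.append((parent, v))    # no back edge bypasses this tree edge
        if root_children >= 2:
            cut.add(root)                          # the root is a cut vertex iff it has >= 2 DFS children
    return cut, bridges
```

For instance, on a triangle with a pendant node attached at vertex 2, `cut_vertices_and_bridges(4, [(0, 1), (1, 2), (2, 0), (2, 3)])` returns `({2}, [(2, 3)])`: vertex 2 is the only cut vertex and (2, 3) the only bridge.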
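
For concreteness, the GD-WL test named in the abstract refines each node's color from the multiset of (distance, color) pairs taken over all nodes, χ_v^t = hash{{(d_G(v, u), χ_u^{t-1}) : u ∈ V}}, for a chosen generalized distance d_G. Below is a minimal sketch of this refinement with shortest-path distance as d_G; the helper names are our own, and the injective "hash" is realized by re-indexing signatures into compact integers.

```python
from collections import deque

def bfs_distances(n, adj, source):
    """Unweighted shortest-path distances from source via BFS; unreachable = inf."""
    dist = [float("inf")] * n
    dist[source] = 0
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if dist[u] == float("inf"):
                dist[u] = dist[v] + 1
                queue.append(u)
    return dist

def gd_wl_colors(n, edges, rounds=3):
    """Sketch of GD-WL color refinement: in each round, a node's new color
    is an (injective) hash of the multiset of (distance, old color) pairs
    over ALL nodes, here with shortest-path distance as d_G."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    dist = [bfs_distances(n, adj, v) for v in range(n)]

    colors = [0] * n  # uniform initial coloring
    for _ in range(rounds):
        signatures = [
            tuple(sorted((dist[v][u], colors[u]) for u in range(n)))
            for v in range(n)
        ]
        # Re-index signatures to compact integers: an injective "hash".
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures)))}
        colors = [palette[sig] for sig in signatures]
    return colors
```

Two graphs are distinguished when their final color multisets differ. With shortest-path distance as above, the resulting test is the paper's SPD-WL, which it shows suffices for the edge-biconnectivity metrics; instantiating d_G with resistance distance instead handles the vertex-biconnectivity ones.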
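
As for the "Transformer-like architecture": one natural way to carry the GD-WL distance information into attention is to bias every attention logit by a learned function of the pairwise distance, so that each node aggregates over all others in a single parallel step. The sketch below is our illustrative reading, not the paper's exact layer; the single-head setup, the clipping convention, and the name bias_table are all assumptions.

```python
import numpy as np

def distance_biased_attention(H, D, Wq, Wk, Wv, bias_table):
    """Single-head self-attention over node features H (n x d), with the
    attention logits additively biased by a learned scalar per pairwise
    distance. D is an (n x n) integer distance matrix (e.g. shortest-path
    distances); bias_table[k] is the learned bias for distance k, and
    larger distances are clipped into the table's range (our convention)."""
    n, d = H.shape
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    logits = (Q @ K.T) / np.sqrt(d)
    idx = np.clip(D, 0, len(bias_table) - 1).astype(int)
    logits = logits + bias_table[idx]             # inject distance information
    logits -= logits.max(axis=-1, keepdims=True)  # numerically stable softmax
    attn = np.exp(logits)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ V                               # (n x d) updated node features
```

Since every node attends to every other node through one matrix product, the layer is trivially parallel over node pairs, consistent with the abstract's claim of full parallelizability.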