Graph neural networks (GNNs) have traditionally been divided into message passing neural networks (MPNNs) and spectral graph networks, reflecting different traditions in machine learning and signal processing. Antonis Vasileiou of RWTH Aachen University, Juan Cervino of the Massachusetts Institute of Technology, and Pascal Frossard of École Polytechnique Fédérale de Lausanne, together with Kanatsoulis and colleagues, argue that this separation has little basis and is hindering progress in the field. Their work presents a new perspective that frames both MPNNs and spectral GNNs as different parameterizations of permutation equivariant operators applied to graph signals. This reveals that many existing architectures have comparable expressive power, with significant differences emerging only under certain conditions. The researchers also point out that MPNNs excel at analyzing discrete structure and expressiveness, whereas spectral GNNs provide tools for understanding smoothing and stability, highlighting the complementary strengths of each approach. Ultimately, the study proposes a unified theoretical framework to accelerate progress in graph learning by recognizing the fundamental connection between these two GNN paradigms.
Message passing and spectral methods are equivalent when viewed as permutation equivariant operators on graph signals.
Researchers have uncovered a unifying perspective on graph neural networks, demonstrating that two commonly separated approaches, message passing and spectral techniques, are fundamentally connected. The study argues that the perceived gap between these two traditions is largely artificial and hinders progress in the field of graph learning.
By framing both message-passing neural networks and spectral graph neural networks as different parameterizations of permutation equivariant operators acting on graph signals, the researchers found that their expressive power is surprisingly comparable. This perspective suggests that many common architectures are more similar than previously thought, and that true differences emerge only under certain conditions.
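To make this concrete, permutation equivariance can be stated as a single identity (the notation here is standard and introduced by us for illustration: A is an adjacency matrix, X a node-feature matrix, P a permutation matrix, and f the GNN layer or filter):

```latex
% Relabeling the input nodes by P relabels the output in the same way.
f\left(P A P^{\top},\; P X\right) = P\, f(A, X)
\qquad \text{for every } n \times n \text{ permutation matrix } P .
```

Both a message passing layer and a spectral filter satisfy this identity, which is what allows the two families to be compared on equal footing.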
This study highlights that message-passing networks are excellent at capturing discrete structure and facilitate expressiveness analysis with tools borrowed from logic and graph isomorphism research. Conversely, spectral graph neural networks provide a principled method for understanding important graph properties such as smoothing, bottlenecks, stability, and community structure.
These complementary strengths suggest that progress in graph learning will be accelerated by recognizing the similarities and differences between the two approaches and promoting a unified theoretical framework. Rather than treating them as competing paradigms, the study advocates a coherent understanding of their common foundations.
Specifically, the study shows that the aggregation of information in a message passing layer can often be interpreted as the local action of a graph shift operator. Similarly, the global action of a spectral filter, which lies at the heart of a spectral GNN, can often be implemented or approximated by a sufficiently deep message passing network.
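As a concrete illustration of this equivalence, the minimal NumPy sketch below (our own toy example; the shift operator, filter taps w, and 4-cycle graph are assumptions, not taken from the paper) evaluates a polynomial filter h(S) = w_0 I + w_1 S + w_2 S^2 once as repeated one-hop message passing and once as an explicit matrix polynomial, and checks that the results coincide:

```python
# A degree-K polynomial spectral filter h(S) X = sum_k w[k] * S^k X,
# computed via K repeated multiplications by the graph shift operator S:
# each multiplication is one round of linear one-hop message passing.
import numpy as np

def polynomial_filter(S: np.ndarray, X: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Apply h(S) X by iterated one-hop shifts (message-passing view)."""
    Z = X.copy()                       # S^0 X
    out = w[0] * Z
    for k in range(1, len(w)):
        Z = S @ Z                      # one message-passing step: aggregate neighbors
        out += w[k] * Z
    return out

# Toy graph: a 4-cycle, with the random-walk normalized adjacency as shift operator.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
S = A / A.sum(axis=1, keepdims=True)
X = np.random.default_rng(0).normal(size=(4, 2))   # random node features
w = np.array([0.5, 0.3, 0.2])                      # illustrative filter taps

Y_mp = polynomial_filter(S, X, w)                  # message-passing view
H = w[0] * np.eye(4) + w[1] * S + w[2] * (S @ S)   # spectral/polynomial view
assert np.allclose(Y_mp, H @ X)                    # the two views coincide
```

Truncated polynomial parameterizations of this kind (Chebyshev filters being a well-known example) are precisely how a finite-depth message passing network approximates a global spectral response.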
This finding highlights the potential for cross-pollination of ideas and techniques between the two communities. The study further proposes a clear separation between spectral filtering and spectral positional encoding, suggesting that the latter should be treated as a design choice that can be applied to both types of GNNs and may improve performance.
Ultimately, this position paper argues for a conceptual shift in the way graph neural networks are understood and developed. By providing a unified language and clarifying the relationship between message passing and spectral methods, the researchers aim to lead the field toward more principled and effective approaches to learning from graph data, with implications for applications as diverse as optimization and weather forecasting. This study establishes the foundation for future research focused on leveraging the strengths of both paradigms in combination.
Permutation equivariance and the expressive power of graph neural networks
Researchers argue that the distinction between message-passing graph neural networks (MPNNs) and spectral graph neural networks (spectral GNNs) is largely artificial and hinders progress in the graph learning field. The study frames both MPNNs and spectral GNNs as different parameterizations of permutation equivariant operators operating on graph signals, establishing a unifying perspective.
The analysis reveals that many commonly used architectures exhibit comparable expressiveness, with real differences emerging only under certain conditions. The study highlights the complementary strengths of each approach, noting that MPNNs provide a natural framework for analyzing discrete structure and expressiveness using tools from logic and graph isomorphism research.
Conversely, the spectral perspective provides a principled method for understanding smoothing effects, identifying bottlenecks, assessing stability, and characterizing community structure within graphs. The analysis includes a theoretical re-examination of existing architectures, demonstrating their relationships through the lens of permutation equivariant operators.
Additionally, the study highlights the potential for accelerating progress by recognizing the important similarities and differences between these two GNN types. Rather than treating them as competing paradigms, it aims to foster innovation in graph learning by proposing a unified theoretical and conceptual framework. The paper does not introduce a new architecture; instead, it provides a new lens for interpreting and comparing existing methods, ultimately promoting a more coherent understanding of the field.
Permutation equivariance unifies message passing and spectral graph neural networks
The researchers detail a framework that treats both message passing neural networks and spectral graph neural networks as parameterizations of permutation equivariant operators on graph signals. The work shows that many common architectures are comparably expressive and that true differences arise only in specific operational regimes.
Message-passing neural networks leverage tools from logic and graph isomorphism research, providing a natural language for analyzing discrete structure and expressiveness. This yields a principled understanding of the structural properties these networks can and cannot capture, and has led to more expressive architectures based on k-dimensional Weisfeiler-Leman algorithms and subgraph counting.
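To make this expressiveness yardstick concrete, the sketch below implements the classical 1-dimensional Weisfeiler-Leman colour refinement test (our own minimal implementation; standard MPNNs are at most as powerful as this test at distinguishing graphs):

```python
# 1-WL colour refinement: two nodes keep the same colour only as long as
# their multisets of neighbour colours keep matching.
from collections import Counter

def wl_refine(adj: dict[int, list[int]], rounds: int = 10) -> dict[int, int]:
    """adj maps node -> neighbour list; returns stabilized node colours."""
    colors = {v: 0 for v in adj}                 # start with a uniform colouring
    for _ in range(rounds):
        sigs = {
            v: (colors[v], tuple(sorted(Counter(colors[u] for u in adj[v]).items())))
            for v in adj
        }
        # Compress each (colour, neighbour-multiset) signature into a new colour id.
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: palette[sigs[v]] for v in adj}
        if new_colors == colors:                 # refinement has stabilized
            break
        colors = new_colors
    return colors

# A triangle and a 3-node path produce different colour histograms,
# so 1-WL (and hence any standard MPNN) distinguishes them.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path3 = {0: [1], 1: [0, 2], 2: [1]}
print(sorted(Counter(wl_refine(triangle).values()).values()))  # [3]
print(sorted(Counter(wl_refine(path3).values()).values()))     # [1, 2]
```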
Message-passing neural networks also extend seamlessly to directed graphs and link prediction tasks through well-understood constructions, in contrast to the more cumbersome spectral GNN designs developed for similar applications. Additionally, the local update mechanism of message-passing neural networks aligns closely with established graph algorithms and has demonstrated the ability to represent and learn algorithms from data, such as solving linear programs and approximating hard maximum constraint satisfaction problems.
From a systems perspective, the explicit aggregation and update structure of message passing neural networks facilitates large-scale distributed graph learning, especially in databases and recommender systems. Spectral graph neural networks, in turn, provide a distinctive abstraction for understanding information smoothing, bottlenecks, and community structure in graphs.
Oversmoothing, observed as the homogenization of node embeddings, is characterized as spectral collapse: repeated low-pass filtering collapses features onto the span of the slowest-decaying eigenvectors. The frequency response of a spectral filter makes the contraction rate and the role of the graph explicit, revealing the trade-off between depth and shrinkage. Bottlenecks, manifesting as poor algebraic connectivity or low conductance, are identified by measures such as balanced Forman curvature, effective resistance, and commute time, providing computable proxies for over-squashing.
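The collapse described here is easy to reproduce numerically. The toy experiment below (our own sketch; the random graph and feature dimensions are arbitrary choices, not from the paper) applies a row-normalized adjacency repeatedly and tracks how the spread of node features decays with depth:

```python
# Oversmoothing demo: repeated low-pass filtering drives all node features
# toward the span of the slowest-decaying eigenvector, so the spread of
# the embeddings shrinks geometrically with depth.
import numpy as np

rng = np.random.default_rng(1)
n = 20
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                                            # random undirected graph
S = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # row-normalized (low-pass)

X = rng.normal(size=(n, 4))                            # initial node features
for depth in (1, 2, 4, 8, 16, 32):
    Z = X.copy()
    for _ in range(depth):
        Z = S @ Z                                      # one smoothing layer
    spread = np.linalg.norm(Z - Z.mean(axis=0))        # deviation from the mean feature
    print(f"depth {depth:2d}: feature spread {spread:.4f}")
# The decay rate is governed by the second-largest eigenvalue modulus of S,
# which is exactly the depth-versus-shrinkage trade-off described above.
```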
Equivalence of message passing and spectral graph neural networks
Researchers have demonstrated that the distinction between message passing graph neural networks (MPNNs) and spectral graph neural networks is largely artificial and hinders progress in the field. This study proposes a unified perspective that frames both types of GNNs as different ways to represent permutation equivariant operators acting on graph signals.
Many commonly used architectures exhibit comparable expressive power, and true differences only emerge under certain conditions. MPNNs are excellent at representing discrete structures and allow expressiveness analysis using tools from logic and graph isomorphism studies. Conversely, the spectral perspective provides a principled way to understand smoothing, bottlenecks, stability, and community structure within graphs.
Generalization analyses for both MPNNs and spectral GNNs rely on similar assumptions, such as bounded node features and Lipschitz continuous functions, resulting in comparable performance bounds. Spectral positional encoding, though often linked to spectral GNNs, is presented as an input representation choice independent of the underlying network architecture.
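A minimal sketch of that input-representation view (ours; the helper name laplacian_pe and the toy graph are illustrative) computes Laplacian eigenvector positional encodings and concatenates them onto node features before any network, MPNN or spectral, is applied:

```python
# Spectral positional encoding: eigenvectors of the symmetric normalized
# Laplacian for the smallest nonzero eigenvalues, appended to node features.
import numpy as np

def laplacian_pe(A: np.ndarray, k: int) -> np.ndarray:
    """Return the k eigenvectors of L_sym with the smallest nonzero eigenvalues."""
    deg = np.maximum(A.sum(axis=1), 1.0)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    # Note: eigenvector signs are ambiguous; practical pipelines randomize
    # or fix the sign during training.
    return eigvecs[:, 1 : k + 1]          # skip the trivial eigenvector (eigenvalue 0)

# Usage: augment placeholder node features with a 2-dimensional encoding.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.ones((4, 3))                       # stand-in node features
X_aug = np.concatenate([X, laplacian_pe(A, 2)], axis=1)
print(X_aug.shape)                        # (4, 5) -- same encoding feeds any GNN
```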
The authors acknowledge the limitations and point out that their analysis relies on assumptions about operator behavior and graph properties. Future research should focus on further integrating these perspectives within a common theoretical framework rather than treating them as competing paradigms. This integrated approach promises to accelerate progress in graph learning by leveraging the complementary strengths of both message-passing and spectral methods, ultimately improving model performance and understanding of graph-structured data.
