Proves a fundamental expressivity limit: Message-Passing Graph Neural Networks are infinitely weaker than the standard Color Refinement algorithm.
arXiv · March 17, 2026 · 2603.14846
The Takeaway
It shows that MP-GNNs induce only a polynomial number of equivalence classes on graphs, while the number of non-isomorphic graphs grows doubly exponentially in the number of vertices. This refutes the common belief that MP-GNNs match the power of the 1-WL test in a uniform sense.
From the abstract
We define a generic class of functions that captures most conceivable aggregations for Message-Passing Graph Neural Networks (MP-GNNs), and prove that any MP-GNN model with such aggregations induces only a polynomial number of equivalence classes on all graphs, while the number of non-isomorphic graphs is doubly exponential (in the number of vertices). Adding a familiar perspective, we observe that merely 2 iterations of Color Refinement (CR) induce at least an exponential number of equivalence classes.
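For readers unfamiliar with the baseline being compared against, here is a minimal sketch of the Color Refinement (1-WL) procedure the abstract refers to. The graph representation and function names are illustrative, not taken from the paper; CR iteratively recolors each vertex by the multiset of its neighbors' colors, and the resulting color histogram is the equivalence-class label of the graph.

```python
from collections import Counter

def color_refinement(adj, iterations=2):
    """Run Color Refinement (1-WL vertex coloring) for a fixed number
    of iterations on a graph given as an adjacency list.
    Returns the histogram (multiset) of final vertex colors."""
    # Start with a uniform color on every vertex.
    colors = {v: 0 for v in adj}
    for _ in range(iterations):
        # New signature = (own color, sorted multiset of neighbor colors).
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integer color ids.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# Two small non-isomorphic graphs CR separates after 2 iterations:
path = {0: [1], 1: [0, 2], 2: [1]}          # path on 3 vertices
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # cycle on 3 vertices
```

The paper's point is about how many distinct outputs such a procedure can produce across all graphs: already after 2 iterations CR yields exponentially many histograms, whereas (per the result above) any MP-GNN in the considered class yields only polynomially many.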