GIST achieves O(N) complexity for Graph Transformers while maintaining gauge invariance, enabling scaling to meshes with 750K nodes.
March 18, 2026
Original Paper
GIST: Gauge-Invariant Spectral Transformers for Scalable Graph Neural Operators
arXiv · 2603.16849
The Takeaway
GIST resolves the long-standing tradeoff between computational efficiency and geometric symmetry in graph learning, allowing neural PDE solvers and aerodynamic predictors to scale to industrial-sized meshes without sacrificing inductive generalization.
From the abstract
Adapting transformer positional encoding to meshes and graph-structured data presents significant computational challenges: exact spectral methods require cubic-complexity eigendecomposition and can inadvertently break gauge invariance through numerical solver artifacts, while efficient approximate methods sacrifice gauge symmetry by design. Both failure modes cause catastrophic generalization failures in inductive learning, where models trained with one set of numerical choices fail when encountering di…
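The "numerical solver artifacts" the abstract mentions are easy to reproduce. A minimal sketch (not from the paper, using a toy 4-node cycle graph): Laplacian eigenvectors are only defined up to sign, and up to an arbitrary basis rotation within repeated eigenvalues, so two equally valid eigendecompositions of the same graph can disagree. A positional encoding built directly from these eigenvectors inherits that ambiguity.

```python
import numpy as np

# Toy illustration: adjacency matrix of a 4-node cycle graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

L = np.diag(A.sum(axis=1)) - A   # combinatorial graph Laplacian
w, V = np.linalg.eigh(L)         # exact eigendecomposition: O(N^3) in general

# Flipping the sign of any eigenvector column yields another valid
# eigendecomposition of the same Laplacian -- a solver is free to
# return either one.
V_flipped = V * np.array([1.0, -1.0, 1.0, -1.0])
assert np.allclose(L @ V_flipped, V_flipped * w)

# This graph also has a repeated eigenvalue (multiplicity 2), so any
# rotation of that eigenspace's basis is equally valid -- a further
# source of solver-dependent ambiguity beyond sign flips.
print(np.round(w, 6))
```

A model whose positional encodings are trained against `V` but evaluated against `V_flipped` sees different inputs for the same graph; this is the inductive-generalization failure the abstract attributes to non-gauge-invariant encodings.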