There is a mathematical 'wall' that makes it impossible for complex AIs to communicate with simpler ones.
April 15, 2026
Original Paper
Semantic Rate-Distortion for Bounded Multi-Agent Communication: Capacity-Derived Semantic Spaces and the Communication Cost of Alignment
arXiv · 2604.09521
The Takeaway
We usually treat communication as a matter of finding the right protocol, but this paper proves that multi-agent systems exhibit a structural phase transition: when the gap in computational capacity between two agents is large enough, intent-preserving communication becomes mathematically impossible below a critical rate. No matter how hard you try to 'align' them, the simpler agent cannot capture the intent of the more complex one. This sets a hard limit on how well we can control superintelligent systems through low-bandwidth interfaces. It's a mathematical proof of the 'complexity gap.'
From the abstract
When two agents of different computational capacities interact with the same environment, they need not compress a common semantic alphabet differently; they can induce different semantic alphabets altogether. We show that the quotient POMDP $Q_{m,T}(M)$ - the unique coarsest abstraction consistent with an agent's capacity - serves as a capacity-derived semantic space for any bounded agent, and that communication between heterogeneous agents exhibits a sharp structural phase transition. Below a
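The quotient POMDP $Q_{m,T}(M)$ is the paper's formal object, and its construction isn't spelled out in the excerpt above. As a loose intuition only (a toy sketch, not the paper's definition), the idea of a capacity-derived semantic space can be illustrated by quotienting a tiny deterministic environment's states by what an agent with lookahead horizon T can distinguish. All names here (`STATES`, `STEP`, `OBS`, `semantic_alphabet`) are illustrative assumptions:

```python
# Toy sketch (not from the paper): a bounded agent's "semantic alphabet"
# as the partition of environment states it can distinguish with at most
# T steps of lookahead. Agents with different T induce genuinely
# different alphabets, not merely coarser encodings of a shared one.

from itertools import product

# Hypothetical deterministic environment with partial observability.
STATES = ["s0", "s1", "s2", "s3"]
ACTIONS = ["a", "b"]
STEP = {  # transition function: (state, action) -> next state
    ("s0", "a"): "s1", ("s0", "b"): "s2",
    ("s1", "a"): "s3", ("s1", "b"): "s0",
    ("s2", "a"): "s3", ("s2", "b"): "s3",
    ("s3", "a"): "s3", ("s3", "b"): "s3",
}
OBS = {"s0": 0, "s1": 0, "s2": 0, "s3": 1}  # what the agent can see

def signature(state, horizon):
    """Observation traces produced by every action plan of length <= horizon."""
    sig = []
    for t in range(horizon + 1):
        for plan in product(ACTIONS, repeat=t):
            s = state
            trace = [OBS[s]]
            for a in plan:
                s = STEP[(s, a)]
                trace.append(OBS[s])
            sig.append(tuple(trace))
    return tuple(sig)

def semantic_alphabet(horizon):
    """Quotient states by horizon-bounded distinguishability."""
    classes = {}
    for s in STATES:
        classes.setdefault(signature(s, horizon), []).append(s)
    return sorted(tuple(v) for v in classes.values())

shallow = semantic_alphabet(horizon=0)  # weak agent: current observation only
deep = semantic_alphabet(horizon=2)     # stronger agent: 2-step lookahead

print(shallow)  # the weak agent merges s0, s1, s2 into one symbol
print(deep)     # the stronger agent separates all four states
```

Any message the stronger agent sends about a distinction inside the merged class (say, "s1, not s2") simply has no referent in the weaker agent's alphabet, which is the flavor of obstruction the abstract describes; the paper's actual result concerns capacity-bounded agents in POMDPs and a quantitative rate threshold, which this toy does not capture.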