AI & ML Paradigm Shift

Replaces self-attention with reaction-diffusion PDEs as the predictive engine for world models.

March 24, 2026

Original Paper

FluidWorld: Reaction-Diffusion Dynamics as a Predictive Substrate for World Models

Fabien Polly

arXiv · 2603.21315

The Takeaway

FluidWorld demonstrates that Transformer-based world models are not the only (or best) way to predict future states; PDE-governed dynamics preserve 10-15% more spatial structure and maintain coherent multi-step rollouts where Transformers and LSTMs typically collapse.

From the Abstract

World models learn to predict future states of an environment, enabling planning and mental simulation. Current approaches default to Transformer-based predictors operating in learned latent spaces. This comes at a cost: O(N^2) computation and no explicit spatial inductive bias. This paper asks a foundational question: is self-attention necessary for predictive world modeling, or can alternative computational substrates achieve comparable or superior results? I introduce FluidWorld, a proof-of-concept …
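To make the "PDE-governed dynamics" idea concrete, here is a minimal sketch of a reaction-diffusion system stepped forward in time. This uses the standard Gray-Scott equations as a stand-in; the actual equations, parameters, and learned components FluidWorld uses are not given in this excerpt, so everything below (function names, coefficients, grid size) is illustrative only:

```python
import numpy as np

def laplacian(Z):
    # Five-point stencil with periodic boundaries (dx = 1).
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0)
          + np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

def gray_scott_step(U, V, Du=0.16, Dv=0.08, f=0.035, k=0.065, dt=1.0):
    """One explicit Euler step of the Gray-Scott reaction-diffusion system:

        dU/dt = Du * lap(U) - U*V^2 + f*(1 - U)
        dV/dt = Dv * lap(V) + U*V^2 - (f + k)*V

    Coefficients here are common demo values, not the paper's.
    """
    UVV = U * V * V
    U_next = U + dt * (Du * laplacian(U) - UVV + f * (1 - U))
    V_next = V + dt * (Dv * laplacian(V) + UVV - (f + k) * V)
    return U_next, V_next

# Seed a uniform field with a small perturbation and roll it forward;
# the local stencil gives each step O(N) cost in the number of grid cells,
# in contrast to self-attention's O(N^2).
n = 64
U = np.ones((n, n))
V = np.zeros((n, n))
U[28:36, 28:36] = 0.50
V[28:36, 28:36] = 0.25
for _ in range(100):
    U, V = gray_scott_step(U, V)
```

The point of the sketch is the cost structure: each update touches only a cell's immediate neighbors, which is where the spatial inductive bias and the sub-quadratic scaling mentioned in the abstract come from.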