AI & ML New Capability

An 'invariant compiler' uses LLMs to translate physics requirements into Neural ODE architectures that satisfy conservation laws by construction.

March 26, 2026

Original Paper

An Invariant Compiler for Neural ODEs in AI-Accelerated Scientific Simulation

Fangzhou Yu, Yiqi Su, Ray Lee, Shenfeng Cheng, Naren Ramakrishnan

arXiv · 2603.23861

The Takeaway

Instead of 'soft' loss penalties, under which learned trajectories can still drift off the constraint, this framework enforces domain invariants (such as energy conservation) through the architecture itself. It provides a systematic design pattern for building physically grounded neural surrogates for scientific simulation.
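The 'by construction' idea can be illustrated with a Hamiltonian-style parameterization, a standard trick rather than the paper's compiler: the vector field is defined as J∇H_θ, so whatever energy function H_θ the network learns is conserved exactly along the continuous-time flow, for any weights. A minimal PyTorch sketch, with illustrative class and layer names:

```python
import torch
import torch.nn as nn

class HamiltonianODE(nn.Module):
    """Neural ODE vector field f(t, z) = J @ grad H(z).

    Because J is skew-symmetric, dH/dt = grad H . (J grad H) = 0 along any
    trajectory, so the learned energy H is conserved by construction."""

    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.H = nn.Sequential(              # learned scalar energy H(q, p)
            nn.Linear(2 * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )
        eye, zero = torch.eye(dim), torch.zeros(dim, dim)
        J = torch.cat([torch.cat([zero, eye], dim=1),
                       torch.cat([-eye, zero], dim=1)], dim=0)
        self.register_buffer("J", J)         # canonical symplectic matrix

    def forward(self, t, z):                 # (t, state) signature, t unused
        z = z.requires_grad_(True)
        H = self.H(z).sum()                  # sum over batch -> scalar for autograd
        grad_H = torch.autograd.grad(H, z, create_graph=True)[0]
        return grad_H @ self.J.T             # dz/dt = J grad H
```

Conservation here is a property of the continuous-time field; in practice the discrete trajectory also depends on the integrator, so a symplectic or sufficiently accurate solver is still needed to keep numerical drift small.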

From the abstract

Neural ODEs are increasingly used as continuous-time models for scientific and sensor data, but unconstrained neural ODEs can drift and violate domain invariants (e.g., conservation laws), yielding physically implausible solutions. In turn, this can compound error in long-horizon prediction and surrogate simulation. Existing solutions typically aim to enforce invariance by soft penalties or other forms of regularization, which can reduce overall error but do not guarantee that trajectories will satisfy the invariants.
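The 'soft penalty' approach the abstract contrasts against usually amounts to adding a weighted drift term to the training loss. A generic sketch, not the paper's formulation, with energy_fn and lam as placeholder names:

```python
import torch

def soft_invariant_loss(pred_traj, true_traj, energy_fn, lam=10.0):
    """Data-fit loss plus a penalty on energy drift along the predicted
    trajectory. The weight lam trades fit against conservation, but the
    residual drift is only discouraged, never forced to zero."""
    data_loss = torch.mean((pred_traj - true_traj) ** 2)
    energy = energy_fn(pred_traj)        # per-time-step energy, shape (T,)
    drift = energy - energy[0]           # deviation from the initial energy
    return data_loss + lam * torch.mean(drift ** 2)
```

Architectural enforcement, as in the sketch after The Takeaway, removes this trade-off: the invariant holds for every parameter setting rather than only at a penalized optimum.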