Challenges the assumption that architecture and loss are the primary levers for neural simulators by showing that the design of the carried state is the dominant bottleneck.
April 1, 2026
Original Paper
Derived Fields Preserve Fine-Scale Detail in Budgeted Neural Simulators
arXiv · 2603.29224
The Takeaway
The paper shows that fine-scale detail loss in budgeted simulations occurs before training begins, during construction of the carried state. By optimizing which physical fields are carried (DerivOpt), practitioners can significantly improve rollout fidelity and reduce high-frequency error without increasing compute or changing model architectures.
From the abstract
Fine-scale-faithful neural simulation under fixed storage budgets remains challenging. Many existing methods reduce high-frequency error by improving architectures, training objectives, or rollout strategies. However, under budgeted coarsen-quantize-decode pipelines, fine detail can already be lost when the carried state is constructed. In the canonical periodic incompressible Navier-Stokes setting, we show that primitive and derived fields undergo systematically different retained-band distortions …
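To make the abstract's point concrete, here is a minimal sketch (an assumed setup, not the paper's code) of a coarsen-quantize-decode pipeline on a 1D periodic signal. It coarsens by spectral truncation, quantizes the retained values to 8 bits, and then measures high-frequency spectral energy, illustrating that fine-scale content is discarded when the carried state is constructed, before any model is trained.

```python
import numpy as np

# Periodic 1D field with one low- and one high-frequency mode.
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(3 * x) + 0.5 * np.sin(40 * x)

# Coarsen: keep only Fourier modes up to k=16 (a fixed storage budget).
k_keep = 16
spec = np.fft.rfft(u)
spec[k_keep + 1:] = 0.0
coarse = np.fft.irfft(spec, n)

# Quantize-decode: uniform 8-bit quantization of the carried state.
lo, hi = coarse.min(), coarse.max()
q = np.round((coarse - lo) / (hi - lo) * 255.0)
decoded = q / 255.0 * (hi - lo) + lo

def high_band_energy(v, k_min=20):
    """Spectral energy above wavenumber k_min."""
    s = np.abs(np.fft.rfft(v)) ** 2
    return float(s[k_min:].sum())

# The k=40 mode is gone from the decoded state; only quantization
# noise remains in the high band.
print(f"high-band energy, original: {high_band_energy(u):.1f}")
print(f"high-band energy, decoded:  {high_band_energy(decoded):.4f}")
```

The high-band energy of the decoded state collapses to near zero relative to the original, and no choice of downstream architecture or loss can recover information that was never carried. This is the gap the paper's carried-state optimization targets; the truncation/quantization details here are illustrative placeholders.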