A transformer-based meta-amortized framework that allows simulation-based inference to remain valid across different model structures without retraining.
March 24, 2026
Original Paper
CogFormer: Learn All Your Models Once
arXiv · 2603.20520
The Takeaway
Standard simulation-based inference (SBI) requires training a new neural network every time a modeler changes a parameterization or prior. This framework enables a single network to handle a combinatorially large space of structurally different cognitive models, drastically accelerating the research cycle for complex model estimation.
From the abstract
Simulation-based inference (SBI) with neural networks has accelerated and transformed cognitive modeling workflows. SBI enables modelers to fit complex models that were previously difficult or impossible to estimate, while also allowing rapid estimation across large numbers of datasets. However, the utility of SBI for iterating over varying modeling assumptions remains limited: changing parameterizations, generative functions, priors, and design variables all necessitate model retraining.
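To make the contrast concrete, here is a minimal toy sketch, not the paper's transformer architecture: standard amortized SBI fits one estimator per model structure, while a meta-amortized estimator takes a model-structure code as an extra input, so a single fit covers several structurally different simulators. All simulator and function names below are hypothetical, and simple least-squares regression on summary statistics stands in for a neural posterior network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy "cognitive models" with different structures (hypothetical examples):
# model 0: x ~ Normal(theta, 1); model 1: x ~ Normal(0, exp(theta))
def simulate(model_id, theta, n=50):
    if model_id == 0:
        return rng.normal(theta, 1.0, size=n)
    return rng.normal(0.0, np.exp(theta), size=n)

# Standard amortized SBI: a separate estimator must be refit for each model.
def fit_estimator(model_id, n_train=2000):
    thetas = rng.uniform(-1, 1, size=n_train)
    feats = np.array([[x.mean(), np.log(x.std())]
                      for x in (simulate(model_id, t) for t in thetas)])
    X = np.hstack([feats, np.ones((n_train, 1))])
    w, *_ = np.linalg.lstsq(X, thetas, rcond=None)
    return w

# Meta-amortized style: ONE estimator whose input includes the model-structure
# code, so it stays valid across both structures without refitting.
def features(model_id, x):
    m = float(model_id)
    return [x.mean(), np.log(x.std()), m,
            x.mean() * (1 - m), np.log(x.std()) * m, 1.0]

def fit_meta_estimator(n_train=2000):
    rows, ys = [], []
    for _ in range(n_train):
        m = int(rng.integers(0, 2))
        t = float(rng.uniform(-1, 1))
        rows.append(features(m, simulate(m, t)))
        ys.append(t)
    w, *_ = np.linalg.lstsq(np.array(rows), np.array(ys), rcond=None)
    return w

def meta_predict(w, model_id, x):
    return float(np.dot(features(model_id, x), w))
```

The design point the sketch illustrates is only the interface: once the estimator conditions on a structure code, changing the generative function means changing an input, not launching a new training run.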