AI & ML Efficiency Breakthrough

Proposes a 'no-backprop' stochastic-process memory for edge agents that addresses the retention-forgetting tradeoff at fixed compute cost.

April 2, 2026

Original Paper

Temporal Memory for Resource-Constrained Agents: Continual Learning via Stochastic Compress-Add-Smooth

Michael Chertkov

arXiv · 2604.00067

The Takeaway

Designed for resource-constrained hardware, this method treats memory as a bridge diffusion process over a replay interval. It lets agents incorporate new experience and manage long-term forgetting without the heavy computational cost of gradient-based neural network updates.

From the abstract

An agent that operates sequentially must incorporate new experience without forgetting old experience, under a fixed memory budget. We propose a framework in which memory is not a parameter vector but a stochastic process: a Bridge Diffusion on a replay interval [0,1], whose terminal marginal encodes the present and whose intermediate marginals encode the past. New experience is incorporated via a three-step Compress-Add-Smooth (CAS) recursion. We test the framework on the class of mo…
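The abstract does not spell out the update rule, and the paper's code is not reproduced here, but the three-step recursion can be sketched under some assumptions: memory is a fixed grid of marginal statistics over the replay interval [0,1], "compress" rescales the time axis to free the terminal slot, "add" writes the new experience at t = 1, and "smooth" applies one diffusion-like step along the time axis. All function and parameter names below are illustrative, not from the paper.

```python
import numpy as np

def cas_update(memory, x, smooth=0.5):
    """One hypothetical Compress-Add-Smooth step on a fixed-size memory.

    memory: (K, d) array; row k approximates the marginal mean at
            replay time t_k = k/(K-1) on [0, 1].
    x:      (d,) new experience to encode at the terminal time t = 1.
    """
    K, d = memory.shape
    t_old = np.linspace(0.0, 1.0, K)
    # Compress: squeeze the existing trajectory into [0, 1 - 1/K],
    # freeing the terminal slot (fixed memory budget, no growth).
    t_new = np.linspace(0.0, 1.0 - 1.0 / K, K)
    compressed = np.stack(
        [np.interp(t_new, t_old, memory[:, j]) for j in range(d)], axis=1
    )
    # Add: the terminal marginal encodes the present.
    compressed[-1] = x
    # Smooth: one explicit diffusion step along the time axis
    # (discrete Laplacian), with both endpoints pinned as in a bridge.
    out = compressed.copy()
    out[1:-1] += smooth * (
        compressed[:-2] - 2.0 * compressed[1:-1] + compressed[2:]
    ) / 2.0
    return out
```

Note the fixed-compute property the summary emphasizes: each update costs O(K·d) array operations with no gradients, regardless of how many experiences have been absorbed.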