economics First Ever

AI 'backdoor' hacks can be programmed to stay dormant for weeks, only triggering after the AI has been used a specific number of times.

April 1, 2026

Original Paper

Delayed Backdoor Attacks: Exploring the Temporal Dimension as a New Attack Surface in Pre-Trained Models

Zikang Ding, Haomiao Yang, Meng Hao, Wenbo Jiang, Kunlan Xiang, Runmeng Du, Yijing Liu, Ruichen Zhang, Dusit Niyato

SSRN · 6506542

The Takeaway

Most cybersecurity assumes a hack happens immediately when a 'trigger' word is used. This research proves that attackers can use common everyday words as triggers that only activate after a long delay, making the hack nearly impossible to detect with current safety tests.

From the abstract

Backdoor attacks against pre-trained models (PTMs) have traditionally operated under an ”immediacy assumption,” where malicious behavior manifests instantly upon trigger occurrence. This work revisits and challenges this paradigm by introducing Delayed Backdoor Attacks (DBA), a new class of threats in which activation is temporally decoupled from trigger exposure. We propose that this temporal dimension is the key to unlocking a previously infeasible class of attacks: those that use common, ever