Pedestrians are intentionally playing chicken with self-driving cars because they know the vehicles are programmed never to hit them.
April 29, 2026
Original Paper
Pedestrians play chicken with an autonomous vehicle
arXiv · 2604.24384
The Takeaway
Pedestrians on city streets are actively exploiting the safety algorithms of autonomous vehicles to jump ahead in traffic. They treat every encounter as a game of chicken, knowing the car is programmed to yield unconditionally. This creates a paradox: the very caution that gives the car a perfect safety record makes it nearly impossible for the vehicle to get through a crowded intersection. Traffic flow breaks down because the machine is too polite to hold its ground against a human. Engineers may have to program cars to behave slightly more aggressively, or less predictably, to regain their right of way. If so, the future of self-driving technology depends on machines learning to bluff and pose credible threats to humans, not just on being safe.
From the abstract
Automated vehicles (AVs) are commonly programmed to yield unconditionally to pedestrians in the interest of safety. However, this design choice can give rise to the Freezing Robot Problem, in which pedestrians learn to assert priority at every interaction, causing vehicles to stall and make no progress. The game-theoretic Sequential Chicken model has shown that, like human drivers, AVs can resolve this problem by trading credible threats of very small risks of collision or larger risks of less se…
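The incentive at the heart of the Freezing Robot Problem can be seen in a toy one-shot "chicken" calculation. This is a minimal illustrative sketch, not the paper's Sequential Chicken model: the payoff numbers (`crash_cost`, `cross_gain`, `wait_cost`) and the AV's "proceed" probability `p_go` are made-up assumptions chosen only to show how a tiny credible risk changes the pedestrian's best response.

```python
# Toy sketch (not the paper's model): a one-shot chicken interaction.
# The pedestrian chooses CROSS or WAIT against an AV that proceeds with
# probability p_go instead of yielding. All payoff values are illustrative.

def pedestrian_best_response(p_go, crash_cost=-100.0, cross_gain=1.0,
                             wait_cost=-0.2):
    """Compare the pedestrian's expected utility of crossing vs waiting."""
    eu_cross = p_go * crash_cost + (1 - p_go) * cross_gain
    eu_wait = wait_cost
    return "CROSS" if eu_cross > eu_wait else "WAIT"

# An AV that always yields (p_go = 0) invites the pedestrian to cross:
print(pedestrian_best_response(0.0))   # CROSS

# A small credible risk of not yielding flips the incentive:
# eu_cross = 0.02 * -100 + 0.98 * 1 = -1.02 < -0.2
print(pedestrian_best_response(0.02))  # WAIT
```

With `p_go = 0` the pedestrian crosses every time, which is exactly the unconditional-yield trap the abstract describes; even a 2% chance that the AV holds its ground makes waiting the rational choice under these made-up payoffs.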