Cheap smartphone sensors can see around corners by tracking tiny bounces of light.
April 20, 2026
Original Paper
DENALI: A Dataset Enabling Non-Line-of-Sight Spatial Reasoning with Low-Cost LiDARs
arXiv · 2604.16201
The Takeaway
Consumer-grade LiDAR sensors can perform non-line-of-sight (NLOS) spatial reasoning by analyzing multi-bounce light returns. The technique uses time-resolved histograms, the raw timing data behind each depth pixel, to infer the shape of objects the sensor cannot directly see. Standard firmware usually discards these later returns as noise, but the DENALI dataset shows they carry substantial hidden-scene information. If this holds up, any robot or phone with a basic LiDAR could peer around corners, without the expensive laboratory-grade hardware NLOS imaging has traditionally required.
From the abstract
Consumer LiDARs in mobile devices and robots typically output a single depth value per pixel. Yet internally, they record full time-resolved histograms containing direct and multi-bounce light returns; these multi-bounce returns encode rich non-line-of-sight (NLOS) cues that can enable perception of hidden objects in a scene. However, severe hardware limitations of consumer LiDARs make NLOS reconstruction with conventional methods difficult. In this work, we motivate a complementary direction: …
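To make the core idea concrete, here is a minimal sketch of the difference between what consumer firmware reports and the multi-bounce signal the paper exploits. Everything here is hypothetical: the bin width, pulse shapes, and peak positions are invented for illustration, and the paper's actual sensors and processing are not reproduced.

```python
import numpy as np

C = 3e8          # speed of light, m/s
BIN_WIDTH = 1e-9  # 1 ns time bins (hypothetical sensor resolution)

def make_histogram(n_bins=128, direct_bin=20, bounce_bin=55, noise=0.02, seed=0):
    """Synthesize a per-pixel time-resolved histogram: one strong direct
    return plus a weaker, later multi-bounce return, on a noise floor."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_bins)
    hist = (1.00 * np.exp(-0.5 * ((t - direct_bin) / 1.5) ** 2)    # direct return
          + 0.15 * np.exp(-0.5 * ((t - bounce_bin) / 2.5) ** 2))   # multi-bounce return
    return hist + rng.uniform(0.0, noise, n_bins)

def conventional_depth(hist):
    """What a consumer LiDAR typically outputs: a single depth value,
    taken from the strongest (direct) return."""
    peak = int(np.argmax(hist))
    return peak, C * (peak * BIN_WIDTH) / 2.0  # round-trip time -> metres

def nlos_cue(hist, guard=8):
    """Look past the direct return for a secondary multi-bounce peak --
    exactly the data standard pipelines discard as noise."""
    direct = int(np.argmax(hist))
    masked = hist.copy()
    masked[: direct + guard] = 0.0  # suppress the direct return
    bounce = int(np.argmax(masked))
    return bounce, float(masked[bounce])

hist = make_histogram()
d_bin, depth_m = conventional_depth(hist)   # the one number firmware reports
b_bin, b_amp = nlos_cue(hist)               # the hidden-scene cue behind it
```

The point of the sketch is only that the histogram contains two returns: conventional processing keeps the first and throws the second away, while NLOS reasoning treats the second as signal.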