For 30 years, we didn't know the absolute limit of how little data a machine needs in order to learn. Someone finally cracked the code.
March 26, 2026
Original Paper
Labeled Compression Schemes for Concept Classes of Finite Functions
arXiv · 2603.23561
AI-generated illustration
The Takeaway
For decades, researchers didn't know whether there was a universal limit to how far you could compress a training dataset without a learner losing its ability to learn from it. This paper finally proves the 'Sample Compression Conjecture,' a foundational rule tying that limit to the VC dimension, the standard measure of how much data a learning task truly requires.
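To make the idea concrete, here is a minimal sketch of the simplest textbook compression scheme, not the paper's construction: 1-D threshold classifiers have VC dimension 1, and any dataset they can label is compressible to a single labeled point from which a correct classifier can be rebuilt. (The function names and the sample data below are illustrative assumptions, not taken from the paper.)

# Illustrative only: a size-1 labeled sample compression scheme for
# 1-D threshold classifiers h_t(x) = 1 iff x >= t (VC dimension 1).

def compress(sample):
    """Keep one labeled point that pins down a consistent threshold."""
    positives = [x for x, y in sample if y == 1]
    if positives:
        return (min(positives), 1)       # leftmost positive point
    negatives = [x for x, y in sample if y == 0]
    return (max(negatives), 0)           # rightmost negative point

def reconstruct(kept):
    """Rebuild a threshold hypothesis from the single kept labeled point."""
    x0, y0 = kept
    if y0 == 1:
        return lambda x: 1 if x >= x0 else 0
    return lambda x: 1 if x > x0 else 0

# A sample consistent with the threshold t = 4.0
sample = [(1.0, 0), (2.5, 0), (4.0, 1), (7.3, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)  # the rebuilt classifier labels the whole sample correctly

The conjecture asks whether this kind of trick scales: can every class of VC dimension d be compressed down to only about d labeled examples?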
From the abstract
The sample compression conjecture is: Each concept class of VC dimension d has a compression scheme of size O(d). In this paper, for any concept class of finite functions, we present a labeled sample compression scheme of size equal to its VC dimension d. That is, the long-standing open sample compression conjecture is resolved.
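For readers who want the precise meaning of those terms, here is the standard textbook formulation of a labeled sample compression scheme, paraphrased rather than quoted from the paper:

\textbf{Definition.} A \emph{labeled sample compression scheme} of size $k$ for a concept class $\mathcal{C}$ is a pair of maps $(\kappa, \rho)$ such that, for every finite sample $S = ((x_1, c(x_1)), \dots, (x_m, c(x_m)))$ labeled by some $c \in \mathcal{C}$: the compression map outputs a labeled subsequence $\kappa(S) \subseteq S$ with $|\kappa(S)| \le k$, and the reconstructed hypothesis $h = \rho(\kappa(S))$ satisfies $h(x_i) = c(x_i)$ for every $i$. The conjecture asks for such a scheme with $k = O(d)$ whenever $\mathcal{C}$ has VC dimension $d$; the paper claims the sharper bound $k = d$ for concept classes of finite functions.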