Identifies the 'golden subspace' for test-time adaptation, enabling extreme efficiency in online model updates.
March 24, 2026
Original Paper
The Golden Subspace: Where Efficiency Meets Generalization in Continual Test-Time Adaptation
arXiv · 2603.21928
The Takeaway
The paper proves theoretically that updating only the features projected onto the classifier's row space suffices for effective adaptation. This enables 'Guided Online Low-rank Directional' (GOLD) adaptation, which is far more efficient and stable than traditional methods that update large swathes of network weights.
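The intuition behind the row-space claim can be sketched numerically: for a linear classifier with weight matrix W, the logits depend only on the component of a feature that lies in W's row space, so the orthogonal component is invisible to the classifier. This is a minimal illustration with hypothetical dimensions, not the paper's actual adaptation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
C, D = 10, 512  # hypothetical: number of classes, feature dimension
W = rng.standard_normal((C, D))  # linear classifier weights
f = rng.standard_normal(D)       # a feature vector produced by the backbone

# Orthonormal basis of W's row space via SVD
_, _, Vt = np.linalg.svd(W, full_matrices=False)  # Vt has shape (C, D)
P = Vt.T @ Vt                                     # projector onto the row space

f_par = P @ f        # component inside the classifier's row space
f_perp = f - f_par   # component the classifier cannot see

# The logits are unchanged if we drop the orthogonal component ...
assert np.allclose(W @ f, W @ f_par)
# ... because the orthogonal component maps to zero logits.
assert np.allclose(W @ f_perp, np.zeros(C), atol=1e-8)
```

Since only a C-dimensional subspace of the D-dimensional feature space affects predictions, updates restricted to that subspace can, in principle, achieve the same effect on the output as full-feature updates at a fraction of the cost.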
From the abstract
Continual Test-Time Adaptation (CTTA) aims to enable models to adapt online to unlabeled data streams under distribution shift without accessing source data. Existing CTTA methods face an efficiency-generalization trade-off: updating more parameters improves adaptation but severely reduces online inference efficiency. An ideal solution is to achieve comparable adaptation with minimal feature updates; we call this minimal subspace the golden subspace. We prove its existence in a single-step adapt […]