Free Sinewich enables parameter-efficient multi-task learning using frequency-based weight modulation with near-zero overhead.
March 24, 2026
Original Paper
Frequency Switching Mechanism for Parameter-Efficient Multi-Task Learning
arXiv · 2603.21111
The Takeaway
Free Sinewich allows a single backbone to serve multiple specialized tasks by switching sinusoidal frequencies instead of storing separate adapter weights per task. It achieves state-of-the-art performance on dense prediction tasks with significantly fewer trainable parameters.
From the abstract
Multi-task learning (MTL) aims to enable a single model to solve multiple tasks efficiently; however, current parameter-efficient fine-tuning (PEFT) methods remain largely limited to single-task adaptation. We introduce Free Sinewich, a parameter-efficient multi-task learning framework that enables near-zero-cost weight modulation via frequency switching (Free). Specifically, a Sine-AWB (Sinewich) layer combines low-rank factors and convolutional priors into a single k
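To make the idea concrete, here is a minimal NumPy sketch of frequency-switched weight modulation. It assumes one plausible reading of the mechanism: a shared low-rank adapter (factors `A`, `B`) is reused across tasks, and the only per-task state is a scalar frequency that generates a sinusoidal mask over the rank dimension. The function name, the exact modulation rule, and the initialization scales are illustrative assumptions, not the paper's implementation (which also involves convolutional priors, omitted here).

```python
import numpy as np

rng = np.random.default_rng(0)

def sine_modulated_linear(x, W0, A, B, freq):
    """Hypothetical frequency-switched adapter: a scalar `freq` (the only
    per-task parameter) selects a sinusoidal mask over shared factors A, B."""
    rank = A.shape[0]
    mask = np.sin(freq * np.arange(rank))  # (rank,) sinusoidal mask
    delta = B @ (mask[:, None] * A)        # (out, in) low-rank weight update
    return x @ (W0 + delta).T              # modulated linear layer

in_f, out_f, rank = 16, 16, 8
W0 = rng.standard_normal((out_f, in_f)) * 0.1  # frozen backbone weight
A = rng.standard_normal((rank, in_f)) * 0.1    # shared low-rank factors
B = rng.standard_normal((out_f, rank)) * 0.1
task_freqs = [1.0, 2.0, 3.0]                   # one scalar per task

x = rng.standard_normal((2, in_f))
y_task0 = sine_modulated_linear(x, W0, A, B, task_freqs[0])
y_task1 = sine_modulated_linear(x, W0, A, B, task_freqs[1])
print(y_task0.shape)  # (2, 16)
```

Switching tasks here costs a single scalar rather than a full adapter copy, which is what makes the per-task overhead near zero relative to methods like LoRA that store separate low-rank factors for every task.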