IQuest-Coder-V1 introduces a series of high-performance code models, including a distinctive 'Loop' variant that uses a recurrent mechanism for efficiency.
arXiv · March 18, 2026 · 2603.16733
The Takeaway
This release provides open-weights models that rival proprietary models in agentic software engineering and tool use. The inclusion of 'thinking paths' (reasoning RL) and a recurrent architectural variant offers new paths for researchers looking to balance deployment footprint with complex logical capacity.
From the abstract
In this report, we introduce the IQuest-Coder-V1 series (7B/14B/40B/40B-Loop), a new family of code large language models (LLMs). Moving beyond static code representations, we propose the code-flow multi-stage training paradigm, which captures the dynamic evolution of software logic through different phases of the pipeline. Our models are developed through the evolutionary pipeline, starting with the initial pre-training consisting of code facts, repository, and completion data. Following that, …
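The excerpt does not specify how the 40B-Loop variant's recurrence works, but the general idea behind "looped" architectures can be sketched generically: a single block's weights are reused for several iterations, so the parameter footprint stays fixed while effective compute depth grows. The sketch below is a hypothetical, minimal illustration of that principle, not the paper's actual mechanism; all names and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: one shared block whose weights are reused across loops.
# The real IQuest-Coder-V1 'Loop' design is not described in this excerpt.
d = 8                                   # hidden size (arbitrary for the demo)
W = rng.standard_normal((d, d)) * 0.1   # the ONLY block weights in the model
b = np.zeros(d)

def block(x):
    # One transformer-like sub-block, here simplified to a residual MLP step.
    return x + np.tanh(x @ W + b)

def looped_forward(x, n_loops):
    # Recurrence: apply the same weights n_loops times.
    # Parameter count is constant; compute scales with n_loops.
    for _ in range(n_loops):
        x = block(x)
    return x

x = rng.standard_normal(d)
shallow = looped_forward(x, 1)   # cheap inference
deep = looped_forward(x, 4)      # same weights, more "thinking" iterations
print(shallow.shape, deep.shape)
```

This is the trade-off the summary alludes to: a deployment can dial iteration count up for harder inputs without storing a larger model.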