AI & ML Efficiency Breakthrough

A 140M-parameter networking foundation model (PLUME) that outperforms frontier LLMs on protocol analysis by learning from native packet structures.

March 17, 2026

Original Paper

PLUME: Building a Network-Native Foundation Model for Wireless Traces via Protocol-Aware Tokenization

Swadhin Pradhan, Shazal Irshad, Jerome Henry

arXiv · 2603.13647

The Takeaway

By using protocol-aware tokenization rather than generic byte-pair encoding (BPE), this model outperforms GPT-5 on network failure detection with roughly 600x fewer parameters. It demonstrates that domain-native structure matters more than raw scale for specialized engineering tasks, enabling privacy-preserving, on-premises root-cause analysis (RCA).

From the abstract

Foundation models succeed when they learn in the native structure of a modality, whether morphology-respecting tokens in language or pixels in vision. Wireless packet traces deserve the same treatment: meaning emerges from layered headers, typed fields, timing gaps, and cross-packet state machines, not flat strings. We present Plume (Protocol Language Understanding Model for Exchanges), a compact 140M-parameter foundation model for 802.11 traces that learns from structured PDML dissections.
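To make the core idea concrete, here is a minimal sketch of what "protocol-aware" tokenization might look like, as opposed to feeding a trace to a generic BPE tokenizer as a flat string. This is an illustration of the general technique, not the paper's actual tokenizer: the PDML fragment, token format, and `protocol_aware_tokens` function are all hypothetical, loosely modeled on Wireshark's PDML export of an 802.11 frame.

```python
# Illustrative sketch only -- NOT Plume's actual tokenization scheme.
# Turns a tiny PDML-style dissection into typed per-field tokens that
# preserve protocol layering, instead of a flat character stream.
import xml.etree.ElementTree as ET

# Hypothetical fragment of a PDML dissection of an 802.11 frame.
PDML = """
<packet>
  <proto name="wlan">
    <field name="wlan.fc.type" show="0"/>
    <field name="wlan.fc.subtype" show="8"/>
    <field name="wlan.duration" show="314"/>
  </proto>
</packet>
"""

def protocol_aware_tokens(pdml_text):
    """Emit one typed token per protocol layer and dissected field."""
    root = ET.fromstring(pdml_text)
    tokens = []
    for proto in root.iter("proto"):
        # A layer marker keeps the header hierarchy explicit.
        tokens.append(f"<PROTO:{proto.get('name')}>")
        for field in proto.iter("field"):
            # Each field becomes a single typed (name, value) token,
            # so "duration=314" is never split at arbitrary byte pairs.
            tokens.append(f"<F:{field.get('name')}={field.get('show')}>")
    return tokens

print(protocol_aware_tokens(PDML))
# -> ['<PROTO:wlan>', '<F:wlan.fc.type=0>',
#     '<F:wlan.fc.subtype=8>', '<F:wlan.duration=314>']
```

The point of the sketch is the contrast: a generic BPE tokenizer would see the dissection as undifferentiated text and split field names and values at statistically frequent byte pairs, while a protocol-aware scheme keeps each typed field intact as one unit the model can reason over.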