AI & ML Paradigm Shift

Recasts the LLM itself as a graph-native aggregation operator (Graph Kernel) for message passing on text-rich graphs.

March 17, 2026

Original Paper

LLM as Graph Kernel: Rethinking Message Passing on Text-Rich Graphs

Ying Zhang, Hang Yu, Haipeng Zhang, Peng Di

arXiv · 2603.14937

The Takeaway

Instead of using LLMs to generate static embeddings that are handed off to a separate GNN, this approach (RAMP) performs message passing directly in raw text space, with the LLM serving as the aggregation operator. This removes the information bottleneck of fixed-size embeddings and allows structural and semantic reasoning to be optimized jointly.
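To make the idea concrete, here is a minimal sketch of text-space message passing with an LLM as the aggregator. This is illustrative only, not the paper's actual RAMP implementation: the function names, the prompt format, and the `llm_aggregate` stub (a real system would call a language model here) are all assumptions.

```python
def llm_aggregate(prompt: str) -> str:
    """Stand-in for an LLM call; a real system would query a model here.

    The placeholder just wraps the prompt so the sketch runs end to end.
    """
    return "FUSED(" + prompt + ")"


def text_message_passing(node_text: dict, edges: list, rounds: int = 2) -> dict:
    """Run message passing where node states stay in raw text space.

    node_text: maps each node id to its text attribute.
    edges: undirected edges as (u, v) pairs.
    """
    # Build an adjacency list from the edge pairs.
    neighbors = {node: [] for node in node_text}
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)

    state = dict(node_text)
    for _ in range(rounds):
        new_state = {}
        for node, text in state.items():
            # Gather neighbor texts verbatim -- no compression into embeddings.
            msgs = " | ".join(state[n] for n in neighbors[node])
            prompt = f"Node: {text}\nNeighbors: {msgs}"
            # The LLM fuses structural context (neighbor texts) and the node's
            # own semantics in a single step.
            new_state[node] = llm_aggregate(prompt)
        state = new_state
    return state
```

The key contrast with the embedding-then-GNN pipeline is that every round's node state remains human-readable text, so no information is lost to a fixed-size vector before structural reasoning happens.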

From the abstract

Text-rich graphs, which integrate complex structural dependencies with abundant textual information, are ubiquitous yet remain challenging for existing learning paradigms. Conventional methods and even LLM-hybrids compress rich text into static embeddings or summaries before structural reasoning, creating an information bottleneck and detaching updates from the raw content. We argue that in text-rich graphs, the text is not merely a node attribute but the primary medium through which structural […]