

Poster

Cut the Crap: An Economical Communication Pipeline for LLM-based Multi-Agent Systems

Guibin Zhang · Yanwei Yue · Zhixun Li · Sukwon Yun · Guancheng Wan · Kun Wang · Dawei Cheng · Jeffrey Yu · Tianlong Chen

Hall 3 + Hall 2B #252
Thu 24 Apr midnight PDT — 2:30 a.m. PDT

Abstract: Recent advancements in large language model (LLM)-powered agents have shown that collective intelligence can significantly outperform individual capabilities, largely attributed to meticulously designed inter-agent communication topologies. Though impressive in performance, existing multi-agent pipelines inherently introduce substantial token overhead, as well as increased economic costs, which pose challenges for their large-scale deployment. In response to this challenge, we propose an economical, simple, and robust multi-agent communication framework, termed AgentPrune, which can seamlessly integrate into mainstream multi-agent systems and prunes redundant or even malicious communication messages. Technically, AgentPrune is the first to identify and formally define the Communication Redundancy issue present in current LLM-based multi-agent pipelines, and it efficiently performs one-shot pruning on the spatial-temporal message-passing graph, yielding a token-economic and high-performing communication topology. Extensive experiments across six benchmarks demonstrate that AgentPrune (I) achieves results comparable to state-of-the-art topologies at merely \$5.6 cost compared to their \$43.7, (II) integrates seamlessly into existing multi-agent frameworks with a 28.1%∼72.8% reduction in token usage, and (III) successfully defends against two types of agent-based adversarial attacks with a 3.5%∼10.8% performance boost.
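To build intuition for the one-shot pruning idea described above, the following is a minimal, hypothetical sketch: agent communication is modeled as a weighted directed adjacency matrix, and all but the highest-weight fraction of edges are removed in a single pass. The function name, edge weights, and `keep_ratio` parameter are invented for illustration; the paper's actual spatial-temporal formulation and importance scoring differ.

```python
import numpy as np

def one_shot_prune(adj: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Keep only the top `keep_ratio` fraction of positive-weight edges.

    This is a toy stand-in for importance-based one-shot pruning of a
    multi-agent communication graph; it is NOT the AgentPrune algorithm.
    """
    weights = adj[adj > 0]
    if weights.size == 0:
        return adj.copy()
    k = max(1, int(round(keep_ratio * weights.size)))
    # Threshold at the k-th largest positive edge weight.
    thresh = np.sort(weights)[::-1][k - 1]
    # Zero out every edge below the threshold in one shot.
    return np.where(adj >= thresh, adj, 0.0)

# Toy 4-agent communication graph: entry (i, j) is the weight of
# the message-passing edge from agent i to agent j.
adj = np.array([
    [0.0, 0.9, 0.1, 0.0],
    [0.0, 0.0, 0.8, 0.2],
    [0.3, 0.0, 0.0, 0.7],
    [0.0, 0.1, 0.0, 0.0],
])
pruned = one_shot_prune(adj, keep_ratio=0.5)
```

After pruning, only the four strongest of the seven original edges survive, so downstream agents exchange fewer messages, which is the mechanism behind the token reduction the abstract reports.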
