Entropic Memory: A Thermodynamics-Inspired Consolidation Mechanism for Lifelong Agent Learning
Jing Du ⋅ Hang Zhao
Abstract
Large language model (LLM) agents often degrade over long interaction streams because memory accumulates noisy observations that reduce retrieval quality. We propose Entropic Memory, a two-tier memory consolidation mechanism that periodically transfers information from a hot working buffer into a cold long-term store. The method uses a free-energy objective to balance utility against embedding entropy, together with a temperature-controlled stochastic replacement rule. In the controlled Infinite Room environment under a fixed memory budget, Entropic Memory matches greedy importance sampling at 30\% noise (survival rate, SR $\approx 0.29$) and improves SR from $0.24$ to $0.28$ at 50\% noise (+15\% relative). Overall, these results indicate that entropy-aware consolidation improves robustness to distractors in this controlled continual-memory setting.
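The consolidation rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the free-energy form $F = -\text{utility} + T \cdot H(\text{embedding})$, the Metropolis-style acceptance test, and the component-wise entropy estimator are all assumptions made for the sake of the example.

```python
import math
import random

def free_energy(utility, embedding, temperature):
    """Assumed free-energy score F = -utility + T * H(embedding).

    H is the Shannon entropy of the normalized absolute embedding
    components -- an illustrative stand-in, since the abstract does
    not specify the paper's entropy estimator.
    """
    total = sum(abs(x) for x in embedding) or 1.0
    probs = [abs(x) / total for x in embedding]
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    return -utility + temperature * entropy

def maybe_replace(old_item, new_item, temperature, rng=random.random):
    """Temperature-controlled stochastic replacement (Metropolis-style).

    A candidate that lowers free energy is always moved into the cold
    store; otherwise it is accepted with probability exp(-dF / T), so
    higher temperatures admit more exploratory swaps.
    """
    f_old = free_energy(old_item["utility"], old_item["emb"], temperature)
    f_new = free_energy(new_item["utility"], new_item["emb"], temperature)
    delta = f_new - f_old
    if delta <= 0 or rng() < math.exp(-delta / temperature):
        return new_item
    return old_item
```

Under this form, a high-utility, low-entropy (sharply focused) memory has low free energy and displaces noisy entries deterministically, while the temperature controls how often a worse candidate is still admitted.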