

Poster

Neuron-based Multifractal Analysis of Neuron Interaction Dynamics in Large Models

Xiongye Xiao · Heng Ping · Chenyu Zhou · Defu Cao · Yaxing Li · Yi-Zhuo Zhou · Shixuan Li · Nikos Kanakaris · Paul Bogdan

Hall 3 + Hall 2B #284
[ Project Page ]
Fri 25 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract:

In recent years, increasing attention has been paid to the capabilities of large-scale models, particularly their ability to handle complex tasks that small-scale models cannot perform. Notably, large language models (LLMs) have demonstrated "intelligent" abilities such as complex reasoning and abstract language comprehension, reflecting cognitive-like behaviors. However, current research on emergent abilities in large models predominantly focuses on the relationship between model performance and size, leaving a significant gap in the systematic quantitative analysis of the internal structures and mechanisms driving these emergent abilities. Drawing inspiration from neuroscience research on brain network structure and self-organization, we propose (i) a general network representation of large models, (ii) a new analytical framework, Neuron-based Multifractal Analysis (NeuroMFA), for structural analysis, and (iii) a novel structure-based metric as a proxy for the emergent abilities of large models. By linking structural features to the capabilities of large models, NeuroMFA provides a quantitative framework for analyzing emergent phenomena in large models. Our experiments show that the proposed method yields a comprehensive measure of the network's evolving heterogeneity and organization, offering theoretical foundations and a new perspective for investigating emergence in large models.
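To make the notion of a multifractal spectrum concrete for readers unfamiliar with it, the sketch below estimates generalized dimensions D_q via box-counting partition functions on a textbook binomial cascade measure. This is a minimal generic illustration of multifractal analysis, not the paper's NeuroMFA; all function names and parameters here are illustrative assumptions.

```python
# Generic multifractal analysis sketch (NOT the paper's NeuroMFA).
# D_q varying with q (rather than being constant) signals heterogeneity
# in how the measure is organized across scales.
import math

def binomial_cascade(p=0.7, levels=12):
    """A measure on 2**levels cells built by a multiplicative cascade."""
    mu = [1.0]
    for _ in range(levels):
        mu = [m * w for m in mu for w in (p, 1.0 - p)]
    return mu

def partition_log(mu, box, q):
    """log2 of the partition function sum_i (box mass)^q at box size `box`."""
    z = 0.0
    for i in range(0, len(mu), box):
        mass = sum(mu[i:i + box])
        if mass > 0:
            z += mass ** q
    return math.log2(z)

def generalized_dimension(mu, q, sizes=(1, 2, 4, 8, 16, 32)):
    """Least-squares slope of log Z vs log eps gives tau(q); D_q = tau/(q-1)."""
    xs = [math.log2(b / len(mu)) for b in sizes]   # log2 of box size eps
    ys = [partition_log(mu, b, q) for b in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    tau = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
          sum((x - mx) ** 2 for x in xs)
    return tau / (q - 1)

mu = binomial_cascade()
print(round(generalized_dimension(mu, q=0.0), 3))  # -> 1.0 (support fills the line)
print(round(generalized_dimension(mu, q=2.0), 3))  # < 1: the measure is multifractal
```

NeuroMFA instead performs node-based coverings on the network representation of a model's neurons, but the underlying logic (partition functions over scales, a slope tau(q), and a q-dependent dimension) is the same family of computation.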
