Poster in Workshop: Modular, Collaborative and Decentralized Deep Learning
Rethinking Decentralized Learning: Towards More Realistic Evaluations with a Metadata-Agnostic Approach
Tianyu Zhang · Lu Li · Tongtian Zhu · Suyuchen Wang · Can Wang · Yong Chen
Decentralized learning is widely regarded as a privacy-preserving training paradigm that enables distributed model training without exposing raw data. However, many experimental settings in decentralized learning research assume metadata awareness among participants, which contradicts real-world constraints in which participants lack shared metadata knowledge. We distinguish between Metadata-Dependent Supervised Learning (MDSL), which assumes global metadata synchronization, and Metadata-Agnostic Zero-Shot Learning (MAZEL), in which participants do not share metadata. Our contributions are: (1) we highlight the difference between MAZEL and MDSL; (2) we present empirical evidence that long-held claims about MDSL-based decentralized learning may not hold under MAZEL settings; (3) we provide benchmarks spanning 8–16 diverse datasets to rigorously evaluate newly proposed decentralized methods under realistic metadata-agnostic conditions; and (4) we propose two-stage and cosine gossip schedulers to improve communication efficiency (see the sketch below). Our code is available at: https://anonymous.4open.science/r/More-Realistic-Evaluations.
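To make the scheduler idea concrete, below is a minimal Python sketch of what two-stage and cosine gossip schedulers might look like. The function names, the parameterization via a gossip interval (local steps between parameter-averaging rounds), and the schedule direction (frequent gossip early, sparse gossip later) are illustrative assumptions, not the authors' exact formulation.

```python
import math

def two_stage_gossip_interval(step: int, total_steps: int,
                              switch_frac: float = 0.5,
                              dense_interval: int = 1,
                              sparse_interval: int = 8) -> int:
    """Hypothetical two-stage schedule: gossip every `dense_interval`
    local steps during the first stage, then every `sparse_interval`
    steps for the remainder of training."""
    if step < switch_frac * total_steps:
        return dense_interval
    return sparse_interval

def cosine_gossip_interval(step: int, total_steps: int,
                           min_interval: int = 1,
                           max_interval: int = 16) -> int:
    """Hypothetical cosine schedule: the gossip interval grows smoothly
    from `min_interval` (frequent communication early in training) to
    `max_interval` (sparse communication late), following a cosine curve."""
    # Cosine factor decays from 1.0 at step 0 to 0.0 at total_steps.
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * step / total_steps))
    interval = max_interval - (max_interval - min_interval) * cos_factor
    return max(min_interval, round(interval))
```

In a decentralized training loop, a node would gossip (average parameters with its peers) only when `step % interval == 0` for the current interval, so both schedules trade synchronization frequency against communication cost over the course of training.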