Poster
in
Workshop: Navigating and Addressing Data Problems for Foundation Models (DPFM)

CollabEdit: Towards Non-destructive Collaborative Knowledge Editing

Jiamu Zheng · Jinghuai Zhang · Futing Wang · Tianyu Du · Tao Lin

Keywords: [ Collaborative Learning ] [ Knowledge Editing ]


Abstract:

Recently, collaborative fine-tuning of large language models (LLMs) has emerged as a new paradigm for utilizing private data from different parties in a manner that guarantees both efficiency and privacy. Meanwhile, the practical need for the "right to be forgotten" and frequent demands to update outdated information have led to a surge in knowledge editing (KE) techniques for LLMs. However, current KE methods are all designed for a single model, and directly adapting them to collaborative learning scenarios leads to severe performance degradation. In this study, we propose COLLABEDIT, a non-destructive collaborative knowledge editing framework that utilizes a novel model fusion strategy to preserve overall editing performance. Empirical studies on two canonical datasets demonstrate the effectiveness and superiority of our method compared with destructive baselines.
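The performance drop from naively merging per-party edits can be illustrated with a toy linear model. The sketch below is not the paper's COLLABEDIT algorithm; it uses a generic least-squares weight update (in the linear associative-memory style common in KE work such as ROME/MEMIT) and compares naively summing two parties' weight deltas against a single joint edit over the union of their requests. All names, dimensions, and the regularization constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                                # hidden dimension (illustrative)
W = rng.normal(size=(d, d))          # shared "pretrained" weight matrix

def edit_delta(W, K, V, lam=1e-2):
    """Closed-form ridge update dW so that (W + dW) @ K approximates V.

    K: (d, n) key vectors; V: (d, n) target value vectors.
    This mirrors the linear associative-memory view used by many KE methods.
    """
    residual = V - W @ K
    return residual @ K.T @ np.linalg.inv(K @ K.T + lam * np.eye(d))

# Two parties each want to edit different "facts" (key/value column pairs).
K1, V1 = rng.normal(size=(d, 2)), rng.normal(size=(d, 2))
K2, V2 = rng.normal(size=(d, 2)), rng.normal(size=(d, 2))

dW1 = edit_delta(W, K1, V1)          # party 1's local update
dW2 = edit_delta(W, K2, V2)          # party 2's local update

# Destructive merging: simply sum the independently computed updates.
# Each party's delta perturbs the other party's edited keys.
W_naive = W + dW1 + dW2

# Fusion reference point: one joint edit over all requests at once.
K_all = np.hstack([K1, K2])
V_all = np.hstack([V1, V2])
W_joint = W + edit_delta(W, K_all, V_all)

err_naive = np.linalg.norm(W_naive @ K_all - V_all)
err_joint = np.linalg.norm(W_joint @ K_all - V_all)
print(f"naive fusion error: {err_naive:.3f}, joint edit error: {err_joint:.3f}")
```

On random data the joint edit fits all requests almost exactly, while summing the per-party deltas leaves a visibly larger residual; non-destructive fusion strategies aim to recover the joint-edit quality without pooling the parties' private edit requests.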
