

Poster

MMDT: Decoding the Trustworthiness and Safety of Multimodal Foundation Models

Chejian Xu · Jiawei Zhang · Zhaorun Chen · Chulin Xie · Mintong Kang · Yujin Potter · Zhun Wang · Zhuowen Yuan · Alexander Xiong · Zidi Xiong · Chenhui Zhang · Lingzhi Yuan · Yi Zeng · Peiyang Xu · Chengquan Guo · Andy Zhou · Jeffrey Tan · Xuandong Zhao · Francesco Pinto · Zhen Xiang · Yu Gai · Zinan Lin · Dan Hendrycks · Bo Li · Dawn Song

Hall 3 + Hall 2B #288
Fri 25 Apr 7 p.m. PDT — 9:30 p.m. PDT

Abstract:

Multimodal foundation models (MMFMs) play a crucial role in various applications, including autonomous driving, healthcare, and virtual assistants. However, several studies have revealed vulnerabilities in these models, such as text-to-image models generating unsafe content. Existing benchmarks for multimodal models either predominantly assess the helpfulness of these models or focus only on limited perspectives such as fairness and privacy. In this paper, we present MMDT (Multimodal DecodingTrust), the first unified platform designed to provide a comprehensive safety and trustworthiness evaluation for MMFMs. Our platform assesses models from multiple perspectives, including safety, hallucination, fairness/bias, privacy, adversarial robustness, and out-of-distribution (OOD) generalization. For each perspective, we have designed various evaluation scenarios and red-teaming algorithms under different tasks to generate challenging data, forming a high-quality benchmark. We evaluate a range of multimodal models using MMDT, and our findings reveal a series of vulnerabilities and areas for improvement across these perspectives. This work introduces the first comprehensive safety and trustworthiness evaluation platform for MMFMs, paving the way for developing safer and more reliable MMFMs and systems. Our platform and benchmark are available at https://mmdecodingtrust.github.io/.
