

Virtual presentation / poster accept

Multi-level Protein Structure Pre-training via Prompt Learning

Zeyuan Wang · Qiang Zhang · Shuang-Wei HU · Haoran Yu · Xurui Jin · Zhichen Gong · Huajun Chen

Keywords: [ multi-task learning ] [ prompt learning ] [ multi-level structure ] [ protein representation learning ] [ Machine Learning for Sciences ]


Abstract:

A protein relies on different levels of structure to carry out its functions. Each level has its own merits in describing specific characteristics, and no level can substitute for another. Most existing function prediction methods take only the tertiary structure as input, unintentionally ignoring the other levels of protein structure. Since a protein's sequence determines its multi-level structures, in this paper we aim to realize the full potential of protein sequences for function prediction. Specifically, we propose a new prompt-guided multi-task pre-training and fine-tuning framework; the resulting protein model is called PromptProtein. Through prompt-guided multi-task pre-training, we learn multiple prompt signals that steer the model to focus on different structure levels. We also design a prompt fine-tuning module that gives downstream tasks on-demand flexibility in utilizing the respective levels of structure information. Extensive experiments on function prediction and protein engineering show that PromptProtein outperforms state-of-the-art methods by large margins.
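The core idea of steering one shared model with per-task prompt signals can be sketched as follows. This is a minimal illustrative sketch only, not the paper's implementation: the structure-level names, dimensions, and the `encode_with_prompt` helper are all assumptions, and a real model would pass the concatenated tokens through a learned encoder.

```python
import numpy as np

# Hedged sketch of prompt-conditioned encoding in the spirit of PromptProtein.
# All names and sizes below are illustrative assumptions, not the paper's API.

rng = np.random.default_rng(0)
d_model = 8

# One learnable prompt vector per pre-training task,
# here one per structure level (assumed naming).
prompts = {
    "primary":   rng.normal(size=(1, d_model)),
    "secondary": rng.normal(size=(1, d_model)),
    "tertiary":  rng.normal(size=(1, d_model)),
}

def encode_with_prompt(residue_emb: np.ndarray, task: str) -> np.ndarray:
    """Prepend the task's prompt token to the residue embeddings,
    steering a (hypothetical) shared encoder toward that structure level."""
    return np.concatenate([prompts[task], residue_emb], axis=0)

seq = rng.normal(size=(5, d_model))      # embeddings for 5 residues
out = encode_with_prompt(seq, "tertiary")
print(out.shape)                          # (6, 8): 1 prompt token + 5 residues
```

At fine-tuning time, a downstream task could select or combine these prompt vectors to draw on whichever structure levels it needs, which is the on-demand flexibility the abstract describes.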
