Keywords: [ mixture of experts ]
Quantitative structure-activity relationship (QSAR) models have been used for decades to predict the activity of small molecules from encodings of their molecular structure, among which simple 2D descriptors of the molecular graph remain the most common. One recurrent problem with QSAR models is that relationships observed for a specific scaffold (pruned molecular skeleton) do not always transfer to another, owing to the 3D flexibility of molecules. The same holds when building multitask networks that predict activity against several proteins at once: single-protein models sometimes work better, and adding dissimilar proteins to the model degrades performance. In this paper, mixtures of experts (MoE) are used to combine a global network with experts specialized on local structures of the dataset (e.g. a molecular scaffold or a single protein), in either a single-task or multitask framework. We show that structuring the learning process with protein or chemical series information can enhance model performance and provide a built-in model introspection tool.
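The gating mechanism described above can be sketched as follows. This is a minimal, illustrative implementation, assuming a gated combination of a global expert with local experts (here two hypothetical scaffold-local experts); the expert functions and gate weights are placeholders, not the paper's trained parameters.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of gate scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_predict(x, experts, gate_weights):
    # Each expert maps a descriptor vector to a scalar activity prediction.
    preds = [expert(x) for expert in experts]
    # The gate scores each expert on the same input, then normalizes;
    # the resulting weights show which expert "fires" for a molecule,
    # which is the built-in introspection tool mentioned in the abstract.
    scores = [sum(wi * xi for wi, xi in zip(w, x)) for w in gate_weights]
    gates = softmax(scores)
    # Final prediction: gate-weighted combination of expert outputs.
    return sum(g * p for g, p in zip(gates, preds)), gates

# Hypothetical setup: one global expert plus two scaffold-local experts.
global_expert = lambda x: 0.5 * sum(x)
scaffold_a = lambda x: x[0] - 0.2 * x[1]
scaffold_b = lambda x: 0.3 * x[1]

gate_w = [[0.1, 0.1], [1.0, -1.0], [-1.0, 1.0]]  # one score vector per expert
pred, gates = moe_predict([0.8, 0.3], [global_expert, scaffold_a, scaffold_b], gate_w)
```

Because the gate weights are a softmax, the final prediction is always a convex combination of the expert outputs, and inspecting `gates` per molecule reveals which local structure dominated the decision.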