2024-01-18 Duke University (Duke)
<Related information>
- https://pratt.duke.edu/news/machine-learning-models-teach-each-other/
- https://www.sciencedirect.com/science/article/pii/S2667318523000338
Yoked learning in molecular data science
Zhixiong Li, Yan Xiang, Yujing Wen, Daniel Reker
Artificial Intelligence in the Life Sciences
DOI: https://doi.org/10.1016/j.ailsci.2023.100089
Abstract
Active machine learning is an established and increasingly popular experimental design technique in which the machine learning model can request additional data to improve its predictive performance. It is generally assumed that this data is optimal only for the requesting model, since the selection relies on that model’s predictions or architecture, and therefore cannot be transferred to other models. Inspired by research in pedagogy, we here introduce the concept of yoked machine learning, where a second machine learning model learns from the data selected by another model. We found that in 48% of the benchmarked combinations, yoked learning performed similarly to or better than active learning. We analyze distinct cases in which yoked learning can improve active learning performance. In particular, we prototype yoked deep learning (YoDeL), where a classic machine learning model provides data to a deep neural network, thereby mitigating challenges of active deep learning such as slow refitting times per learning iteration and poor performance on small datasets. In summary, we expect the new concept of yoked (deep) learning to provide a competitive option to boost the performance of active learning and to benefit from the distinct capabilities of multiple machine learning models during data acquisition, training, and deployment.