Researchers at EPFL and KAIST have developed a new AI model that significantly improves the understanding of metal-organic frameworks, promising materials for hydrogen storage and other applications.
2023-03-14 École Polytechnique Fédérale de Lausanne (EPFL)
They created a transformer called MOFTransformer, which is something like a ChatGPT for researchers studying MOFs. Because the model is pre-trained on a large amount of MOF data (one million hypothetical structures), it can deliver results with far less task-specific data than conventional machine-learning methods. The researchers hope that MOFTransformer will pave the way for the development of new materials with new and improved MOF properties.
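As a rough illustration of why pre-training reduces the need for labeled data, the PyTorch sketch below freezes a pretrained backbone and trains only a small prediction head on a modest dataset. This is one common transfer-learning recipe, not the authors' code; every name, shape, and file path here is an invented stand-in.

```python
# Hedged transfer-learning sketch (PyTorch). The backbone stands in for a
# large pre-trained model; only the small head is trained on the target task,
# which is why a few thousand labeled examples can suffice. Illustrative only.
import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 256))
# In practice the backbone's weights would come from large-scale pre-training:
# backbone.load_state_dict(torch.load("pretrained.pt"))  # hypothetical path
for p in backbone.parameters():
    p.requires_grad = False            # keep pre-trained knowledge fixed

head = nn.Linear(256, 1)               # task-specific head, trained from scratch
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Small labeled set (random stand-in for a few thousand fine-tuning examples).
x, y = torch.randn(5000, 128), torch.randn(5000)
for epoch in range(3):
    pred = head(backbone(x)).squeeze(-1)
    loss = loss_fn(pred, y)
    opt.zero_grad(); loss.backward(); opt.step()
print(f"final loss: {loss.item():.3f}")
```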
<Related information>
- https://actu.epfl.ch/news/new-ai-model-transforms-research-on-metal-organic-/
- https://www.nature.com/articles/s42256-023-00628-2
A multi-modal pre-training transformer for universal transfer learning in metal–organic frameworks
Yeonghun Kang, Hyunsoo Park, Berend Smit & Jihan Kim
Nature Machine Intelligence, Published: 13 March 2023
DOI: https://doi.org/10.1038/s42256-023-00628-2
Abstract
Metal–organic frameworks (MOFs) are a class of crystalline porous materials that exhibit a vast chemical space owing to their tunable molecular building blocks with diverse topologies. An unlimited number of MOFs can, in principle, be synthesized. Machine learning approaches can help to explore this vast chemical space by identifying optimal candidates with desired properties from structure–property relationships. Here we introduce MOFTransformer, a multi-modal Transformer encoder pre-trained with 1 million hypothetical MOFs. This multi-modal model utilizes integrated atom-based graph and energy-grid embeddings to capture both local and global features of MOFs, respectively. By fine-tuning the pre-trained model with small datasets ranging from 5,000 to 20,000 MOFs, our model achieves state-of-the-art results for predicting across various properties including gas adsorption, diffusion, electronic properties, and even text-mined data. Beyond its universal transfer learning capabilities, MOFTransformer generates chemical insights by analyzing feature importance through attention scores within the self-attention layers. As such, this model can serve as a platform for other MOF researchers that seek to develop new machine learning models for their work.
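To make the abstract's architecture concrete, here is a minimal, hedged PyTorch sketch of a multi-modal Transformer encoder that fuses per-atom tokens (local features, standing in for the paper's atom-based graph embeddings) and energy-grid patch tokens (global features) in a single self-attention stack. Every class name, dimension, and tokenization choice is an assumption made for illustration; this is not the authors' released implementation.

```python
# Minimal multi-modal Transformer sketch in the spirit of MOFTransformer.
# All names and sizes are illustrative assumptions, not the published model.
import torch
import torch.nn as nn

class MultiModalMOFEncoder(nn.Module):
    """Toy fusion of local atom tokens and global energy-grid patch tokens."""

    def __init__(self, d_model=256, n_heads=8, n_layers=4,
                 n_atom_types=100, patch_dim=512):
        super().__init__()
        self.atom_embed = nn.Embedding(n_atom_types, d_model)  # local features
        self.grid_proj = nn.Linear(patch_dim, d_model)         # global features
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.type_embed = nn.Embedding(2, d_model)             # 0 = atom, 1 = grid
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)   # swapped out per downstream property

    def forward(self, atom_types, grid_patches):
        # atom_types: (B, n_atoms) int64; grid_patches: (B, n_patches, patch_dim)
        B = atom_types.size(0)
        a = self.atom_embed(atom_types) + self.type_embed(torch.zeros_like(atom_types))
        ones = torch.ones(B, grid_patches.size(1), dtype=torch.long,
                          device=grid_patches.device)
        g = self.grid_proj(grid_patches) + self.type_embed(ones)
        tokens = torch.cat([self.cls_token.expand(B, -1, -1), a, g], dim=1)
        h = self.encoder(tokens)          # self-attention mixes both modalities
        return self.head(h[:, 0]).squeeze(-1)  # predict from the [CLS] token

# Smoke test with random stand-in inputs.
model = MultiModalMOFEncoder()
atoms = torch.randint(0, 100, (2, 50))   # 2 MOFs, 50 atom tokens each
grids = torch.randn(2, 36, 512)          # 36 energy-grid patches each
print(model(atoms, grids).shape)         # torch.Size([2])
```

Because both modalities share one attention stack in this sketch, the attention weights between atom and grid tokens inside `self.encoder` would play the role the abstract assigns to attention-score analysis for extracting chemical insights.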