Abstract
In the big data era, the use of large datasets in conjunction with machine learning (ML) has become increasingly popular in both industry and academia. The field of materials science is likewise undergoing a big data revolution, with large databases and repositories emerging across the discipline. Traditionally, materials science has been a trial-and-error field, in both computation and experiment. With the advent of ML-based techniques there has been a paradigm shift: materials can now be screened rapidly with ML models and even generated from materials with similar properties, and ML has quietly permeated many sub-disciplines of materials science. Nevertheless, ML remains relatively new to the field and is expanding rapidly. There is a plethora of readily available big-data architectures and an abundance of ML models and software, and integrating all of these elements into a comprehensive research workflow has become an important direction of materials science research. In this review, we aim to provide an introduction to and reference on ML for materials scientists, covering the commonly used methods and applications as broadly as possible and discussing future possibilities.
References and notes
E. Weinan, Machine learning and computational mathematics, Commun. Comput. Phys. 28, 1639 (2020)
A. Agrawal and A. Choudhary, Perspective: Materials informatics and big data: Realization of the “fourth paradigm” of science in materials science, APL Mater. 4(5), 053208 (2016)
Y. Xu, X. Liu, X. Cao, C. Huang, E. Liu, et al., Artificial intelligence: A powerful paradigm for scientific research, The Innovation 2(4), 100179 (2021)
G. Carleo, I. Cirac, K. Cranmer, L. Daudet, M. Schuld, N. Tishby, L. Vogt-Maranto, and L. Zdeborová, Machine learning and the physical sciences, Rev. Mod. Phys. 91, 045002 (2019)
G. R. Schleder, A. C. M. Padilha, C. M. Acosta, M. Costa, and A. Fazzio, From DFT to machine learning: Recent approaches to materials science–A review, J. Phys.: Mater. 2(3), 032001 (2019)
R. Potyrailo, K. Rajan, K. Stoewe, I. Takeuchi, B. Chisholm, and H. Lam, Combinatorial and high-throughput screening of materials libraries: Review of state of the art, ACS Combin. Sci. 13(6), 579 (2011)
K. Alberi, M. B. Nardelli, A. Zakutayev, L. Mitas, S. Curtarolo, et al., The 2019 materials by design roadmap, J. Phys. D 52(1), 013001 (2018)
S. Torquato, Optimal design of heterogeneous materials, Ann. Rev. Mater. Res. 40, 101 (2010)
A. A. White, Big data are shaping the future of materials science, MRS Bull. 38(8), 594 (2013)
Z. Fan, H. Q. Wang, and J. C. Zheng, Searching for the best thermoelectrics through the optimization of transport distribution function, J. Appl. Phys. 109(7), 073713 (2011)
J. C. Zheng and Y. Zhu, Searching for a higher superconducting transition temperature in strained MgB2, Phys. Rev. B 73, 024509 (2006)
J. C. Zheng, Asymmetrical transport distribution function: Skewness as a key to enhance thermoelectric performance, Research 2022, 9867639 (2022)
J. C. Zheng, Recent advances on thermoelectric materials, Front. Phys. China 3(3), 269 (2008)
J. C. Zheng, A. I. Frenkel, L. Wu, J. Hanson, W. Ku, E. S. Bozin, S. J. L. Billinge, and Y. Zhu, Nanoscale disorder and local electronic properties of CaCu3Ti4O12: An integrated study of electron, neutron, and X-ray diffraction, X-ray absorption fine structure, and first-principles calculations, Phys. Rev. B 81(14), 144203 (2010)
N. Sa, S. S. Chong, H. Q. Wang, and J. C. Zheng, Anisotropy engineering of ZnO nanoporous frameworks: A lattice dynamics simulation, Nanomaterials (Basel) 12(18), 3239 (2022)
H. Cheng and J. C. Zheng, Ab initio study of anisotropic mechanical and electronic properties of strained carbon-nitride nanosheet with interlayer bonding, Front. Phys. 16(4), 43505 (2021)
Y. Huang, C. Y. Haw, Z. Zheng, J. Kang, J. C. Zheng, and H. Q. Wang, Biosynthesis of zinc oxide nanomaterials from plant extracts and future green prospects: A topical review, Adv. Sustain. Syst. 5(6), 2000266 (2021)
Z. Q. Wang, H. Cheng, T. Y. Lü, H. Q. Wang, Y. P. Feng, and J. C. Zheng, A super-stretchable boron nanoribbon network, Phys. Chem. Chem. Phys. 20(24), 16510 (2018)
Y. Li, H. Q. Wang, T. J. Chu, Y. C. Li, X. Li, X. Liao, X. Wang, H. Zhou, J. Kang, K. C. Chang, T. C. Chang, T. M. Tsai, and J. C. Zheng, Tuning the nanostructures and optical properties of undoped and N-doped ZnO by supercritical fluid treatment, AIP Adv. 8(5), 055310 (2018)
Y. L. Li, Z. Fan, and J. C. Zheng, Enhanced thermoelectric performance in graphitic ZnO (0001) nanofilms, J. Appl. Phys. 113(8), 083705 (2013)
J. He, I. D. Blum, H. Q. Wang, S. N. Girard, J. Doak, L. D. Zhao, J. C. Zheng, G. Casillas, C. Wolverton, M. Jose-Yacaman, D. N. Seidman, M. G. Kanatzidis, and V. P. Dravid, Morphology control of nanostructures: Na-doped PbTe-PbS system, Nano Lett. 12(11), 5979 (2012)
Z. Fan, J. Zheng, H. Q. Wang, and J. C. Zheng, Enhanced thermoelectric performance in three-dimensional superlattice of topological insulator thin films, Nanoscale Res. Lett. 7(1), 570 (2012)
N. Wei, H. Q. Wang, and J. C. Zheng, Nanoparticle manipulation by thermal gradient, Nanoscale Res. Lett. 7(1), 154 (2012)
N. Wei, Z. Fan, L. Q. Xu, Y. P. Zheng, H. Q. Wang, and J. C. Zheng, Knitted graphene-nanoribbon sheet: A mechanically robust structure, Nanoscale 4(3), 785 (2012)
J. Q. He, J. R. Sootsman, L. Q. Xu, S. N. Girard, J. C. Zheng, M. G. Kanatzidis, and V. P. Dravid, Anomalous electronic transport in dual-nanostructured lead telluride, J. Am. Chem. Soc. 133(23), 8786 (2011)
N. Wei, L. Xu, H. Q. Wang, and J. C. Zheng, Strain engineering of thermal conductivity in graphene sheets and nanoribbons: A demonstration of magic flexibility, Nanotechnology 22(10), 105705 (2011)
J. He, J. R. Sootsman, S. N. Girard, J. C. Zheng, J. Wen, Y. Zhu, M. G. Kanatzidis, and V. P. Dravid, On the origin of increased phonon scattering in nanostructured PbTe-based thermoelectric materials, J. Am. Chem. Soc. 132(25), 8669 (2010)
Y. Zhu, J. C. Zheng, L. Wu, A. I. Frenkel, J. Hanson, P. Northrup, and W. Ku, Nanoscale disorder in CaCu3Ti4O12: A new route to the enhanced dielectric response, Phys. Rev. Lett. 99(3), 037602 (2007)
J. C. Zheng, H. Q. Wang, A. T. S. Wee, and C. H. A. Huan, Structural and electronic properties of Al nanowires: An ab initio pseudopotential study, Int. J. Nanosci. 01(02), 159 (2002)
J. C. Zheng, H. Q. Wang, A. T. S. Wee, and C. H. A. Huan, Possible complete miscibility of (BN)x(C2)1−x alloys, Phys. Rev. B 66(9), 092104 (2002)
J. C. Zheng, H. Q. Wang, C. H. A. Huan, and A. T. S. Wee, The structural and electronic properties of (AlN)x(C2)1−x and (AlN)xBN)1−x alloys, J. Phys.: Condens. Matter 13(22), 5295 (2001)
H. Q. Wang, J. C. Zheng, R. Z. Wang, Y. M. Zheng, and S. H. Cai, Valence-band offsets of III-V alloy heterojunctions, Surf. Interface Anal. 28(1), 177 (1999)
J. C. Zheng, R. Z. Wang, Y. M. Zheng, and S. H. Cai, Valence offsets of three series of alloy heterojunctions, Chin. Phys. Lett. 14(10), 775 (1997)
J. C. Zheng, Y. Zheng, and R. Wang, Valence offsets of ternary alloy heterojunctions InxGa1−xAs/InxAl1−xAs, Chin. Sci. Bull. 41(24), 2050 (1996)
L. Liu, T. Wang, L. Sun, T. Song, H. Yan, C. Li, D. Mu, J. Zheng, and Y. Dai, Stable cycling of all-solidstate lithium metal batteries enabled by salt engineering of PEO-based polymer electrolytes, Energy Environ. Mater. (Feb.), e12580 (2023)
W. Zhang, F. Y. Du, Y. Dai, and J. C. Zheng, Strain engineering of Li+ ion migration in olivine phosphate cathode materials LiMPO4 (M = Mn, Fe, Co) and (LiFePO4)n(LiMnPO4)m superlattices, Phys. Chem. Chem. Phys. 25(8), 6142 (2023)
B. Zhang, L. Wu, J. Zheng, P. Yang, X. Yu, J. Ding, S. M. Heald, R. A. Rosenberg, T. V. Venkatesan, J. Chen, C. J. Sun, Y. Zhu, and G. M. Chow, Control of magnetic anisotropy by orbital hybridization with charge transfer in (La0.67Sr0.33MnO3)n/(SrTiO3)n superlattice, NPG Asia Mater. 10(9), 931 (2018)
L. Zhang, T. Y. Lü, H. Q. Wang, W. X. Zhang, S. W. Yang, and J. C. Zheng, First principles studies on the thermoelectric properties of (SrO)m(SrTiO3)n superlattice, RSC Adv. 6(104), 102172 (2016)
J. C. Zheng, C. H. A. Huan, A. T. S. Wee, M. A. V. Hove, C. S. Fadley, F. J. Shi, E. Rotenberg, S. R. Barman, J. J. Paggel, K. Horn, P. Ebert, and K. Urban, Atomic scale structure of the 5-fold surface of a AlPdMn quasicrystal: A quantitative X-ray photoelectron diffraction analysis, Phys. Rev. B 69(13), 134107 (2004)
H. Q. Wang, J. Xu, X. Lin, Y. Li, J. Kang, and J. C. Zheng, Determination of the embedded electronic states at nanoscale interface via surface-sensitive photoemission spectroscopy, Light Sci. Appl. 10(1), 153 (2021)
M. A. Van Hove, K. Hermann, and P. R. Watson, The NIST surface structure database–SSD version 4, Acta Crystallogr. B 58(3), 338 (2002)
H. Q. Wang, E. Altman, C. Broadbridge, Y. Zhu, and V. Henrich, Determination of electronic structure of oxide-oxide interfaces by photoemission spectroscopy, Adv. Mater. 22, 2950 (2010)
H. Zhou, L. Wu, H. Q. Wang, J. C. Zheng, L. Zhang, K. Kisslinger, Y. Li, Z. Wang, H. Cheng, S. Ke, Y. Li, J. Kang, and Y. Zhu, Interfaces between hexagonal and cubic oxides and their structure alternatives, Nat. Commun. 8(1), 1474 (2017)
J. D. Steiner, H. Cheng, J. Walsh, Y. Zhang, B. Zydlewski, L. Mu, Z. Xu, M. M. Rahman, H. Sun, F. M. Michel, C. J. Sun, D. Nordlund, W. Luo, J. C. Zheng, H. L. Xin, and F. Lin, Targeted surface doping with reversible local environment improves oxygen stability at the electrochemical interfaces of nickel-rich cathode materials, ACS Appl. Mater. Interfaces 11(41), 37885 (2019)
J. C. Zheng, H. Q. Wang, A. T. S. Wee, and C. H. A. Huan, Trends in bonding configuration at SiC/III–V semiconductor interfaces, Appl. Phys. Lett. 79(11), 1643 (2001)
H. Q. Wang, J. C. Zheng, A. T. S. Wee, and C. H. A. Huan, Study of electronic properties and bonding configuration at the BN/SiC interface, J. Electron Spectrosc. Relat. Phenom. 114–116, 483 (2001)
S. Lin, B. Zhang, T. Y. Lü, J. C. Zheng, H. Pan, H. Chen, C. Lin, X. Li, and J. Zhou, Inorganic lead-free B-Y-CsSnI 3 perovskite solar cells using diverse electron-transporting materials: A simulation study, ACS Omega 6(40), 26689 (2021)
F. Y. Du, W. Zhang, H. Q. Wang, and J. C. Zheng, Enhancement of thermal rectification by asymmetry engineering of thermal conductivity and geometric structure for the multi-segment thermal rectifier, Chin. Phys. B 32(6), 064402 (2023)
M. Kulichenko, J. S. Smith, B. Nebgen, Y. W. Li, N. Fedik, A. I. Boldyrev, N. Lubbers, K. Barros, and S. Tretiak, The rise of neural networks for materials and chemical dynamics, J. Phys. Chem. Lett. 12(26), 6227 (2021)
W. Sha, Y. Guo, Q. Yuan, S. Tang, X. Zhang, S. Lu, X. Guo, Y. C. Cao, and S. Cheng, Artificial intelligence to power the future of materials science and engineering, Adv. Intell. Syst. 2(4), 1900143 (2020)
S. Leonelli, Scientific research and big data, in: The Stanford Encyclopedia of Philosophy, Summer 2020 Ed., edited by E. N. Zalta, Metaphysics Research Lab, Stanford University, 2020
J. Westermayr, M. Gastegger, K. T. Schutt, and R. J. Maurer, Perspective on integrating machine learning into computational chemistry and materials science, J. Chem. Phys. 154(23), 230903 (2021)
D. Morgan and R. Jacobs, Opportunities and challenges for machine learning in materials science, Annu. Rev. Mater. Res. 50(1), 71 (2020)
C. Chen, Y. Zuo, W. Ye, X. Li, Z. Deng, and S. P. Ong, A critical review of machine learning of energy materials, Adv. Energy Mater. 10(8), 1903242 (2020)
J. Wei, X. Chu, X. Y. Sun, K. Xu, H. X. Deng, J. Chen, Z. Wei, and M. Lei, Machine learning in materials science, InfoMat 1(3), 338 (2019)
G. Pilania, Machine learning in materials science: From explainable predictions to autonomous design, Comput. Mater. Sci. 193, 110360 (2021)
K. T. Butler, D. W. Davies, H. Cartwright, O. Isayev, and A. Walsh, Machine learning for molecular and materials science, Nature 559(7715), 547 (2018)
F. Oviedo, J. L. Ferres, T. Buonassisi, and K. T. Butler, Interpretable and explainable machine learning for materials science and chemistry, Acc. Mater. Res. 3(6), 597 (2022)
J. F. Rodrigues Jr, M. C. F. Florea, D. de Oliveira, D. Diamond, and O. N. Oliveira Jr, Big data and machine learning for materials science, Discover Materials 1(1), 12 (2021)
K. Choudhary, B. DeCost, C. Chen, A. Jain, F. Tavazza, R. Cohn, C. W. Park, A. Choudhary, A. Agrawal, S. J. L. Billinge, E. Holm, S. P. Ong, and C. Wolverton, Recent advances and applications of deep learning methods in materials science, npj Comput. Mater. 8, 59 (2022)
L. Samuel, Some studies in machine learning using the game of checkers, IBM J. Res. Develop. 3(3), 210 (1959)
L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and Regression Trees, 1983
L. G. Valiant, A theory of the learnable, in: STOC’ 84 Proceedings of the Sixteenth Annual ACM Symposium on Theory of Computing, pp 436–445, 1984
T. Mitchell, Machine Learning, New York, USA: McGrawHill, 1997
S. Roweis and Z. Ghahramani, A unifying review of linear gaussian models, Neural Comput. 11(2), 305 (1999)
J. C. Zheng, J. Y. Chen, J. W. Shuai, S. H. Cai, and R. Z. Wang, Storage capacity of the Hopfield neural network, Physica A 246(3), 313 (1997)
J. W. Shuai, J. C. Zheng, Z. X. Chen, R. T. Liu, and B. X. Wu, The three-dimensional rotation neural network, Physica A 238), 23 (1997)
M. Mohri, A. Rostamizadeh, and A. Talwalkar, Foundations of Machine Learning, 2nd Ed., Adaptive Computation and Machine Learning. Cambridge, MA: MIT Press, 2018
A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, and I. Polosukhin, Attention is all you need, arXiv: 1706.03762 (2017)
A. Wang, Y. Pruksachatkun, N. Nangia, A. Singh, J. Michael, F. Hill, O. Levy, and S. R. Bowman, Super-GLUE: A stickier benchmark for general-purpose language understanding systems, arXiv: 1905.00537 (2019)
D. Erhan, Y. Bengio, A. Courville, P. A. Manzagol, P. Vincent, and S. Bengio, Why does unsupervised pre-training help deep learning, J. Mach. Learn. Res. 11, 625 (2010)
Z. Feng, D. Guo, D. Tang, N. Duan, X. Feng, M. Gong, L. Shou, B. Qin, T. Liu, D. Jiang, and M. Zhou, CodeBERT: A pre-trained model for programming and natural languages, arXiv: 2002.08155 (2020)
H. Bao, L. Dong, and F. Wei, BEIT: BERT pre-training of image transformers, arXiv: 2106.08254 (2021)
K. Hakhamaneshi, M. Nassar, M. Phielipp, P. Abbeel, and V. Stojanović, Pretraining graph neural networks for few-shot analog circuit modeling and design, arXiv: 2203.15913 (2022)
J. Li, D. Li, C. Xiong, and S. Hoi, BLIP: Bootstrapping language-image pre-training for unified vision-language understanding and generation, arXiv: 2201.12086 (2022)
K. Lu, A. Grover, P. Abbeel, and I. Mordatch, Pretrained transformers as universal computation engines, arXiv: 2103.05247 (2021)
M. Reid, Y. Yamada, and S. S. Gu, Can Wikipedia help offline reinforcement learning? arXiv: 2201.12122 (2022)
C. Sun, X. Qiu, Y. Xu, and X. Huang, How to fine-tune BERT for text classification? arXiv: 1905.05583 (2019)
H. Liu, D. Tam, M. Muqeeth, J. Mohta, T. Huang, M. Bansal, and C. Raffel, Few-shot parameter-efficient fine-tuning is better and cheaper than in-context learning, Advances in Neural Information Processing Systems 35, 1950 (2022)
J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, BERT: Pre-training of deep bidirectional transformers for language understanding, arXiv: 1810.04805 (2018)
Z. Lan, M. Chen, S. Goodman, K. Gimpel, P. Sharma, and R. Soricut, ALBERT: A lite BERT for self-supervised learning of language representations, arXiv: 1909.11942 (2019)
Y. Liu, M. Ott, N. Goyal, J. Du, M. Joshi, D. Chen, O. Levy, M. Lewis, L. Zettlemoyer, and V. Stoyanov, ROBERTA: A robustly optimized BERT pretraining approach, arXiv: 1907.11692 (2019)
J. Vig and Y. Belinkov, Analyzing the structure of attention in a transformer language model, arXiv: 1906.04284 (2019)
S. Zhang and L. Xie, Improving attention mechanism in graph neural networks via cardinality preservation, in: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, International Joint Conferences on Artificial Intelligence Organization, 2020, page 1395
Y. Tay, V. Q. Tran, M. Dehghani, J. Ni, D. Bahri, H. Mehta, Z. Qin, K. Hui, Z. Zhao, J. Gupta, T. Schuster, W. W. Cohen, and D. Metzler, Transformer memory as a differentiable search index, Advances in Neural Information Processing Systems 35, 21831 (2022)
C. Raffel, N. Shazeer, A. Roberts, K. Lee, S. Narang, M. Matena, Y. Zhou, W. Li, and P. J. Liu, Exploring the limits of transfer learning with a unified text-to-text transformer, J. Mach. Learn. Res. 21(1), 5485 (2020)
T. Shin, and Y. Razeghi, R. L. L. IV, E. Wallace, and S. Singh, AutoPrompt: Eliciting knowledge from language models with automatically generated prompts, arXiv: 2010.15980 (2020)
N. Ding, S. Hu, W. Zhao, Y. Chen, Z. Liu, H.-T. Zheng, and M. Sun, Openprompt: An open-source framework for prompt-learning, arXiv: 2111.01998 (2021)
S. Zhang, S. Roller, N. Goyal, M. Artetxe, M. Chen, S. Chen, C. Dewan, M. Diab, X. Li, X. V. Lin, T. Mihaylov, M. Ott, S. Shleifer, K. Shuster, D. Simig, P. S. Koura, A. Sridhar, T. Wang, and L. Zettlemoyer, OPT: Open pre-trained transformer language models, arXiv: 2205.01068 (2022)
O. Lieber, O. Sharir, B. Lenz, and Y. Shoham, Jurassic-1: Technical Details and Evaluation, AI21 Labs, Tech. Rep., 2021
T. Brown, B. Mann, N. Ryder, M. Subbiah, J. D. Kaplan, et al., Language models are few-shot learners, in: Advances in Neural Information Processing Systems, edited by H. Larochelle, M. Ranzato, R. Hadsell, M. Balcan, and H. Lin, 33 Curran Associates, Inc., 2020, pp 1877–1901, arXiv: 2005.14165
A. Bapna, I. Caswell, J. Kreutzer, O. Firat, D. van Esch, et al., Building machine translation systems for the next thousand languages, arXiv: 2205.03983 (2022)
T. Mikolov, K. Chen, G. Corrado, and J. Dean, Efficient estimation of word representations in vector space, arXiv: 1301.3781 (2013)
J. Pennington, R. Socher, and C. Manning, GloVe: Global vectors for word representation, in: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Doha, Qatar: Association for Computational Linguistics, Oct. 2014, pp 1532–1543
O. Melamud, J. Goldberger, and I. Dagan, Context2vec: Learning generic context embedding with bidirectional LSTM, in: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning. Berlin, Germany: Association for Computational Linguistics, Aug. 2016, pp 51–61
H. Dai, B. Dai, and L. Song, Discriminative embeddings of latent variable models for structured data, arXiv: 1603.05629 (2016)
J. Yang, R. Zhao, M. Zhu, D. Hallac, J. Sodnik, and J. Leskovec, Driver2vec: Driver identification from automotive data, arXiv: 2102.05234 (2021)
S. Schneider, A. Baevski, R. Collobert, and M. Auli, Wav2vec: Unsupervised pre-training for speech recognition, arXiv: 1904.05862 (2019)
H. Zhou, S. Zhang, J. Peng, S. Zhang, J. Li, H. Xiong, and W. Zhang, Informer: Beyond efficient transformer for long sequence time-series forecasting, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(12), 11106 (2021), arXiv: 2012.07436
I. Beltagy, M. E. Peters, and A. Cohan, Longformer: The long-document transformer, arXiv: 2004.05150 (2020)
K. Han, Y. Wang, H. Chen, X. Chen, J. Guo, Z. Liu, Y. Tang, A. Xiao, C. Xu, Y. Xu, Z. Yang, Y. Zhang, and D. Tao, A survey on vision transformer, IEEE Trans. Pattern Anal. Mach. Intell. 45(1), 87 (2023)
J. B. Alayrac, J. Donahue, P. Luc, A. Miech, I. Barr, et al., Flamingo: A visual language model for few-shot learning, Advances in Neural Information Processing Systems 35, 23716 (2022), arXiv: 2204.14198
J. Yu, Z. Wang, V. Vasudevan, L. Yeung, M. Seyedhosseini, and Y. Wu, COCA: Contrastive captioners are image-text foundation models, arXiv: 2205.01917 (2022)
X. Liu, C. Gong, L. Wu, S. Zhang, H. Su, and Q. Liu, Fusedream: Training-free text-to-image generation with improved CLIP+GAN space optimization, arXiv: 2112.01573 (2021)
A. Radford, J. W. Kim, C. Hallacy, A. Ramesh, G. Goh, S. Agarwal, G. Sastry, A. Askell, P. Mishkin, J. Clark, G. Krueger, and I. Sutskever, Learning transferable visual models from natural language supervision, arXiv: 2103.00020 (2021)
L. He, Q. Zhou, X. Li, L. Niu, G. Cheng, X. Li, W. Liu, Y. Tong, L. Ma, and L. Zhang, End-to-end video object detection with spatial-temporal transformers, in: Proceedings of the 29th ACM International Conference on Multimedia, 2021, pp 1507–1516, arXiv: 2105.10920
X. Zhai, X. Wang, B. Mustafa, A. Steiner, D. Keysers, A. Kolesnikov, and L. Beyer, LIT: Zero-shot transfer with locked-image text tuning, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp 18123–18133, arXiv: 2111.07991
A. Trockman and J. Z. Kolter, Patches are all you need? arXiv: 2201.09792 (2022)
A. Ramesh, M. Pavlov, G. Goh, S. Gray, C. Voss, A. Radford, M. Chen, and I. Sutskever, Zeroshot text-to-image generation, in: International Conference on Machine Learning, 2021, pp 8821–8831, arXiv: 2102.12092
A. Tewari, J. Thies, B. Mildenhall, P. Srinivasan, E. Tretschk, Y. Wang, C. Lassner, V. Sitzmann, R. Martin-Brualla, S. Lombardi, T. Simon, C. Theobalt, M. Niessner, J. T. Barron, G. Wetzstein, M. Zollhoefer, and V. Golyanik, Advances in neural rendering, Computer Graphics Forum 41(2), 703 (2022), arXiv: 2111.05849
B. Mildenhall, P. P. Srinivasan, M. Tancik, J. T. Barron, R. Ramamoorthi, and R. Ng, NERF: Representing scenes as neural radiance fields for view synthesis, Communications of the ACM 65(1), 99 (2021), arXiv: 2003.08934
S. Zheng, J. Pan, C. Lu, and G. Gupta, Pointnorm: Normalization is all you need for point cloud analysis, arXiv: 2207.06324 (2022)
H. Ran, J. Liu, and C. Wang, Surface representation for point clouds, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp 18942–18952, arXiv: 2205.05740
X. Ma, C. Qin, H. You, H. Ran, and Y. Fu, Rethinking network design and local geometry in point cloud: A simple residual MLP framework, arXiv: 2202.07123 (2022)
Y. Wang, Y. Sun, Z. Liu, S. E. Sarma, M. M. Bronstein, and J. M. Solomon, Dynamic graph CNN for learning on point clouds, arXiv: 1801.07829 (2018)
D. Silver, J. Schrittwieser, K. Simonyan, I. Antonoglou, A. Huang, A. Guez, T. Hubert, L. Baker, M. Lai, A. Bolton, Y. Chen, T. Lillicrap, F. Hui, L. Sifre, G. van den Driessche, T. Graepel, and D. Hassabis, Mastering the game of Go without human knowledge, Nature 550(7676), 354 (2017)
E. Zhao, R. Yan, J. Li, K. Li, and J. Xing, Alphaholdem: High-performance artificial intelligence for heads-up no-limit poker via end-to-end reinforcement learning, in: Proceedings of the AAAI Conference on Artificial Intelligence 36(4), 4689 (2022)
S. Zou, T. Xu, and Y. Liang, Finite-sample analysis for SARSA with linear function approximation, arXiv: 1902.02234 (2019)
C. J. C. H. Watkins and P. Dayan, Q-learning, Machine Learning 8(3), 279 (1992)
P. Abbeel and A. Y. Ng, Apprenticeship learning via inverse reinforcement learning, in Proceedings of the Twenty-First International Conference on Machine Learning, Ser. ICML’ 04. New York, NY, USA: Association for Computing Machinery, 2004
C. Finn, P. Abbeel, and S. Levine, Model-agnostic meta-learning for fast adaptation of deep networks, In International conference on machine learning, 2017, pp 1126–1135, arXiv: 1703.03400
C. Fifty, E. Amid, Z. Zhao, T. Yu, R. Anil, and C. Finn, Efficiently identifying task groupings for multitask learning, Advances in Neural Information Processing Systems 34, 27503 (2021), arXiv: 2109.04617
N. Anand and D. Precup, Preferential temporal difference learning, arXiv: 2106.06508 (2021)
K. Chen, R. Cao, S. James, Y. Li, Y. H. Liu, P. Abbeel, and Q. Dou, Sim-to-real 6d object pose estimation via iterative self-training for robotic bin-picking, in: Computer Vision-ECCV 2022: 17th European Conference, Tel Aviv, Israel, October 23–27, 2022, Proceedings, Part XXXIX (pp 533–550). Cham: Springer Nature Switzerland, arXiv: 2204.07049
V. Mnih, K. Kavukcuoglu, D. Silver, A. Graves, I. Antonoglou, D. Wierstra, and M. Riedmiller, Playing atari with deep reinforcement learning, arXiv: 1312.5602 (2013)
T. P. Lillicrap, J. J. Hunt, A. Pritzel, N. Heess, T. Erez, Y. Tassa, D. Silver, and D. Wierstra, Continuous control with deep reinforcement learning, arXiv: 1509.02971 (2015)
D. Yarats, D. Brandfonbrener, H. Liu, M. Laskin, P. Abbeel, A. Lazaric, and L. Pinto, Don’t change the algorithm, change the data: Exploratory data for offline reinforcement learning, arXiv: 2201.13425 (2022)
M. Ahn, A. Brohan, N. Brown, Y. Chebotar, O. Cortes, et al., Do as I can, not as I say: Grounding language in robotic affordances, in: Conference on Robot Learning, 2023, pp 287–318, arXiv: 2204.01691
S. James and P. Abbeel, Coarse-to-fine Q-attention with learned path ranking, arXiv: 2204.01571 (2022)
C. Qi, P. Abbeel, and A. Grover, Imitating, fast and slow: Robust learning from demonstrations via decision-time planning, arXiv: 2204.03597 (2022)
L. Wang, X. Zhang, K. Yang, L. Yu, C. Li, L. Hong, S. Zhang, Z. Li, Y. Zhong, and J. Zhu, Memory replay with data compression for continual learning, arXiv: 2202.06592 (2022)
L. Chen, K. Lu, A. Rajeswaran, K. Lee, A. Grover, M. Laskin, P. Abbeel, A. Srinivas, and I. Mordatch, Decision transformer: Reinforcement learning via sequence modeling, Advances in Neural Information Processing Systems 34, 15084 (2021), arXiv: 2106.01345
J. Parker-Holder, M. Jiang, M. Dennis, M. Samvelyan, J. Foerster, E. Grefenstette, and T. Rocktäschel, Evolving curricula with regret-based environment design, in: International Conference on Machine Learning, 2022, pp 17473–17498, arXiv: 2203.01302
R. Wang, J. Lehman, J. Clune, and K. O. Stanley, Paired open-ended trailblazer (POET): Endlessly generating increasingly complex and diverse learning environments and their solutions, arXiv: 1901.01753 (2019)
Z. Li, L. Li, Z. Ma, P. Zhang, J. Chen, and J. Zhu, Read: Large-scale neural scene rendering for autonomous driving, arXiv: 2205.05509 (2022)
W. Tang, C. J. Ho, and Y. Liu, Bandit learning with delayed impact of actions, in: Advances in Neural Information Processing Systems, edited by A. Beygelzimer, Y. Dauphin, P. Liang, and J. W. Vaughan, 2021, arXiv: 1904.01763
Z. Gao, Y. Han, Z. Ren, and Z. Zhou, Batched multi-armed bandits problem, in: Advances in Neural Information Processing Systems, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d’Alché-Buc, E. Fox, and R. Garnett, Curran Associates, Inc., 2019, arXiv: 1904.01763
Y. Yue, J. Broder, R. Kleinberg, and T. Joachims, The k-armed dueling bandits problem, J. Comput. Syst. Sci. 78(5), 1538 (2012)
A. Carpentier, A. Lazaric, M. Ghavamzadeh, R. Munos, P. Auer, and A. Antos, Upperconfidence-bound algorithms for active learning in multi-armed bandits, in: Algorithmic Learning Theory: 22nd International Conference, ALT 2011, Espoo, Finland, October 5–7, 2011. Proceedings 22 (pp 189–203), Springer Berlin Heidelberg, arXiv: 1507.04523
W. Ye, S. Liu, T. Kurutach, P. Abbeel, and Y. Gao, Mastering Atari games with limited data, Advances in Neural Information Processing Systems 34, 25476 (2021), arXiv: 2111.00210
M. Samvelyan, T. Rashid, C. Schroeder de Witt, G. Farquhar, N. Nardelli, T. G. J. Rudner, C. M. Hung, P. H. S. Torr, J. Foerster, and S. Whiteson, The StarCraft multi-agent challenge, in: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems, 2019, arXiv: 1902.04043
T. Wang, T. Gupta, A. Mahajan, B. Peng, S. Whiteson, and C. Zhang, Rode: Learning roles to decompose multi-agent tasks, arXiv: 2010.01523 (2020)
O. Vinyals, I. Babuschkin, W. M. Czarnecki, M. Mathieu, A. Dudzik, et al., Grandmaster level in StarCraft II using multi-agent reinforcement learning, Nature 575(7782), 350 (2019)
W. Du and S. Ding, A survey on multi-agent deep reinforcement learning: From the perspective of challenges and applications, Artif. Intell. Rev. 54(5), 3215 (2021)
J. Biamonte, P. Wittek, N. Pancotti, P. Rebentrost, N. Wiebe, and S. Lloyd, Quantum machine learning, Nature 549, 195 (2017)
Y. Liu, S. Arunachalam, and K. Temme, A rigorous and robust quantum speed-up in supervised machine learning, Nat. Phys. 17(9), 1013 (2021)
V. Havlíček, A. D. Córcoles, K. Temme, A. W. Harrow, A. Kandala, J. M. Chow, and J. M. Gambetta, Supervised learning with quantum-enhanced feature spaces, Nature 567(7747), 209 (2019)
S. Moradi, C. Brandner, C. Spielvogel, D. Krajnc, S. Hillmich, R. Wille, W. Drexler, and L. Papp, Clinical data classification with noisy intermediate scale quantum computers, Sci. Rep. 12(1), 1851 (2022)
J. Zheng, K. He, J. Zhou, Y. Jin, and C. M. Li, Combining reinforcement learning with lin-kernighan-helsgaun algorithm for the traveling salesman problem, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(14), 12445 (2021), arXiv: 2012.04461
Z. Li, Q. Chen, and V. Koltun, Combinatorial optimization with graph convolutional networks and guided tree search, Advances in Neural Information Processing Systems 31, 2018, arXiv: 1810.10659
M. Sundararajan, A. Taly, and Q. Yan, Axiomatic attribution for deep networks, in: International Conference on Machine Learning, 2017, pp 3319–3328, arXiv: 1703.01365
M. T. Ribeiro, S. Singh, and C. Guestrin, Why Should I Trust You? Explaining the predictions of any classifïer, in: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016, pp 1135–1144, arXiv: 1602.04938
S. Lundberg and S. I. Lee, A unified approach to interpreting model predictions, arXiv: 1705.07874 (2017)
J. Crabbe, Z. Qian, F. Imrie, and M. van der Schaar, Explaining latent representations with a corpus of examples, in: Advances in Neural Information Processing Systems, edited by M. Ranzato, A. Beygelzimer, Y. Dauphin, P. Liang, and J. W. Vaughan, Curran Associates, Inc., 2021, pp 12154–12166, arXiv: 2110.15355
J. T. Springenberg, A. Dosovitskiy, T. Brox, and M. Riedmiller, Striving for simplicity: The all convolutional net, arXiv: 1412.6806 (2014)
R. Ying, D. Bourgeois, J. You, M. Zitnik, and J. Leskovec, Gnnexplainer: Generating explanations for graph neural networks, arXiv: 1903.03894 (2019)
H. Yuan, H. Yu, J. Wang, K. Li, and S. Ji, On explainability of graph neural networks via subgraph explorations, in: International Conference on Machine Learning, 2021, pp 12241–12252, arXiv: 2102.05152
Q. Huang, M. Yamada, Y. Tian, D. Singh, D. Yin, and Y. Chang, GraphLIME: Local interpretable model explanations for graph neural networks, IEEE Transactions on Knowledge and Data Engineering, 35(7), 6968 (2023), arXiv: 2001.06216
H. Yuan, H. Yu, S. Gui, and S. Ji, Explainability in graph neural networks: A taxonomic survey, IEEE Transactions on Pattern Analysis and Machine Intelligence 45(5), 5782 (2023), arXiv: 2012.15445
G. Katz, C. Barrett, D. Dill, K. Julian, and M. Kochenderfer, ReLUPlex: An efficient smt solver for verifying deep neural networks, in: Computer Aided Verification: 29th International Conference, CAV 2017, Heidelberg, Germany, July 24–28, 2017, Proceedings, Part I 30, pp 97–117. Springer International Publishing, arXiv: 1702.01135
S. Wang, H. Zhang, K. Xu, X. Lin, S. Jana, C. J. Hsieh, and J. Z. Kolter, Beta-CROWN: Efficient bound propagation with per-neuron split constraints for complete and incomplete neural network verification, Advances in Neural Information Processing Systems 34, 2021, arXiv: 2103.06624
M. P. Owen, A. Panken, R. Moss, L. Alvarez, and C. Leeper, ACAS Xu: Integrated collision avoidance and detect and avoid capability for UAS, in: IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), 2019
S. Mittal and S. Vaishay, A survey of techniques for optimizing deep learning on GPUS, J. Systems Archit. 99, 101635 (2019)
F. Wang, W. Zhang, S. Lai, M. Hao, and Z. Wang, Dynamic GPU energy optimization for machine learning training workloads, IEEE Transactions on Parallel and Distributed Systems 33(11), 2943 (2022)
R. David, J. Duke, A. Jain, V. Janapa Reddi, N. Jeffries, J. Li, N. Kreeger, I. Nappier, M. Natraj, T. Wang, P. Warden, and R. Rhodes, Tensorflow lite micro: Embedded machine learning for tinyML systems, in: Proceedings of Machine Learning and Systems, edited by A. Smola, A. Dimakis, and I. Stoica, 2021, pp 800–811, arXiv: 2010.08678
C. Tanasescu, V. Kesarwani, and D. Inkpen, Metaphor detection by deep learning and the place of poetic metaphor in digital humanities, in: The Thirty-First International Flairs Conference, 2018
H. Surden, Machine learning and law, Wash. L. Rev. 89, 87 (2014)
J. De Spiegeleer, D. B. Madan, S. Reyners, and W. Schoutens, Machine learning for quantitative finance: Fast derivative pricing, hedging and fitting, Quantitative Finance 18(10), 1635–1643, 2018
W. Solano-Alvarez, M. Peet, E. Pickering, J. Jaiswal, A. Bevan, and H. Bhadeshia, Synchrotron and neural network analysis of the influence of composition and heat treatment on the rolling contact fatigue of hypereutectoid pearlitic steels, Materials Science and Engineering A 707, 259 (2017)
J. J. Li, Y. Dai, and J. C. Zheng, Strain engineering of ion migration in LiCoO2, Front. Phys. 17(1), 13503 (2022)
H. K. D. H. Bhadeshia, Neural networks and information in materials science, Statistical Analysis and Data Mining 1, 296 (2009)
Y. Liu, O. C. Esan, Z. Pan, and L. An, Machine learning for advanced energy materials, Energy and AI 3, 100049 (2021)
S. R. Kalidindi, Feature engineering of material structure for AI-based materials knowledge systems, J. Appl. Phys. 128(4), 041103 (2020)
Z. Xiang, M. Fan, G. Vázquez Tovar, W. Trehengekrn, B. J. Yoon, X. Qian, R. Arroyave, and X. Qian, Physics-constrained automatic feature engineering for predictive modeling in materials science, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(12), pp 10414–10421 (2021)
Y. Bengio, A. Courville, and P. Vincent, Representation learning: A review and new perspectives, IEEE Trans. Pattern Anal. Mach. Intell. 35(8), 1798 (2013)
P. K. Routh, Y. Liu, N. Marcella, B. Kozinsky, and A. I. Frenkel, Latent representation learning for structural characterization of catalyst, J. Phys. Chem. Lett. 12(8), 2086 (2021)
A. Franceschetti and A. Zunger, The inverse band-structure problem of finding an atomic configuration with given electronic properties, Nature 402(6757), 6757 (1999)
Z. Liu, D. Zhu, L. Raju, and W. Cai, Tackling photonic inverse design with machine learning, Adv. Sci. 8, 2002923 (2021)
J. E. Saal, S. Kirklin, M. Aykol, B. Meredig, and C. Wolverton, Materials design and discovery with high-throughput density functional theory: The open quantum materials database (OQMD), JOM 65(11), 1501 (2013)
S. Kirklin, J. E. Saal, B. Meredig, A. Thompson, J. W. Doak, M. Aykol, S. Rühl, and C. Wolverton, The open quantum materials database (OQMD): Assessing the accuracy of DFT formation energies, npj Comput. Mater. 1(1), 15010 (2015)
A. Jain, S. P. Ong, G. Hautier, W. Chen, W. D. Richards, S. Dacek, S. Cholia, D. Gunter, D. Skinner, G. Ceder, and K. Persson, The materials project: A materials genome approach to accelerating materials innovation, APL Mater. 1(1), 011002 (2013)
K. Choudhary, K. F. Garrity, A. C. E. Reid, B. DeCost, A. J. Biacchi, et al., The joint automated repository for various integrated simulations (JARVIS) for data-driven materials design, npj Comput. Mater. 6(1), 173 (2020)
AFLOW, URL: aflowlib.org
MatCloud, URL: matcloud.com.cn
MPDS, Pauling File, URL: mpds.io
NOMAD, URL: nomad-lab.eu
C2DB, URL: cmr.fysik.dtu.dk/c2db/c2db.html
J. Zhou, L. Shen, M. D. Costa, K. A. Persson, S. P. Ong, P. Huck, Y. Lu, X. Ma, Y. Chen, H. Tang, and Y. P. Feng, 2dmatpedia, an open computational database of two-dimensional materials from top-down and bottom-up approaches, Scientific Data 6, 86, June 2019
M. Hellenbrandt, The inorganic crystal structure database (ICSD) — Present and future, Crystallography Rev. 10(1), 17 (2004)
S. Gražulis, A. Daškevič, A. Merkys, D. Chateigner, L. Lutterotti, M. Quirós, N. R. Serebryanaya, P. Moeck, R. T. Downs, and A. Le Bail, Crystallography Open Database (COD): An open-access collection of crystal structures and platform for world-wide collaboration, Nucleic Acids Research 40(D1), D420 (2011)
J. C. Zheng, L. Wu, and Y. Zhu, Aspherical electron scattering factors and their parameterizations for elements from H to Xe, Journal of Applied Crystallography 42, 1043 (2009)
J. Jumper, R. Evans, A. Pritzel, T. Green, M. Figurnov, et al., Highly accurate protein structure prediction with alphafold, Nature 596(7873), 583 (2021)
A. Dunn, Q. Wang, A. Ganose, D. Dopp, and A. Jain, Benchmarking materials property prediction methods: The matbench test set and automatminer reference algorithm, npj Comput. Mater. 6(1), 138 (2020)
R. Lin, R. Zhang, C. Wang, X. Q. Yang, and H. L. Xin, Temimagenet training library and atomsegnet deep-learning models for high-precision atom segmentation, localization, denoising, and deblurring of atomic-resolution images, Sci. Rep. 11(1), 5386 (2021)
L. Han, H. Cheng, W. Liu, H. Li, P. Ou, R. Lin, H.-T. Wang, C.-W. Pao, A. R. Head, C.-H. Wang, X. Tong, C.-J. Sun, W.-F. Pong, J. Luo, J.-C. Zheng, and H. L. Xin, A single-atom library for guided monometallic and concentration-complex multimetallic designs, Nat. Mater. 21, 681 (2022)
D. Mrdjenovich, M. K. Horton, J. H. Montoya, C. M. Legaspi, S. Dwaraknath, V. Tshitoyan, A. Jain, and K. A. Persson, Propnet: A knowledge graph for materials science, Matter 2(2), 464 (2020)
T. S. Lin, C. W. Coley, H. Mochigase, H. K. Beech, W. Wang, Z. Wang, E. Woods, S. L. Craig, J. A. Johnson, J. A. Kalow, K. F. Jensen, and B. D. Olsen, Bigsmiles: A structurally-based line notation for describing macromolecules, ACS Cent. Sci. 5(9), 1523 (2019)
M. Krenn, Q. Ai, S. Barthel, N. Carson, A. Frei, et al., Selfies and the future of molecular string representations, Patterns 3(10), 100588 (2022)
K. Michel and B. Meredig, Beyond bulk single crystals: A data format for all materials structure-property–processing relationships, MRS Bull. 41(8), 617 (2016)
M. Wang, D. Zheng, Z. Ye, Q. Gan, M. Li, X. Song, J. Zhou, C. Ma, L. Yu, Y. Gai, T. Xiao, T. He, G. Karypis, J. Li, and Z. Zhang, Deep graph library: A graph-centric, highly-performant package for graph neural networks, arXiv: 1909.01315 (2019)
I. Babuschkin, K. Baumli, A. Bell, S. Bhupatiraju, J. Bruce, et al., The DeepMind JAX Ecosystem, 2020
F. Chollet, et al., Keras, URL: github.com/fchollet/keras (2015)
A. Paszke, S. Gross, S. Chintala, G. Chanan, E. Yang, Z. DeVito, Z. Lin, A. Desmaison, L. Antiga, and A. Lerer, Automatic differentiation in PYTORCH, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 2017
M. Abadi, A. Agarwal, P. Barham, E. Brevdo, Z. Chen, et al., TensorFlow: Large-scale machine learning on heterogeneous systems, 2015, URL: www.tensorflow.org
T. Wolf, L. Debut, V. Sanh, J. Chaumond, C. Delangue, A. Moi, P. Cistac, T. Rault, R. Louf, M. Funtowicz, J. Davison, S. Shleifer), von Platen, C. Ma, Y. Jernite, J. Plu, C. Xu, T. L. Scao, S. Gugger, M. Drame, Q. Lhoest, and A. M. Rush, Huggingface’s transformers: State-of-the-art natural language processing, arXiv: 1910.03771 (2019)
Openrefine: A free, open source, powerful tool for working with messy data, URL: openrefine.org, 2022
PyG-Team, PYG documentation, URL: pytorchgeometric.readthedocs.io/en/latest/, 2022
PytorchLightning, URL: www.pytorchlightning.ai, 2022
GitHub - Netflix/vectorflow, URL: github.com/Netflix/vectorflow, 2022
L. Biewald, Experiment tracking with weights and biases, URL: www.wandb.com, 2020
L. Himanen, M. O. Jäger, E. V. Morooka, F. F. Canova, Y. S. Ranawat, D. Z. Gao, P. Rinke, and A. S. Foster, Dscribe: Library of descriptors for machine learning in materials science, Comput. Phys. Commun. 247, 106949 (2020)
W. Hu, M. Fey, M. Zitnik, Y. Dong, H. Ren, B. Liu, M. Catasta, and J. Leskovec, Open graph benchmark: Datasets for machine learning on graphs, Advances in neural information processing systems 33, 22118 (2020), arXiv: 2005.00687
O. Source, Rdkit: Open-source cheminformatics software, URL: www.rdkit.org, 2022
D. Grattarola, Spektral, URL: graphneural.network, 2022
S. Li, Y. Liu, D. Chen, Y. Jiang, Z. Nie, and F. Pan, Encoding the atomic structure for machine learning in materials science, Wiley Interdiscip. Rev. Comput. Mol. Sci. 12(1) (2022)
J. Schmidt, M. R. G. Marques, S. Botti, and M. A. L. Marques, Recent advances and applications of machine learning in solid-state materials science, npj Comput. Mater. 5(1), 83 (2019)
M. Rupp, A. Tkatchenko, K. R. Müller, and O. A. von Lilienfeld, Fast and accurate modeling of molecular atomization energies with machine learning, Phys. Rev. Lett. 108(5), 058301 (2012)
J. Schrier, Can one hear the shape of a molecule (from its Coulomb matrix eigenvalues), J. Chem. Inf. Model. 60(8), 3804 (2020)
M. McCarthy and K. L. K. Lee, Molecule identification with rotational spectroscopy and probabilistic deep learning, J. Phys. Chem. A 124(15), 3002 (2020)
F. Faber, A. Lindmaa, O. A. von Lilienfeld, and R. Armiento, Crystal structure representations for machine learning models of formation energies, Int. J. Quantum Chem. 115(16), 1094 (2015)
K. T. Schütt, H. Glawe, F. Brockherde, A. Sanna, K. R. Müller, and E. K. U. Gross, How to represent crystal structures for machine learning: Towards fast prediction of electronic properties, Phys. Rev. B 89(20), 205118 (2014)
J. Behler and M. Parrinello, Generalized neural-network representation of high-dimensional potential-energy surfaces, Phys. Rev. Lett. 98(14), 146401 (2007)
J. Behler, Atom-centered symmetry functions for constructing high-dimensional neural network potentials, J. Chem. Phys. 134(7), 074106 (2011)
A. Seko, A. Takahashi, and I. Tanaka, Sparse representation for a potential energy surface, Phys. Rev. B 90(2), 024101 (2014)
M. Gastegger, L. Schwiedrzik, M. Bittermann, F. Berzsenyi, and P. Marquetand, WACSF — Weighted atom-centered symmetry functions as descriptors in machine learning potentials, J. Chem. Phys. 148(24), 241709 (2018)
A. P. Bartók, R. Kondor, and G. Csányi, On representing chemical environments, Phys. Rev. B 87(18), 184115 (2013)
C. W. Rosenbrock, E. R. Homer, G. Csányi, and G. L. W. Hart, Discovering the building blocks of atomic systems using machine learning: Application to grain boundaries, npj Comput. Mater. 3, 29 (2017)
F. M. Paruzzo, A. Hofstetter, F. Musil, S. De, M. Ceriotti, and L. Emsley, Chemical shifts in molecular solids by machine learning, Nat. Commun. 9(1), 4501 (2018)
A. S. Rosen, S. M. Iyer, D. Ray, Z. Yao, A. Aspuru-Guzik, L. Gagliardi, J. M. Notestein, and R. Q. Snurr, Machine learning the quantum-chemical properties of metal–organic frameworks for accelerated materials discovery, Matter 4(5), 1578 (2021)
Z. Fan, Z. Zeng, C. Zhang, Y. Wang, K. Song, H. Dong, Y. Chen, and T. Ala-Nissila, Neuroevolution machine learning potentials: Combining high accuracy and low cost in atomistic simulations and application to heat transport, Phys. Rev. B 104(10), 104309 (2021)
Z. Mihalić and N. Trinajstić, A graph-theoretical approach to structure-property relationships, J. Chem. Educ. 69(9), 701 (1992)
O. Isayev, C. Oses, C. Toher, E. Gossett, S. Curtarolo, and A. Tropsha, Universal fragment descriptors for predicting properties of inorganic crystals, Nat. Commun. 8(1), 15679 (2017)
T. Xie and J. C. Grossman, Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties, Phys. Rev. Lett. 120(14), 145301 (2018)
K. Xia and G. W. Wei, Persistent homology analysis of protein structure, flexibility and folding, Int. J. Numer. Methods Biomed. Eng. 30(8), 814 (2014)
Z. Cang, L. Mu, K. Wu, K. Opron, K. Xia, and G. W. Wei, A topological approach for protein classification, Comput. Math. Biophys. 3(1) (2015)
Y. Jiang, D. Chen, X. Chen, T. Li, G.-W. Wei, and F. Pan, Topological representations of crystalline compounds for the machine-learning prediction of materials properties, npj Comput. Mater. 7, 28 (2021)
E. Minamitani, T. Shiga, M. Kashiwagi, and I. Obayashi, Topological descriptor of thermal conductivity in amorphous Si, J. Chem. Phys. 156(24), 244502 (2022)
M. E. Aktas, E. Akbas, and A. E. Fatmaoui, Persistence homology of networks: Methods and applications, Appl. Netw. Sci. 4(1), 1 (2019)
A. Ziletti, D. Kumar, M. Scheffler, and L. M. Ghir-inghelli, Insightful classification of crystal structures using deep learning, Nat. Commun. 9(1), 2775 (2018)
W. B. Park, J. Chung, J. Jung, K. Sohn, S. P. Singh, M. Pyo, N. Shin, and K. S. Sohn, Classification of crystal structure using a convolutional neural network, IUCrJ 4(4), 486 (2017)
Y. Zhang, X. He, Z. Chen, Q. Bai, A. M. Nolan, C. A. Roberts, D. Banerjee, T. Matsunaga, Y. Mo, and C. Ling, Unsupervised discovery of solid-state lithium ion conductors, Nat. Commun. 10(1), 5260 (2019)
S. C. Sieg, C. Suh, T. Schmidt, M. Stukowski, K. Rajan, and W. F. Maier, Principal component analysis of catalytic functions in the composition space of heterogeneous catalysts, QSAR Comb. Sci. 26(4), 528 (2007)
R. Tranås, O. M. Løvvik, O. Tomic, and K. Berland, Lattice thermal conductivity of half-Heuslers with density functional theory and machine learning: Enhancing predictivity by active sampling with principal component analysis, Comput. Mater. Sci. 202, 110938 (2022)
L. M. Ghiringhelli, J. Vybiral, E. Ahmetcik, R. Ouyang, S. V. Levchenko, C. Draxl, and M. Scheffler, Learning physical descriptors for materials science by compressed sensing, New J. Phys. 19(2), 023017 (2017)
R. Ouyang, S. Curtarolo, E. Ahmetcik, M. Scheffler, and L. M. Ghiringhelli, SISSO: A compressed-sensing method for identifying the best low-dimensional descriptor in an immensity of offered candidates, Phys. Rev. Mater. 2, 083802 (2018)
W. C. Lu, X. B. Ji, M. J. Li, L. Liu, B. H. Yue, and L. M. Zhang, Using support vector machine for materials design, Adv. Manuf. 1(2), 151 (2013)
Y. Wu, N. Prezhdo, and W. Chu, Increasing efficiency of nonadiabatic molecular dynamics by Hamiltonian interpolation with kernel ridge regression, J. Phys. Chem. A 125(41), 9191 (2021)
T. Hastie, R. Tibshirani, and J. H. Friedman, The elements of statistical learning: Data mining, inference, and prediction, 2nd Ed., in: Springer series in statistics, NY: Springer, 2009
K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition, in: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp 770–778
O. T. Unke, S. Chmiela, M. Gastegger, K. T. Schütt, H. E. Sauceda, and K.-R. Müller, Spookynet: Learning force fields with electronic degrees of freedom and nonlocal effects, Nat. Commun. 12, 7273 (2021)
C. Zheng, C. Chen, Y. Chen, and S. P. Ong, Random forest models for accurate identification of coordination environments from X-ray absorption near-edge structure, Patterns 1(2), 100013 (2020)
J. J. Kranz, M. Kubillus, R. Ramakrishnan, O. A. von Lilienfeld, and M. Elstner, Generalized density-functional tight-binding repulsive potentials from unsupervised machine learning, J. Chem. Theory Comput. 14(5), 2341 (2018)
S. Kim, J. Noh, G. H. Gu, A. Aspuru-Guzik, and Y. Jung, Generative adversarial networks for crystal structure prediction, ACS Cent. Sci. 6(8), 1412 (2020)
J. Noh, J. Kim, H. S. Stein, B. Sanchez-Lengeling, J. M. Gregoire, A. Aspuru-Guzik, and Y. Jung, Inverse design of solid-state materials via a continuous representation, Matter 1(5), 1370 (2019)
M. L. Hutchinson, E. Antono, B. M. Gibbons, S. Paradiso, J. Ling, and B. Meredig, Overcoming data scarcity with transfer learning, arXiv: 1711.05099 (2017)
R. Chang, Y.-X. Wang, and E. Ertekin, Towards overcoming data scarcity in materials science: Unifying models and datasets with a mixture of experts frame-work, npj Comput. Mater. 8, 242 (2022)
M. A. Nielsen, Neural Networks and Deep Learning, Determination Press, 2015
A. Akbari, L. Ng, and B. Solnik, Drivers of economic and financial integration: A machine learning approach, J. Empir. Finance 61, 82 (2021)
L. Weng, Flow-based deep generative models, URL: lilianweng.github.io, 2018
P. Raccuglia, K. C. Elbert, P. D. F. Adler, C. Falk, M. B. Wenny, A. Mollo, M. Zeller, S. A. Friedler, J. Schrier, and A. J. Norquist, Machine-learning-assisted materials discovery using failed experiments, Nature 533(7601), 73 (2016)
A. O. Oliynyk, L. A. Adutwum, J. J. Harynuk, and A. Mar, Classifying crystal structures of binary compounds AB through cluster resolution feature selection and support vector machine analysis, Chem. Mater. 28(18), 6672 (2016)
J. Tang, Q. Cai, and Y. Liu, Prediction of material mechanical properties with support vector machine, in: 2010 International Conference on Machine Vision and Human-machine Interface, Aprl 2010, pp 592–595
D. C. Elton, Z. Boukouvalas, M. S. Butrico, M. D. Fuge, and P. W. Chung, Applying machine learning techniques to predict the properties of energetic materials, Sci. Rep. 8(1), 9059 (2018)
D. Hu, Y. Xie, X. Li, L. Li, and Z. Lan, Inclusion of machine learning kernel ridge regression potential energy surfaces in on-the-fly nonadiabatic molecular dynamics simulation, J. Phys. Chem. Lett. 9(11), 2725 (2018)
K. T. Schütt, F. Arbabzadah, S. Chmiela, K. R. Müller, and A. Tkatchenko, Quantum-chemical insights from deep tensor neural networks, Nat. Commun. 8(1), 13890 (2017)
D. Jha, L. Ward, A. Paul, W.-K. Liao, A. Choudhary, C. Wolverton, and A. Agrawal, Elemnet: Deep learning the chemistry of materials from only elemental composition, Sci. Rep. 8, 17593 (2018)
D. Jha, L. Ward, Z. Yang, C. Wolverton, I. Foster, W. K. Liao, A. Choudhary, and A. Agrawal, IRNet: A general purpose deep residual regression framework for materials discovery, in: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp 2385–2393, 2019
O. T. Unke and M. Meuwly, Physnet: A neural network for predicting energies, forces, dipole moments, and partial charges, J. Chem. Theo. Comput. 15(6), 3678 (2019)
Z. Liu, L. Lin, Q. Jia, Z. Cheng, Y. Jiang, Y. Guo, and J. Ma, Transferable multilevel attention neural network for accurate prediction of quantum chemistry properties via multitask learning, J. Chem. Inform. Model. 61(3), 1066 (2021)
A. M. Krajewski, J. W. Siegel, J. Xu, and Z. K. Liu, Extensible structure-informed prediction of formation energy with improved accuracy and usability employing neural networks, Comput. Mater. Sci. 208, 111254 (2022)
K. T. Schütt, P. J. Kindermans, H. E. Sauceda, S. Chmiela, A. Tkatchenko, and K. R. Müller, SchNet: A continuous-filter convolutional neural network for modeling quantum interactions, in Proceedings of the 31st International Conference on Neural Information Processing Systems, in NIPS’17. Red Hook, NY, USA: Curran Associates Inc., Dec. 2017, pp 992–1002
J. Jung, et al., Super-resolving material microstructure image via deep learning for microstructure characterization and mechanical behavior analysis, npj Comput. Mater. 7, 96 (2021)
A. A. K. Farizhandi, O. Betancourt, and M. Mamivand, Deep learning approach for chemistry and processing history prediction from materials microstructure, Sci. Rep. 12(1), 4552 (2022)
T. Xie and J. C. Grossman, Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties, Phys. Rev. Lett. 120(14), 145301 (2018)
C. Chen, W. Ye, Y. Zuo, C. Zheng, and S. P. Ong, Graph networks as a universal machine learning framework for molecules and crystals, Chem. Mater. 31(9), 3564 (2019)
S. Y. Louis, Y. Zhao, A. Nasiri, X. Wang, Y. Song, F. Liu, and J. Hu, Graph convolutional neural networks with global attention for improved materials property prediction, Phys. Chem. Chem. Phys. 22(32), 18141 (2020)
Z. Qiao, M. Welborn, A. Anandkumar, F. R. Manby, and T. F. Miller, OrbNet: Deep learning for quantum chemistry using symmetry-adapted atomic-orbital features, J. Chem. Phys. 153(12), 124111 (2020)
J. Gasteiger, J. Groß, and S. Günnemann, Directional message passing for molecular graphs, arXiv: 2003.03123 (2020)
K. Choudhary and B. DeCost, Atomistic line graph neural network for improved materials property predictions, npj Comput. Mater. 7(1), 185 (2021)
S. Zhang, Y. Liu, and L. Xie, Molecular mechanics-driven graph neural network with multiplex graph for molecular structures, arXiv: 2011.07457 (2020)
M. Ghorbani, S. Prasad, J. B. Klauda, and B. R. Brooks, GraphVAMPNet, using graph neural networks and variational approach to Markov processes for dynamical modeling of biomolecules, J. Chem. Phys. 156(18), 184103 (2022)
T. Xie, A. France-Lanord, Y. Wang, Y. Shao-Horn, and J. C. Grossman, Graph dynamical networks for unsupervised learning of atomic scale dynamics in materials, Nat. Commun. 10(1), 2667 (2019)
S. Batzner, A. Musaelian, L. Sun, M. Geiger, J. P. Mailoa, M. Kornbluth, N. Molinari, T. E. Smidt, and B. Kozinsky, E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials, Nat. Commun. 13, 2453 (2022)
K. T. Schütt, O. T. Unke, and M. Gastegger, Equivariant message passing for the prediction of tensorial properties and molecular spectra, in: International Conference on Machine Learning, pp 9377–9388, 2021
Y. Jiang, Z. Yang, J. Guo, H. Li, Y. Liu, Y. Guo, M. Li, and X. Pu, Coupling complementary strategy to flexible graph neural network for quick discovery of coformer in diverse co-crystal materials, Nat. Commun. 12(1), 5950 (2021)
C. W. Park and C. Wolverton, Developing an improved crystal graph convolutional neural network framework for accelerated materials discovery, Phys. Rev. Mater. 4(6), 063801 (2020)
G. P. Ren, Y. J. Yin, K. J. Wu, and Y. He, Force field-inspired molecular representation learning for property prediction, J. Cheminform. 15(1), 17 (2023)
C. Chen and S. P. Ong, AtomSets as a hierarchical transfer learning framework for small and large materials datasets, npj Comput. Mater. 7, 173 (2021)
H. Yamada, C. Liu, S. Wu, Y. Koyama, S. Ju, J. Shiomi, J. Morikawa, and R. Yoshida, Predicting materials properties with little data using shotgun transfer learning, ACS Cent. Sci. 5(10), 1717 (2019)
S. Feng, H. Fu, H. Zhou, Y. Wu, Z. Lu, and H. Dong, A general and transferable deep learning framework for predicting phase formation in materials, npj Comput. Mater. 7(1), 10 (2021)
V. Gupta, K. Choudhary, F. Tavazza, C. Campbell, W. K. Liao, A. Choudhary, and A. Agrawal, Cross-property deep transfer learning framework for enhanced predictive analytics on small materials data, Nat. Commun. 12, 6595 (2021)
V. Stanev, C. Oses, A. G. Kusne, E. Rodriguez, J. Paglione, S. Curtarolo, and I. Takeuchi, Machine learning modeling of superconducting critical temperature, npj Comput. Mater. 4(1), 29 (2018)
D. S. Palmer, N. M. O’Boyle, R. C. Glen, and J. B. O. Mitchell, Random forest models to predict aqueous solubility, J. Chem. Inform. Model. 47(1), 150 (2007)
P. Banerjee and R. Preissner, Bittersweetforest: A random forest based binary classifier to predict bitterness and sweetness of chemical compounds, Front. Chem. 6, 93 (2018)
P. Raccuglia, K. C. Elbert, P. D. F. Adler, C. Falk, M. B. Wenny, A. Mollo, M. Zeller, S. A. Friedler, J. Schrier, and A. J. Norquist, Machine-learning-assisted materials discovery using failed experiments, Nature 533(7601), 7601 (2016)
L. Chen, B. Xu, J. Chen, K. Bi, C. Li, S. Lu, G. Hu, and Y. Lin, Ensemble-machine-learning-based correlation analysis of internal and band characteristics of thermoelectric materials, J. Mater. Chem. C 8(37), 13079 (2020)
J. Venderley, K. Mallayya, M. Matty, M. Krogstad, J. Ruff, G. Pleiss, V. Kishore, D. Mandrus, D. Phelan, L. Poudel, A. G. Wilson, K. Weinberger, P. Upreti, M. Norman, S. Rosenkranz, R. Osborn, and E. A. Kim, Harnessing interpretable and unsupervised machine learning to address big data from modern X-ray diffraction, Proc. Natl. Acad. Sci. USA 119(24), e2109665119 (2022)
R. Cohn and E. Holm, Unsupervised machine learning via transfer learning and k-means clustering to classify materials image data, Integr. Mater. Manuf. Innov. 10(2), 231 (2021)
R. E. A. Goodall and A. A. Lee, Predicting materials properties without crystal structure: Deep representation learning from stoichiometry, Nat. Commun. 11(1), 6280 (2020)
K. Muraoka, Y. Sada, D. Miyazaki, W. Chaikittisilp, and T. Okubo, Linking synthesis and structure descriptors from a large collection of synthetic records of zeolite materials, Nat. Commun. 10(1), 4459 (2019)
D. Jha, K. Choudhary, F. Tavazza, W. Liao, A. Choudhary, C. Campbell, and A. Agrawal, Enhancing materials property prediction by leveraging computational and experimental data using deep transfer learning, Nat. Commun. 10(1), 5316 (2019)
X. Zhong, B. Gallagher, S. Liu, B. Kailkhura, A. Hiszpanski, and T. Y.-J. Han, Explainable machine learning in materials science, npj Comput. Mater. 8, 204 (2022)
P. Linardatos, V. Papastefanopoulos, and S. Kotsiantis, Explainable AI: A review of machine learning interpretability methods, Entropy (Basel) 23(1), 18 (2020)
W. J. Murdoch, C. Singh, K. Kumbier, R. Abbasi-Asl, and B. Yu, Definitions, methods, and applications in interpretable machine learning, Proc. Natl. Acad. Sci. USA 116(44), 22071 (2019)
R. Kondo, S. Yamakawa, Y. Masuoka, S. Tajima, and R. Asahi, Microstructure recognition using convolutional neural networks for prediction of ionic conductivity in ceramics, Acta Mater. 141, 29 (2017)
K. Das, B. Samanta, P. Goyal, S.-C. Lee, S. Bhattacharjee, and N. Ganguly, CrysXPP: An explainable property predictor for crystalline materials, npj Comput. Mater. 8, 43 (2022)
A. Y. T. Wang, S. K. Kauwe, R. J. Murdock, and T. D. Sparks, Compositionally restricted attention-based network for materials property predictions, npj Comput. Mater. 7(1), 77 (2021)
A. Y. T. Wang, M. S. Mahmoud, M. Czasny, and A. Gurlo, CrabNet for explainable deep learning in materials science: bridging the gap between academia and industry, Integr. Mater. Manuf. Innov. 11(1), 41 (2022)
A. Parnami and M. Lee, Learning from few examples: A summary of approaches to few-shot learning, arXiv: 2203.04291 (2023)
Y. Wang, Q. Yao, J. T. Kwok, and L. M. Ni, Generalizing from a few examples: A survey on few-shot learning, ACM Comput. Surv. 53(3), 63 (2020)
Y. Wang, A. Abuduweili, Q. Yao, and D. Dou, Property-aware relation networks for few-shot molecular property prediction, arXiv: 2107.07994 (2021)
Z. Guo, et al., Few-shot graph learning for molecular property prediction, in: Proceedings of the Web Conference 2021 (WWW ’21), Association for Computing Machinery, New York, USA, 2021, pp 2559–2567
K. Kaufmann, H. Lane, X. Liu, and K. S. Vecchio, Efficient few-shot machine learning for classification of EBSD patterns, Sci. Rep. 11(1), 8172 (2021)
S. Akers, et al., Rapid and flexible segmentation of electron microscopy data using few-shot machine learning, npj Comput. Mater. 7, 187 (2021)
J. P. Perdew and K. Schmidt, Jacob’s ladder of density functional approximations for the exchange-correlation energy, AIP Conf. Proc. 577, 1 (2001)
S. Dick and M. Fernandez-Serra, Machine learning accurate exchange and correlation functionals of the electronic density, Nat. Commun. 11(1), 3509 (2020)
R. Nagai, R. Akashi, and O. Sugino, Completing density functional theory by machine learning hidden messages from molecules, npj Comput. Mater. 6(1), 43 (2020)
J. Kirkpatrick, B. McMorrow, D. H. P. Turban, A. L. Gaunt, J. S. Spencer, A. G. D. G. Matthews, A. Obika, L. Thiry, M. Fortunato, D. Pfau, L. R. Castellanos, S. Petersen, A. W. R. Nelson, P. Kohli, P. Mori-Sánchez, D. Hassabis, and A. J. Cohen, Pushing the frontiers of density functionals by solving the fractional electron problem, Science 374(6573), 1385 (2021)
J. C. Snyder, M. Rupp, K. Hansen, K. R. Müller, and K. Burke, Finding density functionals with machine learning, Phys. Rev. Lett. 108(25), 253002 (2012)
X. Lei and A. J. Medford, Design and analysis of machine learning exchange-correlation functionals via rotationally invariant convolutional descriptors, Phys. Rev. Mater. 3(6), 063801 (2019)
Z. Fan, Y. Wang, P. Ying, K. Song, J. Wang, Y. Wang, Z. Zeng, K. Xu, E. Lindgren, J. M. Rahm, A. J. Gabourie, J. Liu, H. Dong, J. Wu, Y. Chen, Z. Zhong, J. Sun, P. Erhart, Y. Su, and T. Ala-Nissila, GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations, J. Chem. Phys. 157(11), 114801 (2022)
H. Wang, L. Zhang, J. Han, and W. E, DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics, Comput. Phys. Commun. 228, 178 (2018)
Y. Zhang, H. Wang, W. Chen, J. Zeng, L. Zhang, H. Wang, and W. E, DP-GEN: A concurrent learning platform for the generation of reliable deep learning based potential energy models, Comput. Phys. Commun. 253, 107206 (2020)
P. Pattnaik, S. Raghunathan, T. Kalluri, P. Bhimalapuram, C. V. Jawahar, and U. D. Priyakumar, Machine learning for accurate force calculations in molecular dynamics simulations, J. Phys. Chem. A 124(34), 6954 (2020)
J. Westermayr and P. Marquetand, Machine learning and excited-state molecular dynamics, Mach. Learn.: Sci. Technol. 1(4), 043001 (2020)
G. Fan, A. McSloy, B. Aradi, C. Y. Yam, and T. Frauenheim, Obtaining electronic properties of molecules through combining density functional tight binding with machine learning, J. Phys. Chem. Lett. 13(43), 10132 (2022)
Z. Ahmad, T. Xie, C. Maheshwari, J. C. Grossman, and V. Viswanathan, Machine learning enabled computational screening of inorganic solid electrolytes for suppression of dendrite formation in lithium metal anodes, ACS Cent. Sci. 4(8), 996 (2018)
S. Gong, S. Wang, T. Zhu, X. Chen, Z. Yang, M. J. Buehler, Y. Shao-Horn, and J. C. Grossman, Screening and understanding Li adsorption on two-dimensional metallic materials by learning physics and physics-simplified learning, JACS Au 1(11), 1904 (2021)
T. Xie, A. France-Lanord, Y. Wang, J. Lopez, M. A. Stolberg, M. Hill, G. M. Leverick, R. Gomez-Bombarelli, J. A. Johnson, Y. Shao-Horn, and J. C. Grossman, Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties, Nat. Commun. 13(1), 3415 (2022)
K. Gubaev, E. V. Podryabinkin, G. L. Hart, and A. V. Shapeev, Accelerating high-throughput searches for new alloys with active learning of interatomic potentials, Comput. Mater. Sci. 156, 148 (2019)
T. Xie, X. Fu, O. E. Ganea, R. Barzilay, and T. Jaakkola, Crystal diffusion variational autoencoder for periodic material generation, arXiv: 2110.06197 (2021)
Y. Dong, D. Li, C. Zhang, C. Wu, H. Wang, M. Xin, J. Cheng, and J. Lin, Inverse design of two-dimensional graphene/h-BN hybrids by a regressional and conditional GAN, Carbon 169, 9 (2020)
Y. Pathak, K. S. Juneja, G. Varma, M. Ehara, and U. D. Priyakumar, Deep learning enabled inorganic material generator, Phys. Chem. Chem. Phys. 22(46), 26935 (2020)
Y. Suzuki, H. Hino, T. Hawai, K. Saito, M. Kotsugi, and K. Ono, Symmetry prediction and knowledge discovery from X-ray diffraction patterns using an interpretable machine learning approach, Sci. Rep. 10(1), 21790 (2020)
A. A. Enders, N. M. North, C. M. Fensore, J. Velez-Alvarez, and H. C. Allen, Functional group identification for FTIR spectra using image-based machine learning models, Anal. Chem. 93(28), 9711 (2021)
B. Huang, Z. Li, and J. Li, An artificial intelligence atomic force microscope enabled by machine learning, Nanoscale 10(45), 21320 (2018)
A. Chandrashekar, P. Belardinelli, M. A. Bessa, U. Staufer, and F. Alijani, Quantifying nanoscale forces using machine learning in dynamic atomic force microscopy, Nanoscale Adv. 4(9), 2134 (2022)
S. V. Kalinin, C. Ophus, P. M. Voyles, R. Erni, D. Kepaptsoglou, V. Grillo, A. R. Lupini, M. P. Oxley, E. Schwenker, M. K. Y. Chan, J. Etheridge, X. Li, G. G. D. Han, M. Ziatdinov, N. Shibata, and S. J. Pennycook, Machine learning in scanning transmission electron microscopy, Nat. Rev. Methods Primers 2(1), 11 (2022)
J. Jung, et al., Super-resolving material microstructure image via deep learning for microstructure characterization and mechanical behavior analysis, npj Comput. Mater. 7, 96 (2021)
L. Floridi and M. Chiriatti, GPT-3: Its nature, scope, limits, and consequences, Minds Mach. 30(4), 681 (2020)
OpenAI, GPT-4 Technical Report, arXiv: 2303.08774 (2023)
D. M. Katz, M. J. Bommarito, S. Gao, and P. Arredondo, GPT-4 passes the bar exam, SSRN preprint, Rochester, NY, Mar. 15, 2023
V. Tshitoyan, J. Dagdelen, L. Weston, A. Dunn, Z. Rong, O. Kononova, K. A. Persson, G. Ceder, and A. Jain, Unsupervised word embeddings capture latent knowledge from materials science literature, Nature 571(7763), 95 (2019)
E. A. Olivetti, J. M. Cole, E. Kim, O. Kononova, G. Ceder, T. Y.-J. Han, and A. M. Hiszpanski, Data-driven materials research enabled by natural language processing and information extraction, Appl. Phys. Rev. 7(4), 041317 (2020)
P. Shetty and R. Ramprasad, Automated knowledge extraction from polymer literature using natural language processing, iScience 24(1), 101922 (2021)
A. Davies, P. Veličković, L. Buesing, S. Blackwell, D. Zheng, N. Tomašev, R. Tanburn, P. Battaglia, C. Blundell, A. Juhász, M. Lackenby, G. Williamson, D. Hassabis, and P. Kohli, Advancing mathematics by guiding human intuition with AI, Nature 600(7887), 70 (2021)
G. E. Karniadakis, I. G. Kevrekidis, L. Lu, P. Perdikaris, S. Wang, and L. Yang, Physics-informed machine learning, Nat. Rev. Phys. 3(6), 422 (2021)
A. Goyal and Y. Bengio, Inductive biases for deep learning of higher-level cognition, Proc. R. Soc. A 478(2266), 20210068 (2022)
B. Baker, I. Akkaya, P. Zhokhov, J. Huizinga, J. Tang, A. Ecoffet, B. Houghton, R. Sampedro, and J. Clune, Video pretraining (VPT): Learning to act by watching unlabeled online videos, Advances in Neural Information Processing Systems 35, 24639 (2022)
J. Lehman, J. Gordon, S. Jain, K. Ndousse, C. Yeh, and K. O. Stanley, Evolution through large models, arXiv: 2206.08896 (2022)
M. S. Anis, et al., Qiskit: An open-source framework for quantum computing, 2021
C. Wu, F. Wu, L. Lyu, Y. Huang, and X. Xie, Communication-efficient federated learning via knowledge distillation, Nat. Commun. 13, 2032 (2022)
H. G. Yu, Neural network iterative diagonalization method to solve eigenvalue problems in quantum mechanics, Phys. Chem. Chem. Phys. 17(21), 14071 (2015)
S. K. Ghosh and D. Ghosh, Machine learning matrix product state Ansatz for strongly correlated systems, J. Chem. Phys. 158(6), 064108 (2023)
P. C. H. Nguyen, J. B. Choi, H. S. Udaykumar, and S. Baek, Challenges and opportunities for machine learning in multiscale computational modeling, J. Comput. Inf. Sci. Eng. 23(6), 060808 (2023)
H. Wahab, V. Jain, A. S. Tyrrell, M. A. Seas, L. Kotthoff, and P. A. Johnson, Machine-learning-assisted fabrication: Bayesian optimization of laser-induced graphene patterning using in-situ Raman analysis, Carbon 167, 609 (2020)
A. Tayyebi, A. S. Alshami, X. Yu, and E. Kolodka, Can machine learning methods guide gas separation membranes fabrication, J. Membrane Sci. Lett. 2(2), 100033 (2022)
Y. T. Chen, M. Duquesnoy, D. H. S. Tan, J. M. Doux, H. Yang, G. Deysher, P. Ridley, A. A. Franco, Y. S. Meng, and Z. Chen, Fabrication of high-quality thin solid-state electrolyte films assisted by machine learning, ACS Energy Lett. 6(4), 1639 (2021)
W. Li, L. Liang, S. Zhao, S. Zhang, and J. Xue, Fabrication of nanopores in a graphene sheet with heavy ions: A molecular dynamics study, J. Appl. Phys. 114(23), 234304 (2013)
L. L. Safina and J. A. Baimova, Molecular dynamics simulation of fabrication of Ni-graphene composite: Temperature effect, Micro & Nano Lett. 15(3), 176 (2020)
B. Zhao, C. Shen, H. Yan, J. Xie, X. Liu, Y. Dai, J. Zhang, J. Zheng, L. Wu, Y. Zhu, and Y. Jiang, Constructing uniform oxygen defect engineering on primary particle level for high-stability lithium-rich cathode materials, Chem. Eng. J. 465, 142928 (2023)
X. X. Liao, H. Q. Wang, and J. C. Zheng, Tuning the structural, electronic, and magnetic properties of strontium titanate through atomic design: A comparison between oxygen vacancies and nitrogen doping, J. Am. Ceram. Soc. 96(2), 538 (2013)
H. Xing, H. Q. Wang, T. Song, C. Li, Y. Dai, G. Fu, J. Kang, and J. C. Zheng, Electronic and thermal properties of Ag-doped single crystal zinc oxide via laser-induced technique, Chin. Phys. B 32(6), 066107 (2023)
L. Wu, J. C. Zheng, J. Zhou, Q. Li, J. Yang, and Y. Zhu, Nanostructures and defects in thermoelectric AgPb18SbTe20 single crystal, J. Appl. Phys. 105(9), 094317 (2009)
H. Zeng, M. Wu, H. Q. Wang, J. C. Zheng, and J. Kang, Tuning the magnetic and electronic properties of strontium titanate by carbon doping, Front. Phys. 16(4), 43501 (2021)
D. Li, H. Q. Wang, H. Zhou, Y. P. Li, Z. Huang, J. C. Zheng, J. O. Wang, H. Qian, K. Ibrahim, X. Chen, H. Zhan, Y. Zhou, and J. Kang, Influence of nitrogen and magnesium doping on the properties of ZnO films, Chin. Phys. B 25(7), 076105 (2016)
R. Wang and J. C. Zheng, Promising transition metal decorated borophene catalyst for water splitting, RSC Advances 13(14), 9678 (2023)
J. He, L. D. Zhao, J. C. Zheng, J. W. Doak, H. Wu, H. Q. Wang, Y. Lee, C. Wolverton, M. G. Kanatzidis, and V. P. Dravid, Role of sodium doping in lead chalcogenide thermoelectrics, J. Am. Chem. Soc. 135(12), 4624 (2013)
L. D. Cooley, A. J. Zambano, A. R. Moodenbaugh, R. F. Klie, J. C. Zheng, and Y. Zhu, Inversion of two-band superconductivity at the critical electron doping of (Mg, Al)B2, Phys. Rev. Lett. 95(26), 267002 (2005)
H. Yan, T. Wang, L. Liu, T. Song, C. Li, L. Sun, L. Wu, J. C. Zheng, and Y. Dai, High voltage stable cycling of all-solid-state lithium metal batteries enabled by top-down direct fluorinated poly(ethylene oxide)-based electrolytes, J. Power Sources 557, 232559 (2023)
J. C. Zheng, C. H. A. Huan, A. T. S. Wee, R. Z. Wang, and Y. M. Zheng, Ground-state properties of cubic CBN solid solutions, J. Phys.: Condens. Matter 11(3), 927 (1999)
Z. Huang, T. Y. Lü, H. Q. Wang, S. W. Yang, and J. C. Zheng, Electronic and thermoelectric properties of the group-III nitrides (BN, AlN and GaN) atomic sheets under biaxial strains, Comput. Mater. Sci. 130, 232 (2017)
T. Y. Lü, X. X. Liao, H. Q. Wang, and J. C. Zheng, Tuning the indirect–direct band gap transition of SiC, GeC and SnC monolayer in a graphene-like honeycomb structure by strain engineering: A quasiparticle GW study, J. Mater. Chem. 22(19), 10062 (2012)
J. C. Zheng and J. W. Davenport, Ferromagnetism and stability of half-metallic MnSb and MnBi in the strained zinc-blende structure: Predictions from full potential and pseudopotential calculations, Phys. Rev. B 69(14), 144415 (2004)
L. Xu, H. Q. Wang, and J. C. Zheng, Thermoelectric properties of PbTe, SnTe, and GeTe at high pressure: An ab initio study, J. Electron. Mater. 40(5), 641 (2011)
L. Xu, Y. Zheng, and J. C. Zheng, Thermoelectric transport properties of PbTe under pressure, Phys. Rev. B 82(19), 195102 (2010)
J. C. Zheng, Superhard hexagonal transition metal and its carbide and nitride: Os, OsC, and OsN, Phys. Rev. B 72(5), 052105 (2005)
T. Sun, K. Umemoto, Z. Wu, J. C. Zheng, and R. M. Wentzcovitch, Lattice dynamics and thermal equation of state of platinum, Phys. Rev. B 78(2), 024304 (2008)
Z. Wu, R. M. Wentzcovitch, K. Umemoto, B. Li, K. Hirose, and J. C. Zheng, Pressure-volume-temperature relations in MgO: An ultrahigh pressure-temperature scale for planetary sciences applications, J. Geophys. Res. 113(B6), B06204 (2008)
S. Deng, L. Wu, H. Cheng, J. C. Zheng, S. Cheng, J. Li, W. Wang, J. Shen, J. Tao, J. Zhu, and Y. Zhu, Charge-lattice coupling in hole-doped LuFe2O4+δ: The origin of second-order modulation, Phys. Rev. Lett. 122(12), 126401 (2019)
J. C. Zheng, L. Wu, Y. Zhu, and J. W. Davenport, On the sensitivity of electron and X-ray scattering factors to valence charge distribution, J. Appl. Crystallogr. 38, 648 (2005)
J. C. Zheng and H. Q. Wang, Principles and applications of a comprehensive characterization method combining synchrotron radiation technology, transmission electron microscopy, and density functional theory, Sci. Sin. - Phys. Mech. & Astron. 51(3), 030007 (2021)
Acknowledgements
This research was supported by the Ministry of Higher Education Malaysia through the Fundamental Research Grant Scheme (No. FRGS/1/2021/STG05/XMU/01/1).
Ethics declarations
The authors declare that they have no competing interests or conflicts of interest.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.