The emergent role of explainable artificial intelligence in the materials sciences

Tommy Liu*, Amanda S. Barnard

*Corresponding author for this work

    Research output: Contribution to journal › Review article › peer-review

    6 Citations (Scopus)

    Abstract

    The combination of rational machine learning with creative materials science makes materials informatics a powerful way of discovering, designing, and screening new materials. However, moving from a promising prediction to a practical strategy often requires more than an instructive structure-property relationship; understanding how a machine learning method uses structural features to predict target properties becomes critical. Explainable artificial intelligence (XAI) is an emerging field of computer science, grounded in statistics, that can augment materials informatics workflows. XAI can be used for forensic analysis, to understand the consequences of data, model, and application decisions, or as a model refinement method capable of distinguishing important features from nuisance variables. Here, we outline the state of the art in XAI and highlight methods most useful to the physical sciences. This practical guide focuses on characteristics of XAI methods that are relevant to materials informatics and will become increasingly important as more researchers move toward using deeper neural networks and large language models.
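
    As an illustration of the model-refinement use mentioned in the abstract, the following is a minimal sketch (not drawn from the paper) of how a SHAP-based feature attribution might flag nuisance variables in a materials-informatics regression model. The dataset, feature names, and model choice are hypothetical assumptions.

    ```python
    # Illustrative sketch only: SHAP feature attributions used to separate
    # informative structural descriptors from nuisance variables.
    # Feature names and data are hypothetical placeholders.
    import numpy as np
    import shap
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical dataset: 200 samples, 5 candidate structural descriptors.
    feature_names = ["surface_area", "band_gap_est", "coord_number", "noise_a", "noise_b"]
    X = rng.normal(size=(200, 5))
    # Target property depends only on the first three descriptors; the last two are nuisance.
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # TreeExplainer computes per-sample SHAP attributions for tree-based models.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)

    # Global importance: mean absolute SHAP value per feature.
    importance = np.abs(shap_values).mean(axis=0)
    for name, imp in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
        print(f"{name}: {imp:.3f}")
    # Nuisance features should show importances near zero, flagging them for removal.
    ```

    In this sketch, the mean absolute SHAP value serves as a simple global ranking; a near-zero attribution suggests a descriptor contributes little to the prediction and can be treated as a nuisance variable.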

    Original language: English
    Article number: 101630
    Pages (from-to): 1-14
    Number of pages: 14
    Journal: Cell Reports Physical Science
    Volume: 4
    Issue number: 10
    DOIs
    Publication status: Published - 18 Oct 2023
