TY - JOUR
T1 - Diverse explanations from data-driven and domain-driven perspectives in the physical sciences
AU - Li, Sichao
AU - Wang, Xin
AU - Barnard, Amanda
N1 - Publisher Copyright:
© 2025 The Author(s). Published by IOP Publishing Ltd.
PY - 2025/3/31
Y1 - 2025/3/31
AB - Machine learning (ML) methods have been remarkably successful in materials science, providing novel scientific insights, guiding future laboratory experiments, and accelerating materials discovery. Despite the promising performance of these models, understanding the decisions they make is equally essential to ensuring the scientific value of their outcomes. However, there is a recent and ongoing debate about the diversity of explanations, which can lead to scientific inconsistency. This Perspective explores the sources and implications of these diverse explanations in ML applications for the physical sciences. Through three case studies in materials science and molecular property prediction, we examine how different models, explanation methods, levels of feature attribution, and stakeholder needs can result in varying interpretations of ML outputs. Our analysis underscores the importance of considering multiple perspectives when interpreting ML models in scientific contexts and highlights the critical need for scientists to maintain control over the interpretation process, balancing data-driven insights with domain expertise to meet specific scientific needs. By fostering a comprehensive understanding of these inconsistencies, we aim to contribute to the responsible integration of eXplainable artificial intelligence (XAI) into the physical sciences and to improve the trustworthiness of ML applications in scientific discovery.
KW - explanations
KW - physical science
KW - XAI
UR - http://www.scopus.com/inward/record.url?scp=86000279494&partnerID=8YFLogxK
DO - 10.1088/2632-2153/ad9137
M3 - Article
AN - SCOPUS:86000279494
SN - 2632-2153
VL - 6
JO - Machine Learning: Science and Technology
JF - Machine Learning: Science and Technology
IS - 1
M1 - 013002
ER -