TY - JOUR
T1 - Supervised-learning guarantee for quantum AdaBoost
AU - Wang, Yabo
AU - Wang, Xin
AU - Qi, Bo
AU - Dong, Daoyi
N1 - Publisher Copyright:
© 2024 American Physical Society.
PY - 2024/11
Y1 - 2024/11
N2 - In the noisy intermediate-scale quantum (NISQ) era, the capabilities of variational quantum algorithms are greatly constrained by the limited number of qubits and the shallow depth of quantum circuits. We may view these variational quantum algorithms as weak learners in supervised learning. Ensemble methods are general approaches in machine learning for combining weak learners to construct a strong one. In this paper, focusing on classification, we theoretically establish and numerically verify a learning guarantee for quantum adaptive boosting (AdaBoost). The supervised-learning risk bound describes how the prediction error of quantum AdaBoost on binary classification decreases as the number of boosting rounds and the sample size increase. We further empirically demonstrate the advantages of quantum AdaBoost on a 4-class classification task. Quantum AdaBoost not only outperforms several other ensemble methods, but in the presence of noise it can also surpass the ideally noiseless but unboosted primitive classifier after only a few boosting rounds. Our work indicates that in the current NISQ era, introducing appropriate ensemble methods is particularly valuable for improving the performance of quantum machine learning algorithms.
AB - In the noisy intermediate-scale quantum (NISQ) era, the capabilities of variational quantum algorithms are greatly constrained by the limited number of qubits and the shallow depth of quantum circuits. We may view these variational quantum algorithms as weak learners in supervised learning. Ensemble methods are general approaches in machine learning for combining weak learners to construct a strong one. In this paper, focusing on classification, we theoretically establish and numerically verify a learning guarantee for quantum adaptive boosting (AdaBoost). The supervised-learning risk bound describes how the prediction error of quantum AdaBoost on binary classification decreases as the number of boosting rounds and the sample size increase. We further empirically demonstrate the advantages of quantum AdaBoost on a 4-class classification task. Quantum AdaBoost not only outperforms several other ensemble methods, but in the presence of noise it can also surpass the ideally noiseless but unboosted primitive classifier after only a few boosting rounds. Our work indicates that in the current NISQ era, introducing appropriate ensemble methods is particularly valuable for improving the performance of quantum machine learning algorithms.
UR - https://www.scopus.com/pages/publications/85209391985
U2 - 10.1103/PhysRevApplied.22.054001
DO - 10.1103/PhysRevApplied.22.054001
M3 - Article
AN - SCOPUS:85209391985
SN - 2331-7019
VL - 22
JO - Physical Review Applied
JF - Physical Review Applied
IS - 5
M1 - 054001
ER -