TY - GEN
T1 - An independent approach to training classifiers on physiological data
T2 - 25th International Conference on Neural Information Processing, ICONIP 2018
AU - Hossain, Md Zakir
AU - Gedeon, Tom D.
N1 - Publisher Copyright:
© 2018, Springer Nature Switzerland AG.
PY - 2018
Y1 - 2018
N2 - Training neural networks and other classifiers on physiological signals poses challenges beyond more traditional datasets, because the training data include data points that are not independent: most obviously, more than one sample can come from the same human subject. Standard cross-validation as implemented in many AI tools gives artificially high results because the shared human subject is not taken into account. Some papers in the literature handle this by using leave-one-subject-out cross-validation. We argue that this is not sufficient, and introduce our independent approach, leave-one-subject-and-one-stimulus-out cross-validation. We demonstrate the approach with KNN, SVM and NN classifiers and their ensemble, using an extended example of physiological recordings from subjects observing genuine versus posed smiles, two kinds of nice smiles that people find hard to differentiate reliably. Using three physiological signals, 20 video stimuli and 24 observers/participants, we achieve 96.1% correct results in a truly robust fashion.
AB - Training neural networks and other classifiers on physiological signals poses challenges beyond more traditional datasets, because the training data include data points that are not independent: most obviously, more than one sample can come from the same human subject. Standard cross-validation as implemented in many AI tools gives artificially high results because the shared human subject is not taken into account. Some papers in the literature handle this by using leave-one-subject-out cross-validation. We argue that this is not sufficient, and introduce our independent approach, leave-one-subject-and-one-stimulus-out cross-validation. We demonstrate the approach with KNN, SVM and NN classifiers and their ensemble, using an extended example of physiological recordings from subjects observing genuine versus posed smiles, two kinds of nice smiles that people find hard to differentiate reliably. Using three physiological signals, 20 video stimuli and 24 observers/participants, we achieve 96.1% correct results in a truly robust fashion.
KW - Affective computing
KW - Independent approach
KW - Observers
KW - Physiological features
KW - Smilers
UR - http://www.scopus.com/inward/record.url?scp=85059080181&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-04179-3_53
DO - 10.1007/978-3-030-04179-3_53
M3 - Conference contribution
SN - 9783030041786
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 603
EP - 613
BT - Neural Information Processing - 25th International Conference, ICONIP 2018, Proceedings
A2 - Leung, Andrew Chi Sing
A2 - Ozawa, Seiichi
A2 - Cheng, Long
PB - Springer Verlag
Y2 - 13 December 2018 through 16 December 2018
ER -
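
Note: the leave-one-subject-and-one-stimulus-out protocol described in the abstract can be sketched as below. This is a minimal illustrative sketch, not the authors' code; the array names X, y, subj, stim and the KNN choice are assumptions for demonstration only.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def leave_one_subject_and_one_stimulus_out_cv(X, y, subj, stim,
                                               make_clf=lambda: KNeighborsClassifier(n_neighbors=3)):
    """Accuracy under leave-one-subject-and-one-stimulus-out cross-validation.

    X: (n_samples, n_features) feature matrix
    y: (n_samples,) class labels (e.g. genuine vs posed smile)
    subj: (n_samples,) subject/observer id for each sample
    stim: (n_samples,) stimulus (video) id for each sample
    """
    correct, total = 0, 0
    for s in np.unique(subj):
        for v in np.unique(stim):
            test_mask = (subj == s) & (stim == v)
            if not test_mask.any():
                continue
            # Training data excludes every sample from the held-out subject
            # AND every sample of the held-out stimulus, so neither the person
            # nor the video in the test fold is ever seen during training.
            train_mask = (subj != s) & (stim != v)
            clf = make_clf()
            clf.fit(X[train_mask], y[train_mask])
            pred = clf.predict(X[test_mask])
            correct += int((pred == y[test_mask]).sum())
            total += int(test_mask.sum())
    return correct / total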