This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ PrePrints) and either DOI or URL of the article must be cited.
Emotion identification, which aims to determine a person's affective state automatically, has immense potential value in many areas, such as action-tendency analysis, health care, psychological assessment, and human-computer (robot) interaction. In this paper, we present a novel method for identifying emotion from natural walking. After obtaining three-axis acceleration data of the wrist and ankle recorded by a smartphone, we apply a moving average filter with varying window size w and then segment the data into slices. From each slice, 114 features are extracted, and principal component analysis (PCA) is used for feature selection. We train SVM, Decision Tree, Multilayer Perceptron, Random Tree, and Random Forest classification models, and compare the accuracy of emotion identification across the two datasets (wrist vs. ankle) and across models. Results show that acceleration data from the ankle yields better emotion-identification performance than data from the wrist. Among the models, SVM achieves the highest accuracy: 90.31% when distinguishing anger from neutral, 89.76% when distinguishing happiness from neutral, and 87.10% when distinguishing anger from happiness. The three-class model for anger/neutral/happiness achieves accuracies of 85%, 78%, and 78%, respectively. These results show that we can identify people's emotional states from walking gait with high accuracy.
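The pipeline described above (smoothing, slicing, feature extraction, PCA, classification) can be sketched roughly as follows. This is an illustrative sketch only: the window size, slice length, synthetic data, and the simple per-slice statistics are assumptions for demonstration, not the paper's exact 114-feature design.

```python
# Hypothetical sketch of the described pipeline using scikit-learn.
# The feature set here (mean/std/min/max per axis) is a simplified
# stand-in for the 114 features used in the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def moving_average(x, w):
    """Smooth each axis of an (n_samples, 3) acceleration array
    with a moving average filter of window size w."""
    kernel = np.ones(w) / w
    return np.apply_along_axis(
        lambda a: np.convolve(a, kernel, mode="same"), 0, x)

def slice_features(x, slice_len):
    """Cut the signal into non-overlapping slices and extract
    simple statistics from each slice (one feature row per slice)."""
    n = (len(x) // slice_len) * slice_len
    slices = x[:n].reshape(-1, slice_len, x.shape[1])
    return np.concatenate(
        [slices.mean(1), slices.std(1), slices.min(1), slices.max(1)],
        axis=1)

rng = np.random.default_rng(0)
# Synthetic stand-in for acceleration recordings of two states;
# real data would come from the smartphone sensors.
neutral = moving_average(rng.normal(0.0, 1.0, (2000, 3)), w=5)
angry = moving_average(rng.normal(0.0, 2.0, (2000, 3)), w=5)

X = np.vstack([slice_features(neutral, 128), slice_features(angry, 128)])
y = np.array([0] * 15 + [1] * 15)  # 15 slices per state

# PCA for dimensionality reduction, then an SVM classifier.
clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC())
clf.fit(X, y)
print(clf.score(X, y))
```

In practice the classifiers would be evaluated with cross-validation on held-out walking data rather than training accuracy as shown here.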