Emotion recognition based on customized smart bracelet with built-in accelerometer
- Published
- Accepted
- Subject Areas
- Kinesiology, Psychiatry and Psychology, Computational Science
- Keywords
- Emotion recognition, customized smart bracelet, accelerometer, HCI, SVM
- Copyright
- © 2016 Zhang et al.
- Licence
- This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ PrePrints) and either DOI or URL of the article must be cited.
- Cite this article
- 2016. Emotion recognition based on customized smart bracelet with built-in accelerometer. PeerJ PrePrints 4:e1650v1 https://doi.org/10.7287/peerj.preprints.1650v1
Abstract
Background. In recent years, artificial intelligence (AI) has become an important field, in which making computers understand human thinking is a critical topic. If computers could perceive and respond to a person's non-verbal cues, such as emotions, communication between humans and computers would be friendlier and more natural. Consequently, more and more researchers are working on recognizing everyday human emotions from wearable sensor signals, which has many applications in health care and Human-Computer Interaction (HCI). Methods. In this paper, we propose an emotion recognition method based on a customized smart bracelet with a built-in accelerometer. The bracelets can be worn on the ankle and wrist. First, acceleration data are collected from the ankle and wrist while the person walks naturally. Because the raw acceleration data are noisy and variable, a Moving Average Filter is applied to suppress noise. Moreover, since walking is a repetitive movement of the legs and arms, the collected acceleration data are largely redundant. We therefore design a sliding window that divides the whole recording into data slices of equal size, with neighboring slices partially overlapping; in subsequent processing, each slice is treated as one sample. This not only avoids the high computational cost caused by redundant data but also increases the number of samples. From each data slice, 114 features relevant to emotion recognition are extracted, and Principal Component Analysis (PCA) is then applied to select effective attributes. Finally, we build the emotion recognition classifiers on the Weka software platform. Taking the same attributes as input, we compare the emotion recognition performance of several classical classifiers, including Support Vector Machine (SVM), Decision Tree, Random Tree and Random Forest. Results. 
The classification accuracy is used to evaluate the effectiveness of the proposed emotion recognition method. Overall, SVM outperforms the other classifiers. The two-category classification accuracies for neutral vs. anger, neutral vs. happiness and happiness vs. anger are 91.3%, 88.5% and 88.5%, respectively. The accuracy of multi-category classification among neutral, happiness and anger is 81.2%. Discussion. In the comparative experiments, the recognition rates for all emotion states are above 81%. We conclude that gait is capable of revealing a person's affective state.
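The pipeline described in the abstract (moving-average smoothing, overlapping sliding windows, per-window features, PCA, then an SVM) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the window width, overlap ratio, toy feature set (the paper extracts 114 features), synthetic data, and the use of scikit-learn in place of Weka are all assumptions for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def moving_average(signal, k=5):
    """Smooth a 1-D acceleration trace with a length-k moving average."""
    kernel = np.ones(k) / k
    return np.convolve(signal, kernel, mode="same")

def sliding_windows(signal, width=128, overlap=0.5):
    """Cut the trace into equal-size slices; neighboring slices overlap."""
    step = int(width * (1 - overlap))
    return np.stack([signal[i:i + width]
                     for i in range(0, len(signal) - width + 1, step)])

def extract_features(window):
    """Toy stand-in for the paper's 114 features: a few summary statistics."""
    return np.array([window.mean(), window.std(), window.min(),
                     window.max(), np.abs(np.diff(window)).mean()])

rng = np.random.default_rng(0)
# Placeholder acceleration traces for two emotion classes (synthetic data).
signals = [rng.normal(loc=m, scale=1.0, size=4000) for m in (0.0, 0.6)]

X, y = [], []
for label, sig in enumerate(signals):
    for w in sliding_windows(moving_average(sig)):  # each slice = one sample
        X.append(extract_features(w))
        y.append(label)
X, y = np.array(X), np.array(y)

# PCA selects effective attributes; an RBF-kernel SVM classifies them.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
pca = PCA(n_components=3).fit(X_tr)
clf = SVC(kernel="rbf").fit(pca.transform(X_tr), y_tr)
acc = accuracy_score(y_te, clf.predict(pca.transform(X_te)))
print(f"two-class accuracy: {acc:.3f}")
```

Note that overlapping windows increase the sample count, as the abstract points out, but windows cut from the same recording are correlated, so a real evaluation should split train and test data by subject or by recording rather than by window.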
Author Comment
This is a submission to PeerJ for review.