Healthc Inform Res. 2018 Oct;24(4):309-316. doi: 10.4258/hir.2018.24.4.309.

Arousal and Valence Classification Model Based on Long Short-Term Memory and DEAP Data for Mental Healthcare Management

Affiliations
  • 1Department of Computer Science, Graduate School, Sangmyung University, Seoul, Korea.
  • 2Department of Intelligent Engineering Informatics for Human, College of Convergence Engineering, Sangmyung University, Seoul, Korea. dkim@smu.ac.kr

Abstract


OBJECTIVES
Both the valence and arousal components of affect are important considerations in mental healthcare management because they are associated with affective and physiological responses. Research on arousal and valence analysis using deep learning with images, text, and physiological signals is actively underway, but further work is needed to improve the recognition rate. The goal of this research was to design a deep learning framework and model that classify arousal and valence, which indicate the intensity and the positive or negative degree of emotion, as high or low.
METHODS
The proposed arousal and valence classification model for analyzing the affective state was tested using the 40 channels provided by the Dataset for Emotion Analysis using Electroencephalography (EEG), Physiological, and Video Signals (the DEAP dataset). Experiments were based on 10 selected channels of central and peripheral nervous system data, using long short-term memory (LSTM) as the deep learning method.
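The abstract does not specify which 10 of the 40 DEAP channels were selected or the exact network profile, so the following is only a minimal sketch, not the authors' implementation. It assumes TensorFlow/Keras as the LSTM backend, one preprocessed DEAP subject file (s01.dat from the "data_preprocessed_python" release), a hypothetical 10-channel subset, and a median split (rating > 5) to label arousal as high or low.

```python
# Minimal sketch (not the authors' code): train a binary LSTM classifier for
# arousal (high vs. low) on one subject's preprocessed DEAP recordings.
import pickle
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

with open("s01.dat", "rb") as f:                    # one DEAP subject file
    subject = pickle.load(f, encoding="latin1")

data = subject["data"]        # shape (40 trials, 40 channels, 8064 samples)
labels = subject["labels"]    # shape (40, 4): valence, arousal, dominance, liking

channels = [0, 1, 2, 3, 16, 19, 32, 36, 37, 38]     # hypothetical 10-channel subset
x = data[:, channels, :].transpose(0, 2, 1)         # -> (trials, time steps, features)
x = x[:, ::32, :]                                   # downsample time axis for the sketch
y = (labels[:, 1] > 5).astype("float32")            # arousal rating 1-9 -> high (1) / low (0)

model = Sequential([
    LSTM(64, input_shape=(x.shape[1], x.shape[2])), # one LSTM layer over the raw sequence
    Dense(1, activation="sigmoid"),                 # binary high/low output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=10, batch_size=8, validation_split=0.2)
```

The same setup can be reused for valence by labeling with `labels[:, 0]` instead of `labels[:, 1]`; feeding the signals directly to the LSTM mirrors the paper's point that manual feature extraction is not required.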
RESULTS
Arousal and valence were classified and visualized on a two-dimensional coordinate plane. Model profiles were designed by varying the number of hidden layers, nodes, and hyperparameters according to the error rate. The experimental results show accuracies of 74.65% for the arousal classification model and 78% for the valence classification model. The proposed model performed better than previously reported models.
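As a simple illustration of how the two binary outputs can be placed on the two-dimensional valence-arousal plane (cf. Figure 3), the sketch below assigns each prediction pair to one of four quadrants; the example emotion words are illustrative and are not taken from the paper.

```python
# Map a (valence, arousal) high/low prediction pair to a quadrant of the
# two-dimensional affect plane, as in the paper's coordinate-plane visualization.
def quadrant(valence_high: bool, arousal_high: bool) -> str:
    if arousal_high and valence_high:
        return "high arousal / positive valence (e.g., excited)"
    if arousal_high and not valence_high:
        return "high arousal / negative valence (e.g., distressed)"
    if not arousal_high and valence_high:
        return "low arousal / positive valence (e.g., relaxed)"
    return "low arousal / negative valence (e.g., depressed)"

print(quadrant(valence_high=True, arousal_high=False))
# -> low arousal / positive valence (e.g., relaxed)
```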
CONCLUSIONS
The proposed model appears to be effective in analyzing arousal and valence; specifically, it is expected that affective analysis using physiological signals based on LSTM will be possible without manual feature extraction. In a future study, the classification model will be adopted in mental healthcare management systems.

Keyword

Arousal and Valence Analysis; Supervised Machine Learning; Classification; Machine Learning; DEAP Dataset

MeSH Terms

Arousal*
Classification*
Dataset
Delivery of Health Care*
Electrocardiography
Learning
Machine Learning
Memory, Short-Term*
Methods
Peripheral Nervous System
Supervised Machine Learning

Figure

  • Figure 1 Example of the DEAP dataset.

  • Figure 2 Long short-term memory (LSTM) model for emotion classification.

  • Figure 3 Experimental results using a coordinate plane.

  • Figure 4 The accuracy results of the arousal classification and valence classification model.

  • Figure 5 Accuracy comparison of emotion classification models.

