EmotionBox: A music-element-driven emotional music generation system based on music psychology

  • Kaitong Zheng
  • Ruijie Meng
  • Chengshi Zheng
  • Xiaodong Li
  • Jinqiu Sang*
  • Juanjuan Cai
  • Jie Wang
  • Xiao Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

With the development of deep neural networks, automatic music composition has made great progress. Although emotional music can evoke different auditory perceptions in listeners, few studies have focused on generating emotional music. This paper presents EmotionBox, a music-element-driven emotional music generator based on music psychology that can compose music conveying a specified emotion without requiring a dataset labeled with emotions, as previous methods do. In this work, the pitch histogram and note density are extracted as features representing mode and tempo, respectively, to control musical emotion. Specific emotions are mapped from these features through Russell's psychological model. Subjective listening tests show that EmotionBox achieves competitive performance in generating music with different emotions, and significantly better performance in generating music with low-arousal emotions, especially peacefulness, compared with an emotion-label-based method.
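The feature-to-emotion mapping described in the abstract (pitch histogram as a proxy for mode/valence, note density as a proxy for tempo/arousal, combined through Russell's circumplex model) can be sketched roughly as follows. This is an illustrative assumption, not the authors' implementation: the tonic estimate, the major-vs-minor-third valence heuristic, the `density_threshold` value, and all function names are hypothetical.

```python
# Hypothetical sketch of a Russell-style emotion mapping from two music
# features: a 12-bin pitch-class histogram (valence/mode proxy) and
# note density in notes per second (arousal/tempo proxy).

MAJOR_THIRD, MINOR_THIRD = 4, 3  # semitone intervals above the tonic


def pitch_class_histogram(pitches):
    """Normalized 12-bin histogram over MIDI pitch classes."""
    hist = [0.0] * 12
    for p in pitches:
        hist[p % 12] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]


def note_density(num_notes, duration_seconds):
    """Notes per second, used here as a crude arousal proxy."""
    return num_notes / max(duration_seconds, 1e-9)


def russell_quadrant(pitches, num_notes, duration_seconds,
                     density_threshold=4.0):
    """Map (valence, arousal) estimates to a Russell-model quadrant.

    Valence heuristic: positive if the major third above an estimated
    tonic outweighs the minor third (major-ish mode); the threshold of
    4 notes/sec separating high from low arousal is an assumption.
    """
    hist = pitch_class_histogram(pitches)
    tonic = hist.index(max(hist))  # crude tonic: most frequent pitch class
    valence = (hist[(tonic + MAJOR_THIRD) % 12]
               - hist[(tonic + MINOR_THIRD) % 12])
    arousal = note_density(num_notes, duration_seconds) - density_threshold
    if arousal >= 0:
        return "happy" if valence >= 0 else "tense"
    return "peaceful" if valence >= 0 else "sad"
```

For example, a fast C-major arpeggio (six notes in one second) lands in the high-arousal, positive-valence quadrant ("happy"), while a sparse A-minor fragment lands in the low-arousal, negative-valence quadrant ("sad").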

Original language: English
Article number: 841926
Journal: Frontiers in Psychology
Volume: 13
DOIs
State: Published - 29 Aug 2022
Externally published: Yes

Keywords

  • auditory perceptions
  • deep neural networks
  • emotional music generation
  • music element
  • music psychology
