Jens Madsen

Researcher: Jens Madsen
Affiliation: Section for Cognitive Systems, Technical University of Denmark
Location: Denmark
Position: PhD Student
Interest(s): Machine learning, music information retrieval
Databases: Google Scholar, Scopus, Twitter (cogniemotion)
Link(s): http://www.dtu.dk/Service/Telefonbog/Person?id=37248&cpid=109240&tab=2&qt=dtupublicationquery

Jens Madsen is a PhD student at the Section for Cognitive Systems, Technical University of Denmark, supervised by Jan Larsen (DTU Compute). He has worked on analyzing emotions expressed in music, e.g., for use in recommendation systems in music information retrieval.

Madsen gives the following scientific answers to the question of why people listen to music:

  1. to regulate their emotional state
  2. to gain self-awareness and a sense of identity
  3. for social bonding/relatedness

Madsen has worked with a number of different features extracted from music: pitch, chroma (chromagram), mel, loudness, beat, and tempo, among others.
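
The sketch below illustrates how such features could be extracted with the open-source librosa library; the file name, parameter values, and the mean-pooling at the end are illustrative assumptions, not details of Madsen's own pipeline.

  # Illustrative sketch (not Madsen's pipeline): extracting pitch, chroma, mel,
  # loudness and tempo features with librosa; file and parameters are placeholders.
  import librosa
  import numpy as np

  y, sr = librosa.load("excerpt.wav")                 # hypothetical musical excerpt

  f0 = librosa.yin(y, fmin=65, fmax=2093)             # frame-wise pitch estimate (Hz)
  chroma = librosa.feature.chroma_stft(y=y, sr=sr)    # chromagram
  mel = librosa.feature.melspectrogram(y=y, sr=sr)    # mel spectrogram
  rms = librosa.feature.rms(y=y)                      # RMS energy as a loudness proxy
  tempo, beats = librosa.beat.beat_track(y=y, sr=sr)  # tempo (BPM) and beat frames

  # Summarise each frame-wise feature by its mean to get one vector per excerpt.
  features = np.concatenate([
      [f0.mean()],
      chroma.mean(axis=1),
      mel.mean(axis=1),
      [rms.mean()],
      np.atleast_1d(tempo),
  ])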

In his work, emotions are modeled with the two-dimensional core affect model, with valence and arousal as the two dimensions.
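
For illustration, a rating in that model can be stored as a (valence, arousal) pair and mapped to a quadrant of the core affect plane; the scale midpoint and the quadrant labels below are assumptions for the example, not taken from Madsen's work.

  # Illustrative only: classify an assumed 1-9 (valence, arousal) rating into a
  # quadrant of the two-dimensional core affect plane.
  def core_affect_quadrant(valence, arousal, midpoint=5.0):
      if valence >= midpoint and arousal >= midpoint:
          return "happy/excited"    # high valence, high arousal
      if valence < midpoint and arousal >= midpoint:
          return "angry/tense"      # low valence, high arousal
      if valence < midpoint:
          return "sad/depressed"    # low valence, low arousal
      return "calm/relaxed"         # high valence, low arousal

  print(core_affect_quadrant(7, 8))  # -> happy/excited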

For his PhD project he worked with 200 musical excerpts presented to 14 participants, who rated valence and arousal on a 9-point Likert scale. He also investigated the confidence effect and let participants rate on both an absolute scale and a relative scale. For predicting valence and arousal from musical features he used Gaussian processes.
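
Madsen's models learn from pairwise comparisons, but as a rough illustration the sketch below uses plain Gaussian process regression on direct valence ratings with scikit-learn; the feature matrix, ratings, train/test split, and kernel choice are placeholder assumptions.

  # Sketch only: Gaussian process regression from audio feature vectors to valence.
  # Plain GP regression stands in for Madsen's pairwise-comparison formulation,
  # and the data below is random placeholder input.
  import numpy as np
  from sklearn.gaussian_process import GaussianProcessRegressor
  from sklearn.gaussian_process.kernels import RBF, WhiteKernel

  rng = np.random.default_rng(0)
  X = rng.normal(size=(200, 20))     # 200 excerpts x 20 audio features (placeholder)
  y = rng.uniform(1, 9, size=200)    # valence ratings on a 9-point scale (placeholder)

  kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
  gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
  gp.fit(X[:150], y[:150])           # fit on the first 150 excerpts

  mean, std = gp.predict(X[150:], return_std=True)  # predictive mean and uncertainty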

Work

  1. Modeling expressed emotions in music using pairwise comparisons
  2. Modeling temporal structure in music for emotion prediction using pairwise comparisons
  3. Predictive modeling of expressed emotions in music using pairwise comparisons
  4. Towards predicting expressed emotion in music from pairwise comparisons

Theses

  1. Predicting the emotions expressed in music