Browsing by Subject "music information retrieval"


  • Wargelin, Matias (Helsingin yliopisto, 2021)
    Musical pattern discovery refers to the automated discovery of important repeated patterns, such as melodies and themes, from music data. Several algorithms have been developed to solve this problem, but evaluating them has been difficult without proper visualisations of their output. To address this issue, a web application named Mupadie was built. Mupadie accepts MIDI music files as input and visualises the outputs of musical pattern discovery algorithms, with implementations of SIATEC and TTWIA built into the application. Other algorithms can be visualised if their output is uploaded to Mupadie as a JSON file that follows a specified data structure. Using Mupadie, an evaluation of SIATEC and TTWIA was conducted. Mupadie proved a useful tool for the qualitative evaluation of these musical pattern discovery algorithms: it helped reveal systematically recurring issues with the discovered patterns, some previously known and some previously undocumented. The findings were then used to suggest improvements to the algorithms.
  • Salakka, Ilja (Helsingin yliopisto, 2019)
    Objectives: The socioemotional health benefits of music have been recognized for a long time. Especially the ability of music to evoke emotions has led researchers to pay attention to relationships between emotions and specific properties of music. Emotional intensity is also known to be linked to more efficient consolidation and recall of autobiographical memories. Music and autobiographical memories are known to be largely processed by the same neural system, especially in the medial prefrontal cortex. However, the relationship between musical properties and music-evoked autobiographical memories (MEAM) has not been studied before. The first research question of this study was whether certain acoustic (musical) features can explain the autobiographical salience of a song. The second research question was whether that relationship is mediated by subjective emotions evoked by the song, especially the intensity of evoked emotions. Methods: Participants (n = 113, 86 females) were healthy older adults aged between 60 and 86 years (M = 70.72, SD = 5.39). Participants listened to 70 song excerpts during the experiment and rated them on valence, arousal, emotional intensity, familiarity, and autobiographical memories evoked by the song. The musical features of the songs were extracted using music information retrieval (MIR) software, followed by principal component analysis. The relationship between musical features and listeners' ratings was assessed using regression analyses. Main results and conclusions: Lower pulse strength, brightness, and fluctuation in low-middle frequencies were the best predictors of higher autobiographical salience, familiarity, and emotional responses evoked by the songs. The intensity of emotions and, to a lesser extent, pleasantness had a mediating effect on the relationship between musical features and autobiographical salience. These results add to the still scarce knowledge about MEAMs in the context of specific musical features.
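The Wargelin abstract mentions that external algorithms can feed Mupadie via a JSON file following a specified data structure. That structure is not reproduced here, so the sketch below is purely illustrative: the field names `points` and `occurrences` are assumptions, not Mupadie's actual schema. It only shows the general shape such an export could take, since SIATEC-style algorithms typically represent a pattern as a set of (onset, pitch) points together with the translations at which the pattern recurs.

```python
import json

# Hypothetical pattern-discovery output; field names are illustrative only
# and do NOT reflect Mupadie's actual specified JSON structure.
pattern_output = [
    {
        # one discovered pattern as (onset in beats, MIDI pitch) points
        "points": [[0.0, 60], [1.0, 62], [2.0, 64]],
        # translations (time shift, pitch shift) at which the pattern recurs
        "occurrences": [[0.0, 0], [8.0, 0], [16.0, -5]],
    }
]

# Serialise in the way such a file might be produced for upload.
print(json.dumps(pattern_output, indent=2))
```

A real export would of course have to match the schema Mupadie documents; the point here is only that each pattern carries both its point set and its repeat locations.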
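The Salakka abstract describes a pipeline of feature extraction, principal component analysis, and regression of listener ratings on the components. The abstract does not name the MIR software or the exact feature set, so the following is a minimal sketch of that analysis shape on synthetic data: the feature matrix, rating vector, and number of retained components are all placeholder assumptions, implemented with plain NumPy (PCA via SVD, ordinary least squares for the regression).

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: 70 songs x 5 acoustic features (standing in for
# features such as pulse strength or brightness; not the study's data).
n_songs, n_features = 70, 5
X = rng.normal(size=(n_songs, n_features))

# Simulated listener ratings (e.g. autobiographical salience).
y = 3.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.2, size=n_songs)

# Principal component analysis via SVD of the centred feature matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance ratio per component
n_keep = 2                           # retain the first two components (assumption)
scores = Xc @ Vt[:n_keep].T          # component scores per song

# Ordinary least-squares regression of ratings on the component scores.
A = np.column_stack([np.ones(n_songs), scores])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

print("explained variance ratios:", np.round(explained, 3))
print("regression coefficients:", np.round(coef, 3))
```

A mediation analysis, as in the second research question, would extend this by additionally regressing salience on the emotion ratings and comparing the direct and indirect paths.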