Talk at ICIT within the Colloquium Series

I have been invited by Prof. Kojiro Umezaki to give a talk within the ICIT Colloquium Series on January 12, 2016. This will be my first talk on the project at UCI, and we will begin disclosing some research details. I will present results from my most recent stretch of research and how these led to Musical-Moods, with a specific focus on the musical aspects of the project. Prof. Lisa Pearl and Prof. John Crawford will join the talk to describe their respective involvement in the project.

Tuesday, January 12, 2016
3:30-4:50pm
Contemporary Arts Center 3201 (Colloquium Room)
Presenter: Fabio Paolizzo

*** OPEN TO ALL ***

How can interactive music systems create novel music, support human music-making and comprehension, as well as enhance the audience experience?

This talk provides an overview of a new research program in which UCI is partnering with the University of Rome Tor Vergata. We have been awarded approximately $270,000 in funding from the European Commission to bring a visiting scholar from Italy, Dr. Fabio Paolizzo, to UCI for a period of two years to carry out a project titled “A mood-indexed database of scores, lyrics, musical excerpts, vector-based 3D animations, and dance video recordings” (short title “Musical-Moods”), which builds on his previous research in electroacoustic music, music cognition, and computational creativity.

The Musical-Moods project aims to develop a mood-indexed, multimodal database for use in next-generation interactive media systems. The research will draw on tools and methods from a broad range of disciplines, including the sciences (cognitive science, human-computer interaction, machine learning, natural language processing, signal processing), the arts (music, dance, motion capture, and 3D animation), and the humanities (musicology, history of music, philosophy). In addition to its exciting potential as an enabling platform for intermedia performance, other applications include user profiling for media industries, improved access to musical heritage, audio-on-demand services, education and training activities, and music therapy.
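
As a purely illustrative aside (the announcement does not describe the database design), a mood-indexed, multimodal database could in principle be sketched along the following lines; all table and column names here are hypothetical and stand in for whatever schema the project ultimately adopts:

```python
import sqlite3

# Hypothetical, minimal schema for a mood-indexed multimodal database.
# The real Musical-Moods schema is not specified in this announcement;
# every name below is illustrative only.
conn = sqlite3.connect("musical_moods_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS item (
    item_id   INTEGER PRIMARY KEY,
    modality  TEXT NOT NULL,   -- e.g. 'score', 'lyrics', 'audio_excerpt',
                               --      '3d_animation', 'dance_video'
    uri       TEXT NOT NULL    -- location of the media file or document
);

CREATE TABLE IF NOT EXISTS mood_label (
    item_id    INTEGER REFERENCES item(item_id),
    mood       TEXT NOT NULL,  -- e.g. 'joyful', 'melancholic', 'tense'
    confidence REAL            -- e.g. crowd agreement or classifier score
);
""")

# Example query: retrieve all audio excerpts annotated with a given mood.
rows = conn.execute(
    "SELECT i.uri FROM item i JOIN mood_label m ON i.item_id = m.item_id "
    "WHERE i.modality = 'audio_excerpt' AND m.mood = ?",
    ("joyful",),
).fetchall()
```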

In the School of the Arts, Dr. Paolizzo will work with faculty and students on various aspects of the research, including applications of motion capture technologies using our Vicon system and the development of new tools and methods for using Prof. John Crawford’s Active Space intermedia performance framework in conjunction with the multimodal database. Project activities will be centered in the Performance Capture Studio in the Contemporary Arts Center, a 2,000 sq. ft. dance studio featuring a 30-camera Vicon motion capture system coupled with a green-screen digital video capture environment. The Performance Capture Studio is dedicated to research and development of advanced technologies for representing human movement.

In the Department of Cognitive Sciences, Dr. Paolizzo will work with Prof. Lisa Pearl on the automatic classification of emotional content from text, focusing specifically on music lyrics, which may either match the emotional content of the music they accompany or provide a counterpoint to it. Identifying the emotional content of the lyrics separately from that of the music provides useful information about how the overall emotional content of a piece is conveyed. This project involves (1) adapting existing wisdom-of-the-crowd approaches to create a music lyrics dataset that covers a variety of emotions, and (2) applying linguistically informed machine learning techniques to that dataset to automatically identify the emotional content of the lyrics.
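
As a rough sketch of what step (2) might look like at its simplest, the baseline below treats lyric excerpts as plain text and learns mood labels from them. The data, labels, and features are toy placeholders, not the project's actual dataset or its linguistically informed models:

```python
# Minimal baseline sketch: classify the emotional content of lyrics.
# The lyrics and mood labels below are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

lyrics = [
    "sunlight on the water, we dance until the morning",
    "empty rooms and silent phones, the winter never ends",
    "run faster, louder, hearts on fire tonight",
    "I carry all these shadows and they will not let me sleep",
]
moods = ["happy", "sad", "excited", "anxious"]  # hypothetical mood labels

# Word n-grams stand in for richer, linguistically informed features.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(lyrics, moods)

print(model.predict(["alone again beneath the grey november sky"]))
```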

For directions and further information, please visit the announcement page on the UCI website.
