The MUSICAL-MOODS project aims to develop an online database of music scores, lyrics, audio excerpts, vector-based 3D animations, and dance video recordings, indexed by mood. This taxonomy of relations among the musical, linguistic, and motion domains is intended for interactive music systems and music making. To build the database, digital scores including lyrics are being gathered from public music collections. Mood classification based on audio and metadata will target sophisticated features while using no explicit domain-specific knowledge about any particular mental state.
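The classification approach described above can be illustrated with a minimal sketch: generic audio descriptors are computed and a model is learned from labelled examples, with no hand-coded rules about mental states. All function names, feature choices, and mood labels below are illustrative assumptions, not the project's actual method.

```python
# Hypothetical sketch: mood classification from generic audio features.
# No domain-specific knowledge about moods is encoded; the nearest-centroid
# model learns associations purely from labelled data.
import math

def features(signal):
    """Two generic descriptors: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / (len(signal) - 1)
    return (rms, zcr)

def train_centroids(labelled):
    """Average the feature vectors per mood label (illustrative labels)."""
    sums, counts = {}, {}
    for signal, mood in labelled:
        f = features(signal)
        s = sums.setdefault(mood, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[mood] = counts.get(mood, 0) + 1
    return {m: (s[0] / counts[m], s[1] / counts[m]) for m, s in sums.items()}

def classify(signal, centroids):
    """Assign the mood whose centroid is nearest in feature space."""
    f = features(signal)
    return min(centroids,
               key=lambda m: (f[0] - centroids[m][0]) ** 2
                           + (f[1] - centroids[m][1]) ** 2)

# Toy signals: a loud, fast oscillation vs. a quiet, slow one.
energetic = [math.sin(2 * math.pi * 20 * t / 100) for t in range(100)]
calm = [0.1 * math.sin(2 * math.pi * 2 * t / 100) for t in range(100)]
model = train_centroids([(energetic, "energetic"), (calm, "calm")])
print(classify([0.9 * x for x in energetic], model))  # → energetic
```

In practice such a system would use far richer features (spectral, timbral, and metadata-derived) and a stronger learner, but the principle is the same: the mood taxonomy emerges from data rather than from explicit psychological modelling.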
Datasets are being built through a cross-modal approach. The model is validated by combining results from an online game-with-a-purpose for Internet users with results from intermedia case studies involving selected dancers. In further case studies, musical works will be created, including by invited artists, to evaluate the database in interactive music making. An online call for artists to use the database in music making or sound generation will extend this evaluation further. The final database will be made available online for further exploitation. This research will generate new knowledge for next-generation systems of interactive music and music emotion recognition, and will also contribute to extending investigation in the broader areas of music making, computational creativity, and information retrieval.