Machine listening and learning for musical systems

  • Location:
Pompeu Fabra University (Room 52.321, 3rd floor), Roc Boronat 158, Barcelona, Spain

Musical artificial intelligences are playing an increasingly important role in new composition and performance systems. Critical to enhancing the capabilities of such machine musicians are listening facilities that model human audition, and machine learning able to match the minimum 10,000 hours, or roughly ten years, of intensive practice accumulated by expert human musicians. Future musical agents will cope with multiple rehearsals and concert tours, or take on multiple commissions, potentially working over long musical lifetimes; they may be virtuoso performers and composers credited in their own right, or powerful musical companions and assistants to human musicians.

In this presentation we'll meet a number of projects related to these themes. The concert system LL, an experiment in listening and learning, will be introduced through works for drummer and computer and for electric violin and computer. Autocousmatic, an algorithmic composer for electroacoustic music which incorporates machine listening in its critic module, will also be presented. Large-corpus content analysis from music information retrieval shows great promise when adapted to concert systems and automated composers, and the SuperCollider library SCMIR will be demonstrated, alongside a new realtime polyphonic pitch tracker.
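
To give a flavour of the SCMIR workflow ahead of the demonstration, here is a minimal offline analysis sketch. It follows the usage pattern in the SCMIR help files; the file path is a placeholder, and the exact feature keys and method names should be checked against your installed version of the library.

    // Minimal SCMIR sketch: offline feature extraction from one audio file.
    // The path is a placeholder; feature keys and method names follow the
    // SCMIR help files and may vary by version.
    (
    var file;
    // request 13 MFCCs and a 12-bin chromagram per analysis frame
    file = SCMIRAudioFile("/path/to/recording.wav", [[\MFCC, 13], [\Chromagram, 12]]);
    file.extractFeatures;   // run the analysis over the whole file
    file.extractOnsets;     // additional onset detection pass
    // featuredata holds the frames as a flat array,
    // typically numframes * (13 + 12) values here
    [file.numframes, file.featuredata.size].postln;
    )

Feature frames of this kind are the raw listening data that concert systems and a critic module such as Autocousmatic's can build on.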

Nick Collins is a composer, performer and researcher in the field of computer music. He lectures at the University of Sussex, running the music informatics degree programmes and research group. Research interests include machine listening, interactive and generative music, and audiovisual performance. He co-edited The Cambridge Companion to Electronic Music (Cambridge University Press, 2007) and The SuperCollider Book (MIT Press, 2011), and wrote Introduction to Computer Music (Wiley, 2009). iPhone apps include RISCy, TOPLAPapp, Concat and BBCut, and PhotoNoise for iPad.