Workshop on Gesture Recognition and Machine Learning for Real-time Musical Interaction


This three-hour workshop, presented by Nicholas Gillian and Rebecca Fiebrink, provides a hands-on guide to applying machine learning and gesture recognition to real-time musical interaction. Participants do not need any prior knowledge of machine learning or gesture recognition, as the fundamentals of these topics will be reviewed in the opening section of the workshop. The workshop is intended to be as hands-on as possible, giving participants the opportunity, using free open-source software, to experience for themselves the power of a number of machine learning algorithms. Participants are encouraged to bring their own sensor devices, laptops, and audio software, and will be able to train, test, and experiment with a number of machine learning algorithms for real-time gestural control of music. The workshop focuses on a high-level description of the core concepts of machine learning, giving participants the practical knowledge and essential skills they need to get the most out of any machine learning algorithm, without requiring deep knowledge of the underlying theory.

Organizer Bios

Nicholas Gillian is currently a postdoctoral fellow and Fulbright scholar working in the Responsive Environments Group at the Media Lab at MIT, where he researches the application of machine learning for the real-time recognition of musical gestures. His special interest is in how machine learning classification algorithms can be applied to enable a musician to interact with a computer using the same natural, expressive and aesthetic gestures that musicians commonly use to interact with other performers.

Nicholas completed his Ph.D., entitled Gesture Recognition for Musician Computer Interaction, in 2011 under the supervision of Sile O’Modhrain and R. Benjamin Knapp at the Sonic Arts Research Centre (SARC), Queen's University Belfast. Prior to working at MIT, Nicholas briefly worked on the EU research project SIEMPRE (Social Interaction and Entrainment using Music PeRformance Experimentation), designing multimodal synchronization protocols and software that could be used for the analysis of musical gestures. During his graduate studies, Nicholas also worked on the EU research project SKILLS, where he developed computational algorithms that could be used to automatically classify a user's skill.

Nicholas is the developer of the SEC, a free machine learning toolkit that is packaged as an extension to the open source graphical development environment EyesWeb. The SEC enables users to easily create their own gesture recognition applications, using a large number of sophisticated machine learning algorithms, without having to write a single line of code.

Rebecca Fiebrink is an assistant professor of Computer Science (also Music) at Princeton University, where she creates and studies new technologies for music creation and performance.  Her interests span music information retrieval, creating new computer music performance technologies, the design of new hardware and software musical interfaces, and creative human-computer interaction. In 2008, Rebecca created the Wekinator system for real-time interactive machine learning, which has since been used by a variety of researchers, professional musicians, and students to create gesturally-controlled musical instruments, performances, and installations. Her work with the Wekinator formed the basis for her PhD thesis, Real-time Human Interaction with Supervised Learning Algorithms for Music Composition and Performance, which she completed in 2010.

Rebecca is a Co-Director of the Princeton Laptop Orchestra, a member of the Sideband laptop ensemble, and a “computerless” performer of the flute and piano. She frequently collaborates with composers, artists, and musicians at Princeton and elsewhere on digital media projects. She has also worked on music research and software at companies including Microsoft Research, Sun Microsystems, Imagine Research, and Smule, Inc.

Target Audience

This workshop is targeted towards the general NIME community. The workshop will be restricted to a maximum of 30 participants. Participants do not require any prior knowledge of machine learning or gesture recognition, as the fundamentals of this area will be reviewed in the opening section of the workshop. This workshop is intended to be as hands-on as possible, providing participants with practical experience in building machine learning-based gestural controllers and instruments. Participants will specifically learn to use the Wekinator and SEC software in the hands-on component, but the experience of learning about and exploring standard algorithms will also benefit those who plan to use other tools in their work.

We anticipate that the workshop will be of interest to the following NIME attendees, among others: composers who are interested in creating interactive performances using commodity hardware such as Kinect; instrument builders who want to explore new, example-driven paradigms for creating gesture-sound mappings; educators who are seeking tools to help their students build new interactive systems without the need for extensive programming expertise; and students or others who are interested in learning more about machine learning for gesture analysis, and who desire practical and music-specific experience to complement the training that is offered by standard machine learning coursework and textbooks.

How to Participate

  1. Register online for NIME and for the workshop.

  2. Check back here in May for (free) software to install on your laptop to use at the workshop.

  3. Bring yourself and your laptop, and any of your own controllers and/or OSC-enabled software to experiment with at the workshop.
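For participants wondering what "OSC-enabled" means in practice: Open Sound Control (OSC) messages are small UDP packets with a simple binary layout (a null-padded address string, a type-tag string, then the arguments). The Python sketch below hand-encodes a minimal OSC message carrying float arguments; the address /wek/inputs and port 6448 are assumed here as the Wekinator's usual defaults rather than stated on this page, so check the workshop materials for the exact values.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))              # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode())  # type-tag string
    for v in floats:
        msg += struct.pack(">f", v)                     # big-endian float32
    return msg

# One frame of (hypothetical) sensor data, e.g. a 3-axis accelerometer:
packet = osc_message("/wek/inputs", 0.1, 0.5, 0.9)

# To actually send it (assumed Wekinator defaults: localhost, port 6448):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 6448))
```

In practice you would normally use a ready-made OSC library rather than encoding packets by hand; the point of the sketch is only to show that any controller or program that can emit such packets counts as "OSC-enabled."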

More Information

Read the original workshop proposal here.

Contact information:

Download Workshop Materials

New! Workshop slides: Available here

Wekinator materials: Available here

EyesWeb materials: Available here

ChucK: Download ChucK:

For Windows
For OS X
