Workshop for AIMC 2024 Conference
Hands-on workshop on PRiSM's Music Gesture Recognition tool
This hands-on workshop provides an immersive introduction to the PRiSM Music Gesture Recognition (MGR) tool, a state-of-the-art platform designed for the instantaneous recognition of musical gestures from live audio inputs. Here, 'musical gestures' are defined as a class of short musical/compositional material with shared pitch, rhythm, and timbre characteristics. The PRiSM MGR tool originated as an extension of Professor George Lewis's Voyager recogniser and presents an exciting avenue for enhancing real-time machine listening and fostering creative applications in music. The success of "Forager" (Lewis, 2022) stands as an example of the potential of MGR. Utilising machine learning, the MGR tool facilitates the creation and augmentation of datasets, allowing for the classification of these musical gestures.
In the workshop, participants will engage with the PRiSM MGR software in a practical setting, learning to create and expand their own datasets of musical gestures. The workshop will provide a clear understanding of how these datasets are augmented and used to train machine learning models capable of nuanced gesture classification. In hands-on sessions, attendees will build their own gesture libraries, iterating through the cycle of training, validating, and refining models to improve their accuracy. This direct interaction with the tool will equip participants to carry these skills into dynamic performance and compositional practices, and to gain insight into how AI can be integrated with musical creativity.
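To make the dataset-to-model workflow concrete, the following is a minimal illustrative sketch only, not the PRiSM MGR implementation: it assumes a hypothetical folder layout of gestures/<label>/<clip>.wav and uses the librosa and scikit-learn libraries to summarise each recorded clip as MFCC statistics and train a simple classifier.

```python
# Illustrative sketch only -- not the PRiSM MGR tool's actual code.
# Assumes a hypothetical layout: gestures/<gesture-label>/<clip>.wav
import glob
import os

import joblib
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def extract_features(path):
    """Summarise a short audio clip as a fixed-length MFCC statistics vector."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    # Mean and standard deviation over time give a compact per-clip summary.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


features, labels = [], []
for path in glob.glob("gestures/*/*.wav"):
    features.append(extract_features(path))
    labels.append(os.path.basename(os.path.dirname(path)))  # folder name = gesture label

X_train, X_test, y_train, y_test = train_test_split(np.array(features), labels, test_size=0.2)
clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

joblib.dump(clf, "gesture_model.joblib")  # hypothetical filename, reused in the live sketch below
```

In the workshop itself, dataset creation, augmentation, and training are handled inside the PRiSM MGR tool; the sketch above only mirrors the underlying idea of turning labelled audio clips into a trained classifier.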
Contextual Review of Practice Field: Gesture recognition in music represents a burgeoning field that intertwines music performance, composition, and AI technology. The PRiSM MGR tool stands at this intersection, offering a unique platform for musicians and composers to explore and implement gesture-based controls in real-time music performance settings.
Workshop Methodology: Participants will start with an introduction to the software, followed by a practical session on recording and creating their own gesture samples. They will then train a gesture recognition model and implement it in a live musical context. The workshop will balance instruction, demonstration, and hands-on practice, ensuring participants have the opportunity to engage directly with the technology. Additionally, the workshop will be structured to cater to a diverse range of skill levels, and attendees will leave with a foundational understanding of how gesture recognition can be creatively applied in the realm of music AI.
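As a companion to the sketch above, the live-classification step of the practical session could resemble the following. Again, this is a hedged illustration rather than the tool's own code: it assumes the hypothetical gesture_model.joblib produced in the previous sketch and uses the sounddevice library to capture short windows of audio for classification.

```python
# Illustrative real-time loop -- not the PRiSM MGR tool's actual code.
# Assumes the hypothetical gesture_model.joblib trained in the previous sketch.
import joblib
import librosa
import numpy as np
import sounddevice as sd

SR = 22050
WINDOW_SECONDS = 1.0  # hypothetical analysis window per gesture

clf = joblib.load("gesture_model.joblib")


def classify_window(audio, sr=SR):
    """Classify one window of live audio using the same MFCC summary as training."""
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    features = np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])
    return clf.predict([features])[0]


while True:
    # Record one window of live input, then classify it.
    audio = sd.rec(int(SR * WINDOW_SECONDS), samplerate=SR, channels=1, dtype="float32")
    sd.wait()
    label = classify_window(audio.flatten())
    print("detected gesture:", label)  # a performer or patch could respond to this label
```

In a performance setting, the printed label would instead trigger electronics, visuals, or other compositional responses; the workshop demonstrates this within the PRiSM MGR environment rather than through hand-written scripts.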
Hands-on Nature: The hands-on nature of this workshop will be emphasised through interactive, participatory sessions that are distinct from traditional presentations. Participants will be directly involved in the following:
Active Learning: Rather than passively receiving information, attendees will actively engage with the PRiSM MGR tool and receive real-time feedback.
Live Demonstrations: The contributor will provide live demonstrations of musical gesture dataset creation, model training, and production processes, followed by participant practice sessions.
Practical Application: Time will be allotted for participants to apply learned skills to their own musical projects, with the contributor providing individualised support.
Hongshuo Fan, Centre for Practice & Research in Science & Music (PRiSM), Royal Northern College of Music (RNCM)
Bio: Hongshuo Fan 范弘硕 is a Chinese cross-disciplinary composer, new media artist, and creative programmer. He has extensive experience in creating real-time interactive multimedia works combining acoustic instruments, live electronics, generative visuals, light, and body movements, and he is particularly interested in the fusion of traditional culture and cutting-edge technology. Hongshuo holds a PhD in Electroacoustic Composition from the University of Manchester and is the Research Software Engineer at RNCM PRiSM.
The workshop is proposed as a half-day session, approximately 4 hours in length. This includes setting up, introductory presentations, hands-on activities, breaks, and a final Q&A segment.
Technical Equipment Provided:
A laptop with the PRiSM MGR software pre-installed.
Access to the PRiSM MGR tool's open-source GitHub repository, including the source code and a compiled version of the tool (for macOS Ventura 13).
USB-C Digital AV Multiport Adapter.
Technical Equipment Required:
A PA system suitable for the venue.
Projector or screen for demonstrations.
Participants will need to bring their own laptops.
Participants are encouraged to bring their own instruments and headphones for personal use.
Setup Details:
A small table, chair, and power outlets for each participant.
A high-speed Wi-Fi connection.
A quiet environment to facilitate clear audio capture.
https://github.com/rncm-prism/PRiSM-MusicGestureRecognition
The organisers of the "Hands-on Workshop on PRiSM's Music Gesture Recognition Tool" are committed to upholding the highest ethical standards in accordance with the guiding principles of accessibility, inclusion, and sustainability. The software has been open-sourced to ensure broad access to our workshop materials. We are dedicated to maintaining data privacy and addressing socio-economic disparities by offering this educational opportunity free of charge or at a reduced cost to eligible applicants.
Regarding the use of human data in the PRiSM MGR tool, we adhere to strict protocols for informed consent and ethical participant selection. The workshop does not involve any harmful procedures, and participants are free to withdraw at any point. We acknowledge financial and non-financial contributions that have made this workshop possible, ensuring that all such engagements are ethical and do not compromise the integrity of the research and learning outcomes.