Tuesday, November 19 • 12:30 - 14:00
Demo: MoveMIDI - 3D positional movement interaction with a user-defined virtual interface for music software


MoveMIDI Description
MoveMIDI is software that lets the user create digital music through body movement. It communicates with 3D tracking hardware such as the PlayStation Move to translate the user’s positional body movements into MIDI messages that music creation software like Ableton Live can interpret. With MoveMIDI, a user could “air-drum” their drum samples, or sweep a filter by sweeping an arm across the body. In a performance context, MoveMIDI lets electronic music performers convey their actions to an audience through large body movements, and dancers could use it to create music through dance. Since many acoustic instruments require spatial movement of the body to play, the spatial interaction MoveMIDI promotes may help users interact with music software much as they interact with acoustic instruments, leveraging preexisting knowledge. This spatial familiarity may also help an audience interpret a performer’s actions.
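The core idea of translating a positional movement into a MIDI message can be illustrated with a short sketch. This is not MoveMIDI’s actual code; the function names and calibration bounds are hypothetical, and it simply normalizes a hand position along one axis into a 7-bit Control Change value, as in the filter-sweep example above.

```cpp
#include <algorithm>
#include <cstdint>

// Map a hand position along one tracking axis to a 7-bit MIDI CC value
// (0-127), e.g. for sweeping a filter cutoff with an arm gesture.
// axisMin/axisMax are hypothetical calibration bounds of reachable space.
std::uint8_t positionToControlValue(float position, float axisMin, float axisMax)
{
    float normalized = (position - axisMin) / (axisMax - axisMin);
    normalized = std::clamp(normalized, 0.0f, 1.0f);
    return static_cast<std::uint8_t>(normalized * 127.0f + 0.5f);
}

// Pack the three bytes of a MIDI Control Change message
// (status byte 0xB0 | channel, then controller number and value).
struct MidiCC { std::uint8_t status, controller, value; };

MidiCC makeControlChange(std::uint8_t channel, std::uint8_t controller,
                         std::uint8_t value)
{
    return { static_cast<std::uint8_t>(0xB0 | (channel & 0x0F)),
             controller, value };
}
```

A host application would feed tracked coordinates into `positionToControlValue` each frame and forward the resulting message to a MIDI output that software like Ableton Live listens on.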

MoveMIDI allows the user to construct and interact with a virtual 3D instrument interface, which is played by moving the body relative to the virtual interface elements. The user can customize this interface’s layout, size, and functionality. The current implementation displays the virtual 3D interface on a computer screen, while visualization via a head-mounted display is in development.
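A user-customizable interface of this kind boils down to an editable collection of positioned elements. The sketch below shows one plausible data model; the field names are illustrative assumptions, not MoveMIDI’s actual types.

```cpp
#include <cstdint>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };

// One user-defined element of the virtual 3D interface.
struct InterfaceElement
{
    Vec3 centre;            // position in tracking space
    float radius;           // user-adjustable size
    std::uint8_t midiNote;  // note sent when the element is triggered
    std::string label;      // name shown in the on-screen 3D view
};

// The layout is simply an editable list of elements, so the user can add,
// move, and resize zones to build a personal instrument.
using Layout = std::vector<InterfaceElement>;
```

Customizing the interface then amounts to ordinary list edits: adding an element, dragging its `centre`, or scaling its `radius`.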

MoveMIDI won the 2018 JUCE Award and was published as a “Late-Breaking Work” paper, poster, and interactive demonstration at the ACM CHI 2019 conference in Glasgow.

See MoveMIDI.com for more information.

Demonstration Outline
The demonstration begins with a one- to two-minute explanation of MoveMIDI. Next, MoveMIDI’s two main modes are demonstrated: Hit Mode and Morph Mode. Hit Mode lets a user strike virtual, spherical trigger points called Hit Zones; when a Hit Zone is hit, it triggers a musical note or sample by sending a MIDI note message. Morph Mode lets a user manipulate many timbral characteristics of audio simultaneously by moving their arms within a predefined 3D Morph Zone; movements in this zone send a different MIDI Control Change message per 3D axis. Next, a one-minute performance using MoveMIDI is given. Finally, audience members are invited to try MoveMIDI for themselves: a volunteer is handed the controllers and may experiment and create music. This process repeats as attendees move between demonstrations.
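The geometry behind the two modes can be sketched briefly. Assuming a spherical Hit Zone and a box-shaped Morph Zone (the function names below are hypothetical, not MoveMIDI’s API), Hit Mode reduces to a point-in-sphere test and Morph Mode to normalizing each axis of the hand position into its own Control Change value.

```cpp
#include <algorithm>
#include <cstdint>

struct Vec3 { float x, y, z; };

// Hit Mode (sketch): the controller tip "hits" a spherical Hit Zone when it
// enters the sphere. A real implementation would also track entry/exit so a
// zone fires one note per strike rather than continuously.
bool isInsideHitZone(const Vec3& tip, const Vec3& centre, float radius)
{
    const float dx = tip.x - centre.x;
    const float dy = tip.y - centre.y;
    const float dz = tip.z - centre.z;
    return dx * dx + dy * dy + dz * dz <= radius * radius;
}

// Morph Mode (sketch): each axis of the Morph Zone drives its own CC value,
// so a single gesture changes up to three parameters at once.
std::uint8_t axisToCC(float p, float lo, float hi)
{
    const float n = std::clamp((p - lo) / (hi - lo), 0.0f, 1.0f);
    return static_cast<std::uint8_t>(n * 127.0f + 0.5f);
}

void morphZoneToCCs(const Vec3& hand, const Vec3& zoneMin, const Vec3& zoneMax,
                    std::uint8_t out[3])
{
    out[0] = axisToCC(hand.x, zoneMin.x, zoneMax.x);
    out[1] = axisToCC(hand.y, zoneMin.y, zoneMax.y);
    out[2] = axisToCC(hand.z, zoneMin.z, zoneMax.z);
}
```

Each frame, the tracker position is tested against every Hit Zone and, when inside a Morph Zone, converted into three simultaneous CC messages.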


Tim Arterbury

CEO, TesserAct Music Technology LLC / Baylor University
I am a graduate student pursuing a Master's degree in computer science and researching human-computer interaction with music software. My latest project, MoveMIDI, uses in-air body movements to control music software. See more at MoveMIDI.com. I do a lot of C++ and JUCE coding and I have a passion for making music...

Tuesday November 19, 2019 12:30 - 14:00
Newgate Room Puddle Dock, London EC4V 3DB
