Tuesday, November 19 • 15:30 - 16:00
Poster: Embodied interaction with sound


Our poster discusses the use of gestural body movement for interaction with music software. While interaction with music software and virtual instruments has traditionally relied on Graphical User Interfaces (GUIs) and/or hardware MIDI controllers, we examine the distinctive interactions and interfaces found in recent body-movement controllers. Some of these controllers are designed for performance contexts, making the performer's influence on the music visible to the audience. Others aim to ensure a high degree of expression in use, or to make music creation more accessible to everyone. The idea of body-motion-based interaction with electronic instruments is far from new; the Theremin, for example, dates back to 1920. Today, interfaces that use mid-air body gestures to control music applications continue to appear, including the MI.MU Gloves, the Wave MIDI Ring, the Xbox Kinect, and the Leap Motion Controller. These interfaces are often accompanied by software tools that let users design their own gestural mappings to audio parameters.
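
As a rough illustration of the kind of gesture-to-parameter mapping such tools provide (a minimal sketch, not taken from the poster or any of the products above), the C++ snippet below converts a hypothetical normalized hand-height reading into a 7-bit MIDI Control Change value; a real controller SDK would supply the tracking value from its own API.

    // Sketch: map a normalized gesture value to a MIDI CC value.
    // The gesture source here is hypothetical (0.0 = lowest, 1.0 = highest).
    #include <algorithm>
    #include <cstdint>
    #include <iostream>

    // Convert a normalized gesture value into a 7-bit MIDI CC value (0-127).
    std::uint8_t gestureToCC(double normalizedHeight)
    {
        normalizedHeight = std::clamp(normalizedHeight, 0.0, 1.0);
        return static_cast<std::uint8_t>(normalizedHeight * 127.0 + 0.5);
    }

    int main()
    {
        // Pretend the tracker reported the hand at 62% of its vertical range;
        // report the result as CC 1 (mod wheel) on MIDI channel 1.
        const double handHeight = 0.62; // hypothetical tracker reading
        std::cout << "Control Change: channel 1, CC 1, value "
                  << static_cast<int>(gestureToCC(handHeight)) << '\n';
        return 0;
    }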

Speakers

Balandino Di Donato

Lecturer in Creative Computing, University of Leicester

Tim Arterbury

CEO, TesserAct Music Technology LLC / Baylor University
I am a graduate student pursuing a Master's degree in computer science and researching human-computer interaction with music software. My latest project, MoveMIDI, uses in-air body movements to control music software. See more at MoveMIDI.com. I do a lot of C++ and JUCE coding and I have a passion for making music...


Tuesday November 19, 2019 15:30 - 16:00 GMT
Newgate Room, Puddle Dock, London EC4V 3DB