The goal of this project is the development of a vibrotactile synthesis framework that provides performers with information about the internal state of a live electronics system. We propose to exploit the haptic modality as an alternative channel for information display. The use of live electronics, i.e. the live processing of sound during a performance, is common practice in contemporary music; artists use novel input devices and digital signal processing (DSP) techniques to conceive new forms of expression. An important issue in these contexts is the synchronization between the instrumental performance and the live electronics. A number of approaches have been developed to facilitate their interaction: besides score-following techniques (often avoided for reasons of reliability and complexity), a common technique is the use of simple on-stage input devices, such as MIDI foot pedals or switches, which allow the instrumental performer to control, and synchronize with, the real-time processing by sending discrete triggers to the system.
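The discrete-trigger technique described above can be sketched as a cue list that each pedal press steps through; the class and cue names below are illustrative assumptions, not the API of any particular system:

```python
class CueList:
    """Steps through named cues on each discrete trigger (sketch only)."""

    def __init__(self, cues):
        self.cues = list(cues)
        self.index = -1  # no cue active before the first trigger

    def trigger(self):
        """Called on each pedal press; advances to and returns the next cue."""
        if self.index < len(self.cues) - 1:
            self.index += 1
        return self.cues[self.index]

# Usage: each pedal press moves the electronics to the next section.
cue_list = CueList(["harmonizer_on", "delay_feedback_up", "freeze_buffer"])
print(cue_list.trigger())  # harmonizer_on
print(cue_list.trigger())  # delay_feedback_up
```

Note that the performer sending these triggers receives no confirmation from the code above; that missing return channel is precisely the problem addressed in this project.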
The CIRMMT Live Electronics Framework (CLEF) is a modular system for the composition and performance of live electronic music. It has been used for the realization of electroacoustic works at McGill, as well as in a number of CIRMMT projects, since 2009. Control parameters for DSP processes in CLEF are defined during the composition process and are typically triggered on stage by the performer using a generic input device. A common difficulty with this practice, however, is the lack of feedback to the performer about the state of the live electronics system. This often results in a sort of “limbo” in which, for a certain amount of time, the performer receives no feedback about the results of their interaction. To improve this situation, different approaches have been taken, such as providing on-stage visual or auditory feedback (requiring the performer to watch a screen or listen to a click track), or having an off-stage assistant in charge of controlling the electronics. These solutions can be problematic, though: delivering additional information via the visual and auditory channels often distracts the performer from the musical expression itself, while the presence of an external assistant may render the performer’s own interactions almost redundant.
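One way to close this feedback gap is to acknowledge each received trigger with a short vibrotactile pulse. The sketch below is a hypothetical illustration under our own assumptions, not part of CLEF: it synthesizes a brief sine burst with a raised-cosine envelope, using a 250 Hz carrier because that frequency lies near the peak vibration sensitivity of the skin.

```python
import math

def tactile_pulse(freq_hz=250.0, dur_s=0.08, sample_rate=44100):
    """Short sine burst with a raised-cosine amplitude envelope, suitable
    for driving a voice-coil actuator as a 'trigger received' notification.
    Returns a list of samples in [-1.0, 1.0]."""
    n = int(dur_s * sample_rate)
    samples = []
    for i in range(n):
        # Raised-cosine envelope: silent at both ends, peak in the middle.
        env = 0.5 * (1.0 - math.cos(2.0 * math.pi * i / (n - 1)))
        samples.append(env * math.sin(2.0 * math.pi * freq_hz * i / sample_rate))
    return samples
```

A pulse this short and smooth is easy to feel but does not bleed audibly into the performance, which is the point of moving the notification off the auditory channel.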
Our approach, then, is to investigate vibrotactile stimuli, using appropriate combinations of haptic actuators and signals, in order to provide a feedback system that is transparent to the user both in terms of physical obtrusiveness and of cognitive load. Our system is specific to live-electronics performance practice in that it conveys information about internal variables within CLEF (such as automation curves, algorithmic processes, and analysis parameters). Building on existing work on tactile actuators and the synthesis of tactile events, we are experimenting with different mapping strategies. We plan to validate the system qualitatively and quantitatively through surveys and experiments with performance students; an important aspect is the validation of the system in real-world performance contexts. The software will be integrated into CLEF, which will serve as a prototyping environment for tactile notifications in live electronics performance.
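As a minimal illustration of one such mapping strategy (the linear mapping and the parameter ranges are our own assumptions, not values prescribed by CLEF), a normalized internal variable, e.g. the current value of an automation curve, could be mapped to actuator amplitude and frequency:

```python
def map_to_vibration(value, amp_range=(0.0, 1.0), freq_range=(80.0, 250.0)):
    """Map a normalized internal parameter (0.0-1.0), such as a point on
    an automation curve, to an actuator (amplitude, frequency) pair.
    Ranges are illustrative defaults."""
    v = min(max(value, 0.0), 1.0)  # clamp to the expected range
    amp = amp_range[0] + v * (amp_range[1] - amp_range[0])
    freq = freq_range[0] + v * (freq_range[1] - freq_range[0])
    return amp, freq

# Usage: halfway through an automation ramp, the performer feels a
# medium-intensity vibration partway up the frequency range.
print(map_to_vibration(0.5))  # (0.5, 165.0)
```

Whether such a direct linear mapping is perceptually appropriate is exactly the kind of question the planned experiments with performance students are meant to answer.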