Control Strategies for a Human-Conducted Quadcopter Ballet

Participants: Ian Hattwick, Michael Di Perna, Michael El-Jiz
Marcelo M. Wanderley, Luis Rodriguez (supervisors)

Funding: CIRMMT

Project Type: Student Grant Project

Time Period: June 2014 - May 2015

Status: Completed



Project Description

Gestures have been explored in the realm of sound synthesis and sound control, and they have also been used to advance robotic manipulation. This project explored mapping real-time music and gestures to quadcopter movements and trajectories, creating our interpretation of a human-conducted robotic ballet. Our research goals were:

1) Identifying choreographic elements appropriate to the musical genre. What kinds of movements can be drawn from choreography for human performers? In what ways will the demands of quadcopter choreography differ from those of quadcopter movements for other purposes? These differing demands may highlight strengths or weaknesses of existing control algorithms, or even suggest new ones.

2) Identifying and mapping useful human gestures to quadcopter choreography. Which parameterized representations of human musical performance gestures are useful as input to quadcopter choreographic control? Does it make sense for the same gestural representations to control both sound and choreography, or do the two output modalities require different representations? How can higher-level representations such as energy level and mood be mapped to quadcopter choreography? What kinds of input parameters will the quadcopter control algorithms expect?

3) Investigating more advanced control and modeling techniques that allow the quadcopter to react in a desired way to both musical and gestural inputs. These include system identification, to obtain an accurate model for control, and optimal control techniques, to achieve desired trajectories and positions; a minimal sketch of the latter follows this list.
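To make the third goal concrete, below is a minimal sketch of the kind of optimal control technique it refers to: an LQR position controller for a quadcopter's vertical dynamics linearized about hover. The model matrices, cost weights, and the step-command scenario are illustrative assumptions, not the project's actual identified model or controller.

# A minimal sketch (illustrative, not the project's controller): LQR
# altitude tracking for a quadcopter linearized about hover.
import numpy as np
from scipy.linalg import solve_continuous_are

# Assumed linearized vertical dynamics: state x = [altitude, climb rate],
# input u = normalized thrust deviation from hover thrust.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Assumed quadratic cost weights: penalize position error more than effort.
Q = np.diag([10.0, 1.0])
R = np.array([[0.1]])

# Solve the continuous-time algebraic Riccati equation and form the
# optimal state-feedback gain K, so that u = -K (x - x_ref).
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P

# Simulate tracking a step change in commanded altitude (e.g. a
# choreographic "rise" cue) with simple Euler integration.
x = np.array([0.0, 0.0])      # start on the ground, at rest
x_ref = np.array([1.5, 0.0])  # commanded hover height in metres
dt = 0.01
for _ in range(500):
    u = -K @ (x - x_ref)
    x = x + dt * (A @ x + B @ u)
print(f"altitude after 5 s: {x[0]:.2f} m")

In a gesture-driven setting, the reference x_ref would be updated in real time from the mapped musical or gestural input, while the same feedback gain continues to stabilize the vehicle around the moving target.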


Publications


More Information