By SUGURU GOTO
ABSTRACT
This paper introduces a system that combines “BodySuit” and “RoboticMusic,” along with its possibilities and its uses in an artistic application. “BodySuit” refers to a data-suit-type gesture controller. “RoboticMusic” refers to percussion robots designed as a humanoid type. In this paper, I discuss the aesthetics and concept of the system, as well as the idea of the “Extended Body.”
INTRODUCTION
The system introduced in this paper combines a gesture controller with automated mechanical instruments. In this system, the data suit “BodySuit” controls the percussion robots “RoboticMusic” in real time. “BodySuit” does not rely on a hand-held controller: a performer, for example a dancer, wears the suit, and sensors transform his or her gestures into electronic signals. “RoboticMusic” consists of five robots that play different kinds of percussion instruments, and their movements are modeled on the gestures of a percussionist.
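To make the gesture-to-robot mapping concrete, the following is a minimal sketch of how a single sensor reading from the suit might trigger a strike on one robot. The paper does not specify the implementation; the sensor name, threshold values, OSC addresses, and network settings below are illustrative assumptions, not the actual system.

```python
# Hypothetical sketch: map one bend-sensor reading from the suit to a strike
# command for one percussion robot over OSC (assumed transport, not the
# authors' documented protocol).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.0.10", 9000)  # assumed robot controller address

STRIKE_THRESHOLD = 0.7   # normalized sensor value that triggers a hit (assumed)
previous_value = 0.0

def on_sensor_reading(value: float) -> None:
    """Handle one incoming elbow-bend reading, normalized to 0.0-1.0."""
    global previous_value
    # Trigger a strike on a rising edge through the threshold, and scale the
    # strike velocity by how quickly the gesture crossed it.
    if previous_value < STRIKE_THRESHOLD <= value:
        velocity = min(1.0, (value - previous_value) * 10.0)
        client.send_message("/robot/1/strike", velocity)
    previous_value = value
```

In such a mapping, the performer’s continuous movement is reduced to discrete strike events with a velocity, which is one simple way a bodily gesture could be translated into a percussive action by a robot.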
The idea behind combining “BodySuit” and “RoboticMusic” is that the human body is augmented by electronic signals so that it can play musical instruments interactively. The system was originally conceived within an art project, in order to realize a performance/musical-theater composition.
This paper introduces the system and discusses the possibilities that have emerged from my experience of using it in an artistic application.