2 03 2009


At a NASA-sponsored conference on Human Systems I reminded the audience of the impact that a photo of the earthrise over a moonscape had on the perception of our planetary condition. The making of this image marked an important moment in the history of human experience. I suggested that a similar event may mark the first voyage to Mars, when the blue planet fades into the background of stars before the red one becomes prominent. The sense of profound isolation may not be pleasant but should make for an interesting moment of reflection, one similar to, though I expect orders of magnitude greater than, the moment when mariners first ventured out of sight of land. These experiences have value to our culture in that they shape our understanding of ourselves. Much of what can be learned about extreme environments will be in the form of data: measurements that we can compare to others we have made in order to shape an understanding of the new in relation to the known. Some have suggested that extreme environments, such as those found in extraterrestrial, undersea or polar settings, should be interrogated by robotic and remote-sensing techniques rather than by human exploration and habitation. While these techniques are capable of providing representations that can be understood intellectually, they are incapable of providing a direct experience. Others argue that human beings are the most robust and versatile autonomous control systems available and must be included on missions for that reason. But beyond functionality and instrumentality, arguments that will in any case be continuously eroded by technological innovation, I argue for the irreplaceability of human presence in extreme environments on the grounds of human experience.

However, there is a contradiction here. Extreme environments, as noted by Louis Bec (2007), do not exist a priori but depend upon the relationship between an environment and the organism in question. We count as extreme those that are hostile to life, and we are able to venture into them only by virtue of our technological interventions. We participate to the extent that we can remain within a protective technological bubble. These technologies reduce or eliminate the experience of the extreme conditions even as they protect the organism from them. But can technologies be developed to open extreme environments to experience rather than shielding us from them? I believe that prototype devices have already been developed that show how this can be accomplished. Perceptual prostheses of the kind described here will enable the direct perception of hostile conditions from within the technological womb. While humans are physiologically capable of experiencing many salient features of their terrestrial environment, this may not be the case for extreme and alien environments. These environments may require the immediate awareness of other spectra or conditions by means of technologically mediated perception. Prosthetic perception may become a key enabling technology for the habitation of extreme conditions, in addition to providing the principal justification for a human presence in them.
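To make the idea of technologically mediated perception concrete, a minimal sketch of the kind of remapping such a prosthesis might perform: an out-of-band quantity is translated into a range humans can perceive directly, here as an audible pitch. The choice of ionizing radiation as the input, the sensor range, and the pitch mapping are all illustrative assumptions, not a description of any actual device.

```python
import math

def radiation_to_pitch(dose_rate, lo=0.1, hi=1000.0,
                       f_min=220.0, f_max=3520.0):
    """Map a hypothetical dose-rate reading (uSv/h) onto a pitch in Hz.

    A logarithmic mapping on both sides suits the sensor's wide dynamic
    range and human pitch perception, where equal frequency ratios are
    heard as equal steps.
    """
    dose_rate = min(max(dose_rate, lo), hi)  # clamp to the assumed sensor range
    t = math.log(dose_rate / lo) / math.log(hi / lo)  # normalize to 0..1
    return f_min * (f_max / f_min) ** t
```

Under this mapping, background levels sit near the bottom of the audible band and each hundredfold increase in dose rate is heard as the same interval upward, so the wearer perceives the condition continuously rather than reading it off an instrument.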



2 03 2009



This paper is intended to introduce a system that combines “BodySuit” and “RoboticMusic,” as well as its possibilities and its uses in an artistic application. “BodySuit” refers to a gesture controller in the form of a data suit. “RoboticMusic” refers to a set of percussion robots of a humanoid type. In this paper, I will discuss their aesthetics and concept, as well as the idea of the “Extended Body”.


The system introduced in this paper contains both a gesture controller and automated mechanical instruments. In this system, the data suit, “BodySuit,” controls the percussion robots, “RoboticMusic,” in real time. “BodySuit” doesn’t contain a hand-held controller: a performer, for example a dancer, wears the suit, and gestures are transformed into electronic signals by its sensors. “RoboticMusic” contains five robots that play different sorts of percussion instruments. The movement of the robots is based upon the gestures of a percussionist.

The idea behind combining “BodySuit” and “RoboticMusic” is that a human body is augmented by electronic signals so that it can play musical instruments interactively. This system was originally conceived in an art project to realize a performance/musical-theater composition.
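The gesture-to-robot pipeline described above can be sketched in a few lines. Everything here is an illustrative assumption rather than the system's actual protocol: the sensor names, the threshold-based trigger, the MIDI-style 1-127 velocity range, and the routing table from body sensors to individual robots are all invented for the sketch.

```python
def map_gesture_to_strike(bend_value, threshold=0.6, max_velocity=127):
    """Map a normalized bend-sensor reading (0.0-1.0) to a strike velocity.

    Returns None below the trigger threshold; otherwise the reading above
    the threshold is rescaled into 1..127, so larger gestures strike harder.
    """
    if bend_value < threshold:
        return None
    span = 1.0 - threshold
    velocity = int(1 + (bend_value - threshold) / span * (max_velocity - 1))
    return min(velocity, max_velocity)

# Hypothetical routing: one robot per limb sensor.
ROBOT_FOR_SENSOR = {
    "right_wrist": "robot_1_snare",
    "left_wrist": "robot_2_cymbal",
}

def process_frame(frame):
    """Turn one frame of sensor readings into (robot, velocity) commands."""
    commands = []
    for sensor, value in frame.items():
        velocity = map_gesture_to_strike(value)
        if velocity is not None:
            commands.append((ROBOT_FOR_SENSOR[sensor], velocity))
    return commands
```

The essential design point is continuous, real-time mapping: each frame of suit data is translated immediately into robot commands, so the dancer's body, not a hand-held device, is the instrument's interface.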

This paper introduces this system as well as the possibilities it suggests, drawing on my experiences of applying it artistically.



1 03 2009



In 2005 an international, multi-disciplinary, inter-institutional group of researchers began a three-year research project that is attempting to use evolutionary and adaptive systems methodology (genetic algorithms, neural networks, etc.) to make an embodied robot that can exhibit creative behaviour by making marks or drawing (in the most general sense). The research is popularly known as the DrawBots Project. The research group is composed of computer and cognitive scientists, philosophers, artists, art theorists and historians. One outcome of the project will be a large-scale art installation of a group of DrawBots. Other outcomes will include various research publications reflecting the vested interests of the members both as independent researchers and as a group.

There are a number of motivations for the project, including the production of machine-created art and the exploration of whether it is possible to develop (minimally) creative artificial agents. The research has two mutually dependent contextual frameworks. One concerns methodologies for making an agent that has the potential for manifesting autonomous creative behaviour; the second concerns methodologies for recognising such behaviour. Another emphasis is on placing this work in an art-historical context. Amongst the key concepts that the project is examining are: personality, autonomy, value, signature, purpose, novelty, embodiment, social context, environmental interaction, ownership, and so on.
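A toy illustration of the evolutionary methodology the project names: genomes encode simple mark-making behaviour and are evolved against a fitness function by selection and mutation. The genome encoding (a list of grid moves) and the fitness measure (how many distinct cells the pen visits) are invented here purely to show the mechanism; they are not the project's actual criteria for creative behaviour, which is precisely what the DrawBots research interrogates.

```python
import random

def trace(genome, steps_per_gene=5):
    """Interpret a genome of (dx, dy) moves as the set of grid cells a pen visits."""
    x = y = 0
    visited = {(0, 0)}
    for dx, dy in genome:
        for _ in range(steps_per_gene):
            x += dx
            y += dy
            visited.add((x, y))
    return visited

def fitness(genome):
    """Toy fitness: reward genomes whose path marks many distinct cells."""
    return len(trace(genome))

def mutate(genome, rate=0.2, rng=random):
    """Randomly replace some genes with fresh random moves."""
    return [((rng.choice([-1, 0, 1]), rng.choice([-1, 0, 1]))
             if rng.random() < rate else gene)
            for gene in genome]

def evolve(pop_size=20, genome_len=10, generations=30, rng=None):
    """Elitist genetic algorithm: keep the best half, refill with mutants."""
    rng = rng or random.Random(0)
    pop = [[(rng.choice([-1, 0, 1]), rng.choice([-1, 0, 1]))
            for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(g, rng=rng) for g in elite]
    return max(pop, key=fitness)
```

The sketch makes the project's central difficulty visible: the loop optimises whatever `fitness` rewards, so any "creativity" it exhibits is only as interesting as the criterion supplied to it, which is why the second framework (recognising creative behaviour) matters as much as the first.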



28 02 2009


“Our present global crisis is more profound than any previous historical crises; hence our solutions must be equally drastic. I propose that we should adopt the plant as the organizational model for life in the 21st century, just as the computer seems to be the dominant mental/social model of the late twentieth century, and the steam engine was the guiding image of the nineteenth century.” (McKenna, 1992)

As a botanical parallel to the oft-misunderstood field of HCI (Human-Computer Interaction), HPI (Human-Plant Interaction) explores the nature of the surfaces and processes required to facilitate mutually beneficial interaction between humans and plants. HPI necessarily takes a symbiotic approach, being shaped by the questions it poses, such as: how can this two-way interface be realised? What assumptions are we making with regard to how we understand humans and plants? Do we need individual, specialised interfaces for each species, language or alkaloid, or are there more general approaches? How would they work? Where, or what, is the point of contact between the humans and plants? How do we make the transition from machinic to organic? From Boolean logic systems to systemic ecologic? What changes are required, and what further changes would occur in the plants or humans using such interfaces? How does the nature of time, place and metabolic byproducts differ on each side of these interfaces? Are they reconcilable, or even mutually explicable? What can we learn from each other? How can we form a closer symbiosis and better understanding between the human and vegetable kingdoms once we open the gates between them? Communication, or pollination?

Illustration 1: Tree Woman. Drawing by Lina Kusaite, 2007
