He started from a contrarian response to the basic characteristics of the computer. Instead of being logical, he wanted to be intuitive. He wanted things to be bodily engaged. The experience should be intimate.
Put the computer out into physical space. Enter into a space with the computer.
He did a piece called Reflexions in 1983. He made an 8×8 pixel digital camera running at 30 fps, which was quite good for the time.
He made a piece that required very jerky movement. The computer could understand those movements, and he made algorithms based on them. This was not accessible to other people; he had internalised the computer’s requirements.
In 1987 he made a piece called Very Nervous System. He found it easier to work with amateur dancers because they don’t have a pre-defined agenda of how they want to move, which lets them find a middle space between themselves and the computer.
The dancer dances to the sound, and the system responds to the dancer’s movement. This creates a feedback loop. The real delay is not just the framerate, but the speed of the mind. It reinforces the particular aspects of the person within the system.
The system seemed to anticipate his movements, because consciousness lags movement by about 100 milliseconds; we mentally float behind our actions. At 30 fps the system can respond within roughly 33 milliseconds, before the mover is consciously aware of having moved, which is why the response feels anticipatory.
This made him feel like time was sculptable.
He did a piece called Measure in 1991. The only sound source was a ticking clock, but it was transformed based on user movement near the clock and the shape of the gallery space. He felt he was literally working with space and time.
He began to think of analysing the camera data as if it were a sound signal. Time-domain analysis could extract interesting information, and responsive sound behaviours could respond to different parts of the movement spectrum: fast movements were high frequency and drove one instrument, mid-speed movements the midrange, and slow movements the low frequencies.
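A minimal sketch of that idea (not Rokeby’s actual code): treat the total inter-frame change of the low-resolution camera as a one-dimensional signal, then split it into slow, mid, and fast components with simple one-pole low-pass filters. The class name, filter coefficients, and band boundaries are illustrative assumptions.

```python
import numpy as np

class MovementBands:
    """Sketch: treat per-frame motion from an 8x8 camera as a signal and
    split it into slow/mid/fast components with one-pole low-pass filters.
    Coefficients and the band split are illustrative guesses."""

    def __init__(self, slow_alpha=0.02, mid_alpha=0.2):
        # Smaller alpha = slower (lower-frequency) band.
        self.slow_alpha = slow_alpha
        self.mid_alpha = mid_alpha
        self.prev_frame = None
        self.slow = 0.0   # low-frequency (slow movement) component
        self.mid = 0.0    # everything up to the mid cutoff

    def update(self, frame):
        """frame: 8x8 array of pixel intensities for the current video frame.
        Returns (slow, mid, fast) movement energies for this frame."""
        frame = np.asarray(frame, dtype=float)
        if self.prev_frame is None:
            self.prev_frame = frame
            return 0.0, 0.0, 0.0

        # Total inter-frame change is the raw "movement" signal.
        motion = np.abs(frame - self.prev_frame).sum()
        self.prev_frame = frame

        # One-pole low-pass filters isolate the slower components.
        self.slow += self.slow_alpha * (motion - self.slow)
        self.mid += self.mid_alpha * (motion - self.mid)

        fast = motion - self.mid         # what the mid filter misses = fast movement
        mid_band = self.mid - self.slow  # between the two cutoffs
        return self.slow, mid_band, fast
```

Each band’s energy could then drive its own instrument, matching the mapping above: fast to one instrument, mid to the midrange, slow to the low end.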
With just very blocky 8×8 pixels, he got more responsiveness than a Kinect seems to have now. Of course, this is on the computer’s terms rather than tracking each finger, etc.
There is no haptic response. This means that if you throw yourself at a virtual drum, your body has to provide counterforce, using isometric muscular tension. The virtual casts a real shadow into the interacting body.
Proprioception: How the body imagines its place within a virtual context.
This sense changes in response to an interface. Virtual spaces create an artificial state of being.
He did a piece called Dark Matter in 2010, using several cameras to track a large space and defining several “interactive zones.” He used his iPhone to map out a virtual sculpture that people could sonically activate by touching it. He ran it in pitch dark, using IR cameras to track people.
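A minimal sketch of what “interactive zones” might look like in code, assuming zones are axis-aligned boxes in the tracking coordinate space and that tracked body positions arrive as (x, y, z) points; the Zone class, its fields, and the example zone names are hypothetical, not the piece’s actual structure.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """An axis-aligned box in tracked space; the name stands in for the
    sound it triggers. Illustrative only."""
    name: str
    min_xyz: tuple  # (x, y, z) lower corner, in the tracking system's units
    max_xyz: tuple  # (x, y, z) upper corner

    def contains(self, p):
        return all(lo <= v <= hi
                   for v, lo, hi in zip(p, self.min_xyz, self.max_xyz))

def active_zones(zones, tracked_points):
    """Names of zones currently 'touched' by any tracked point
    (e.g., body positions estimated from the IR cameras)."""
    return {z.name for z in zones for p in tracked_points if z.contains(p)}

# Example: two zones of a virtual sculpture and one tracked person.
zones = [
    Zone("low_drone", (0.0, 0.0, 0.0), (1.0, 2.0, 1.0)),
    Zone("bell",      (2.0, 0.0, 0.0), (3.0, 2.0, 1.0)),
]
print(active_zones(zones, [(0.5, 1.0, 0.5)]))  # -> {'low_drone'}
```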
After spending time in the installation, he began to feel a physical imbalance: it felt like he was moving heavy things, but he wasn’t. To an outside observer it can look like a neurological disorder. The performer performs to the interface, navigating an impenetrable internal space; the audience can see it as an esoteric ritual.
This was a lot like building mirrors. He got kind of tired of doing this.
To what degree should an interface be legible? If people understand that something is interactive, they spend a bunch of time trying to figure out how it works. If they can’t see the interaction, they can more directly engage the work.
The audience has expectations around traditional instruments. New interfaces create problems by removing the context in which performers and audiences communicate.
Does the audience need to know a work is interactive?
Interactivity can bring a performer to a new place, even if the audience doesn’t see the interaction. He gave an example of a theatre company using this for a soundtrack.
He did a piece from 1993–2000 called Silicon Remembers Carbon. Cameras looked at the IR shadows of people, and video was projected onto sand. Sometimes pre-recorded shadows accompanied people, and people walking across the space cast their own shadows onto the video.
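A minimal sketch of one way live and pre-recorded shadows could be combined to modulate a projected clip, assuming the IR camera is aligned with the projection and that shadows are simply dark regions in the IR image; the threshold, the darkening factor, and the whole digital-compositing approach are assumptions, not a description of how the piece actually worked.

```python
import numpy as np

def composite_frame(video_frame, ir_frame, recorded_shadow_mask,
                    shadow_thresh=60, darken=0.25):
    """Darken the projected video wherever either a live shadow (dark pixels
    in the IR image) or a pre-recorded shadow mask falls.
    video_frame: HxWx3 float array in [0, 1], the clip projected onto the sand
    ir_frame: HxW uint8 IR image aligned with the projection
    recorded_shadow_mask: HxW boolean mask of a pre-recorded shadow"""
    live_shadow = ir_frame < shadow_thresh        # dark IR pixels = shadow
    shadow = live_shadow | recorded_shadow_mask   # live and recorded shadows combine
    out = video_frame.copy()
    out[shadow] *= darken                         # shadows dim the projection
    return out
```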
If you project a convincing fake shadow, people will think it’s their shadow, and will follow it if it moves subtly.
He did a piece called Taken in 2002. It shows video of all the people who have been there before, and a camera locks onto the current person’s head and shows a projection of just the head, right in the middle.
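A minimal sketch of locking onto a head and isolating it in the frame, using OpenCV’s bundled Haar face detector purely as a stand-in; the piece’s actual tracking method isn’t described here, and crop_head and its margin parameter are hypothetical.

```python
import cv2

# Generic face detector shipped with OpenCV -- a stand-in for whatever
# head tracker the piece actually used.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def crop_head(frame, margin=0.3):
    """Return the largest detected head region, expanded by a margin,
    or None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    dx, dy = int(w * margin), int(h * margin)
    H, W = frame.shape[:2]
    return frame[max(0, y - dy):min(H, y + h + dy),
                 max(0, x - dx):min(W, x + w + dx)]
```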
Designing interfaces now will profoundly affect future quality of life, he concludes.
Questions
External feedback loops can make the invisible visible. It helps us see ourselves.