
  On one level, the Introspection Machine is a software platform: an environment for creating dynamic compositions whose input is not a mouse or keyboard, but a live video image. The members of the Aesthetics + Computation Group have so far developed twelve individual applications for this platform, each using the video information in a different way. A few of these software experiments are described below.


Flurry / Golan Levin
Particles move towards areas of high or low change in the video image. When directed to look at itself, Flurry's moving particles themselves become the agents of change.
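The original source for Flurry isn't reproduced here, but the behavior described above can be sketched with two assumed pieces: a frame-difference map (per-pixel change between consecutive grayscale frames) and a particle step that climbs toward the neighboring pixel with the most change. Function names and the greedy-neighbor rule are illustrative, not Levin's actual implementation.

```python
def frame_difference(prev, curr):
    """Per-pixel absolute brightness change between two grayscale frames
    (frames as nested lists of 0-255 values)."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def step_particle(x, y, diff):
    """Move a particle one cell toward the neighboring pixel with the
    greatest change; stay put if no neighbor changed more."""
    h, w = len(diff), len(diff[0])
    best = (x, y)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h and diff[ny][nx] > diff[best[1]][best[0]]:
                best = (nx, ny)
    return best
```

When the camera is pointed at the screen, the particles' own motion shows up in the difference map, producing the feedback loop the description mentions.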


Triway / Jared Schiffman
In Triway, each circular node decides to travel in one of three directions based on the relative amounts of red, green, and blue in the video image beneath it.
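A minimal sketch of the decision rule as described, assuming the dominant color channel of the pixel under a node selects its direction (the three direction names are placeholders, not Schiffman's):

```python
def triway_direction(pixel):
    """Pick one of three travel directions from the dominant channel
    of an (r, g, b) pixel. Ties fall to the earlier channel."""
    r, g, b = pixel
    channels = {"left": r, "straight": g, "right": b}
    return max(channels, key=channels.get)
```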


Booba / Elise Co
Each Booba module expresses a range of motion, from inert to manic, depending on the strength of the local video signal.


Palette / Tom White
A vector field is drawn from the color information in the video signal. The underlying image constantly redraws itself, slowly replacing old information with updated motion data.
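One plausible way to derive a vector field from color, used here purely for illustration: map two color channels of each pixel to the x and y components of a unit-range vector. White's actual mapping is not documented in this text.

```python
def color_to_vector(pixel):
    """Assumed mapping: red drives the x component, green the y component,
    each recentered to the range [-1, 1)."""
    r, g, b = pixel
    return ((r - 128) / 128.0, (g - 128) / 128.0)

def vector_field(frame):
    """Build a vector field with one vector per pixel of the video frame."""
    return [[color_to_vector(px) for px in row] for row in frame]
```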


Console One / Casey Reas
The light and dark values from the video image control the opening and closing of graphic apertures. The colors are interpolated from a predetermined palette of hues.
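The two mappings described, brightness to aperture size and brightness to a color interpolated from a fixed palette, can be sketched as follows. The linear mapping, the palette layout, and all names are assumptions for illustration.

```python
def aperture_radius(brightness, max_radius=10.0):
    """Open the aperture in proportion to brightness (0-255)."""
    return max_radius * brightness / 255.0

def palette_color(brightness, palette):
    """Linearly interpolate between adjacent hues in a fixed palette,
    indexed by brightness (0-255)."""
    t = brightness / 255.0 * (len(palette) - 1)
    i = int(t)
    if i >= len(palette) - 1:
        return palette[-1]
    f = t - i
    a, b = palette[i], palette[i + 1]
    return tuple(round(av + (bv - av) * f) for av, bv in zip(a, b))
```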


Disgrand / Ben Fry
The video image is deconstructed, sorted, and then redisplayed according to light values in the original signal.
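Read literally, the deconstruct-sort-redisplay pipeline amounts to flattening the frame, ordering its pixels by brightness, and laying them back out in the original dimensions. This sketch assumes RGB-tuple pixels and mean-of-channels brightness; Fry's actual sorting key and layout may differ.

```python
def disgrand(frame):
    """Flatten the image, sort its pixels by brightness, and re-lay
    them out row by row in the original width and height."""
    h, w = len(frame), len(frame[0])
    flat = sorted((px for row in frame for px in row),
                  key=lambda p: sum(p) / 3)  # brightness = mean of RGB
    return [flat[r * w:(r + 1) * w] for r in range(h)]
```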



The Introspection Machine is copyright 1999-2000, Massachusetts Institute of Technology. IM was developed by the Aesthetics and Computation Group at the MIT Media Laboratory.