The
goal of the Cinema Fabriqué system is to create a software environment,
complementary wearable devices, and a usage methodology for producing
engaging cinematic experiences in realtime for live audiences through
natural language control. Current multimedia performance packages
suffer from input bandwidth bottlenecks that restrict the scope of
user control and audience engagement. My proposed alternative aims
to couple a high degree of user control through gestural and speech
input with intelligent software to create rich audiovisual output.
Traditional GUI interfaces coupled with standard computer input
peripherals are a poor match for live performance. Typically the GUI
is hidden from the audience, and parameters of the output are adjusted
one or maybe two at a time with mouse and keyboard. The performer's
physical actions also do not map to the sound and video of the system.
By adding natural forms of input such as gesture and speech recognition,
the performer can simultaneously adjust many degrees of freedom and
search databases verbally, in a manner that will be engaging for a live
audience to witness.
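
To make the contrast with mouse-and-keyboard control concrete, here is a
rough sketch of the kind of mapping I have in mind. Everything in it is a
hypothetical stand-in for illustration (the parameter names, the
map_gesture function, and the handle_speech command set), not the actual
Cinema Fabriqué implementation.

```python
# Illustrative sketch only: hypothetical parameter names and mappings,
# not the actual Cinema Fabrique code.

def map_gesture(x, y, spread):
    """One hand posture drives several output parameters at once,
    where a mouse would adjust them one slider at a time."""
    return {
        "playback_speed": 0.25 + 1.75 * x,  # left/right position -> speed
        "crossfade": y,                     # up/down position -> A/B video mix
        "zoom": 1.0 + 2.0 * spread,         # distance between hands -> zoom
    }

def handle_speech(utterance, clip_database):
    """A spoken phrase becomes a clip-database query instead of typed text."""
    words = utterance.lower().split()
    if words[:2] == ["find", "clips"]:
        keywords = set(words[2:])
        return [clip for clip, tags in clip_database.items()
                if keywords & set(tags)]
    return []

if __name__ == "__main__":
    clips = {"surf.mov": ["water", "beach"], "crowd.mov": ["people", "street"]}
    print(map_gesture(x=0.5, y=0.8, spread=0.2))  # three parameters from one gesture
    print(handle_speech("find clips about the beach", clips))
```

The point is not the specific numbers, just that a single gesture sweeps
several parameters at once while speech takes over the text-entry jobs
that would otherwise stop the show.
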
Inspiration
for this project came from the gestural interface used in the movie
Minority Report. If what I'm saying above makes no sense, just
see the movie and watch Tom Cruise wave his hands and edit video in
real time. Props to John Underkoffler for his work on that flick.
I have not finished work on the gesture and speech recognition agents
yet, but I have made a wireless camera/joystick controller to allow
remote control of the system. Below are a bunch of sample clips and
pics made in realtime.