The Way Things Go is an interactive video installation that allows the participant to control the on-screen characters with sound.
Two microphones hang from the ceiling and prompt passersby to interact with the screens using their voice. The screens show a boy and a girl, one at each end, who walk toward each other when triggered by sound. Once they meet in the middle, one of several interactions between the two plays. The video advances as long as a sound is being made and stops when there is none. In this way the participant controls whether and how the story concludes.
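The playback logic described above can be sketched roughly as follows. This is a minimal illustration in Python (the actual installation runs as a Max patch); the threshold and hold-time values here are hypothetical, not taken from the piece.

```python
def rms(samples):
    """Root-mean-square level of one block of audio samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

class SoundGate:
    """Advances the video while the microphone level stays above a threshold.

    A short hold window tolerates brief pauses in the input so the
    animation does not flicker between every syllable.
    """

    def __init__(self, threshold=0.05, hold_blocks=3):
        self.threshold = threshold      # level above which the video runs (assumed value)
        self.hold_blocks = hold_blocks  # quiet blocks tolerated before pausing (assumed value)
        self._quiet = 0

    def update(self, samples):
        """Return True (play) or False (pause) for one block of audio."""
        if rms(samples) >= self.threshold:
            self._quiet = 0
            return True
        self._quiet += 1
        return self._quiet < self.hold_blocks

gate = SoundGate()
speaking = [0.2, -0.2] * 32   # someone talking into the microphone
silence = [0.0] * 64          # no input

print(gate.update(speaking))  # → True: the characters walk
print(gate.update(silence))   # → True: still inside the hold window
```

In the Max patch this corresponds to metering the microphone input and gating the video playback rate on the measured level; the class above only mirrors that control flow.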
Why did we do it?
The idea for this piece came from the 1987 video art piece of the same title by Swiss artists Peter Fischli and David Weiss, which depicts a long chain of everyday objects assembled as a Rube Goldberg machine (find out more). The idea of a Rube Goldberg machine is to trigger a starting event and then give up control, letting everything else unfold as a carefully planned-out chain reaction. Our piece questions that original relationship between control and chain reaction: the participant has partial control over what happens, but it is never fully transparent how the story will play out.
How did we make it happen?
This piece consists of three Sony monitors, all hooked up to and controlled by a Max patch (that can be found here). The full list of materials:
- 3 Sony PVM Trinitron Color Video Monitors
- M-Audio Profire 610
- 2 Shure SM57 Microphones
- Software: Cycling '74 Max
- Colorful array of cables/connectors and adapters
Anne-Marie Lavigne (ideation, design & implementation in Max/MSP/Jitter), Alexandra Coym (ideation, design, footage shoot and animation & implementation in Max/MSP/Jitter) and Wen Lei Ng (ideation, design & implementation in Max/MSP/Jitter)