Still Life

In 2011, as part of Hack.Art.Lab, I collaborated with composer Mary Ellen Childs and percussionist Michael Holland to create live animation triggered by a live performance of Childs’ composition “Still Life.”

Still Life performance and discussion
Video courtesy of Wichita State University Media Resources Center

We divided the piece into 11 sections and created algorithmic video, triggered by sound and motion, to match each section. The video was projected onto a semi-transparent scrim in front of the players.

The algorithms generating the video were statistical rather than deterministic. In addition, no two live performances are ever identical. Together, these two aspects guarantee that every performance of the piece generates a unique, unrepeatable stream of video.
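The idea of statistical (rather than deterministic) generation can be illustrated with a minimal sketch in Python. This is not the actual performance software; the parameter names (`hue`, `count`, `jitter`) and the mapping from an `intensity` input are hypothetical, chosen only to show why sampled parameters make every run different.

```python
import random

def frame_params(intensity: float) -> dict:
    """Sample visual parameters from distributions (hypothetical names).

    Because values are drawn stochastically rather than computed
    deterministically from the input, two calls with identical input
    almost surely produce different output.
    """
    return {
        "hue": random.uniform(0.0, 1.0),                      # random color
        "count": random.randint(1, int(10 * intensity) + 1),  # element count
        "jitter": random.gauss(0.0, intensity),               # positional noise
    }

# Same input, (almost surely) different parameters each call:
a = frame_params(0.5)
b = frame_params(0.5)
```

Feeding the same live input into such a generator on two nights yields two different visual results, which is the property described above.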

We attached infrared LEDs to the performers’ percussion sticks so they could be tracked in real time by three modified video cameras. At a high level, I controlled the animation during the performance by following the section changes and pressing the appropriate button in my custom software.
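The button-driven section control can be sketched as a tiny state machine. Again, this is an assumption-laden illustration, not the real software: the section names and the one-button "advance" interface are invented for the example, but it captures the idea of stepping through the 11 sections and switching which video algorithm is active.

```python
# The piece's 11 sections (names are placeholders).
SECTIONS = [f"section_{i}" for i in range(1, 12)]

class SectionController:
    """Advance through the sections of the piece on each button press."""

    def __init__(self, sections):
        self.sections = sections
        self.index = -1  # before the piece starts

    def on_button_press(self) -> str:
        """Move to the next section (clamped at the last) and return it."""
        if self.index < len(self.sections) - 1:
            self.index += 1
        return self.sections[self.index]

ctrl = SectionController(SECTIONS)
ctrl.on_button_press()  # -> "section_1"
```

In performance, each returned section would select the matching video algorithm, which then runs on the live sound and IR-tracking data until the next press.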

Similar Posts

  • DoubleTalk

    DoubleTalk, a two-player audio-manipulation game, was my first serious endeavor with the Game Boy. The game used the Pocketvoice, a Game Boy cartridge with a built-in amplified speaker and microphone. In DoubleTalk, players record themselves, reverse their recordings, then try to guess what each other is saying.

  • Microphone with proximity detection

    Around 2004 I developed a few prototype microphones enhanced to also offer proximity detection. The microphone could adjust its amplitude and bass response based on the proximity of the person using it. This lessens the inconsistent results users get when holding a microphone too close or too far. Moreover, with proximity or its derivative mapped to a combination…

  • Digital Puppetry

    I worked with a team of colleagues, community members, and urban youth. Our intention was to help the youth learn in a playful environment, find personal self-expression, and have their voices heard by communities in Boston. To do this, we adapted commercially available technology to provide a unique medium: digital puppetry.

  • LegalLanguage

    I wrote LegalLanguage, a scripting language for lawyers at Legal Services Corporation in West Virginia. The staff used LegalLanguage to write simple scripts that could then ask clients questions, give guidance, and print out the appropriate forms. This freed up resources to focus on the large number of cases involving domestic violence.

  • Ballet Wichita: Innovations

    As part of Ballet Wichita’s Innovations performance in April 2024, I teamed up with Wichita State University’s Shocker Studios to track a dancer live, reinterpreting their motions on a stage-sized display behind them. Wichita State University’s Fairmount String Trio accompanied the dance, performing Dohnányi’s Serenade. To track the dancer we used 3 Vive 3.0 trackers…

  • Cybergarden

    As part of Wichita’s Open Studios project, TechArtICT opened Cybergarden at Towne West Mall. Cybergarden was “a magical space…a mysterious metaverse.” It featured work from various artists and creatives in the greater Wichita area. TechArtICT was founded by me and many of my works were featured in the installation, including Ghost in the Machine and…