Ballet Wichita: Innovations

As part of Ballet Wichita’s Innovations performance in April 2024, I teamed up with Wichita State University’s Shocker Studios to track a dancer live, reinterpreting their motions on a stage-sized display behind them. Wichita State University’s Fairmount String Trio accompanied the dance, performing Dohnányi’s Serenade.

To track the dancer, we used three Vive 3.0 trackers paired with four Vive 2.0 base stations. We then interfaced Unreal Engine 5.3 with the trackers and mapped their positions into Unreal’s Niagara particle system.
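
For readers curious how that plumbing might look, here is a minimal C++ sketch of the idea: each Vive tracker appears in Unreal as a motion source, and its world position is pushed into a Niagara user parameter every frame for the emitters to react to. The class name, motion-source name ("Special_1"), and parameter name ("User.DancerPosition") are illustrative assumptions, not the production setup used in the show.

```cpp
// DancerTrackerActor.h — a minimal sketch (names are hypothetical) of feeding a
// Vive tracker's pose into a Niagara user parameter each frame.

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "NiagaraComponent.h"
#include "DancerTrackerActor.generated.h"

UCLASS()
class ADancerTrackerActor : public AActor
{
    GENERATED_BODY()

public:
    ADancerTrackerActor()
    {
        PrimaryActorTick.bCanEverTick = true;

        // Follow one Vive tracker; SteamVR exposes standalone trackers as
        // motion sources named "Special_1", "Special_2", and so on.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->SetTrackingMotionSource(FName("Special_1"));
        RootComponent = Tracker;

        // The Niagara system that reacts to the dancer's position.
        ParticleFX = CreateDefaultSubobject<UNiagaraComponent>(TEXT("ParticleFX"));
        ParticleFX->SetupAttachment(RootComponent);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Push the tracker's world-space position into a user parameter that
        // the Niagara emitters can read (e.g. as an attractor location).
        ParticleFX->SetVariableVec3(FName("User.DancerPosition"),
                                    Tracker->GetComponentLocation());
    }

protected:
    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* Tracker;

    UPROPERTY(VisibleAnywhere)
    UNiagaraComponent* ParticleFX;
};
```

In practice one actor like this per tracker (or one actor holding several motion controller components) would let the Niagara graph blend between the dancer's hands and torso; the same pattern also works with Live Link instead of motion sources.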

Danni Varenhorst dances as part of Innovations

Brian Foster did the heavy lifting of building the Niagara system in Unreal Engine. After the project ended, he continued developing the technical side of this work, tracking dancers in Shocker Studios’ motion capture lab and expanding his use of Unreal’s Niagara particle system.

Read more about Innovations

Similar Posts

  • Digital Puppetry

    I worked with a team of colleagues, community members, and urban youth. Our intention was to help the youth learn in a playful environment, find personal self-expression, and have their voices heard by communities in Boston. To do this, we adapted commercially available technology to provide a unique medium: digital puppetry.

  • Contrapuntal Composer

    Contrapuntal Composer is Prolog code which writes music for three simultaneous voices. Depending on initial parameters, it can write a fugue, a rondo, or any other contrapuntal form. Contrapuntal Composer obeys the rules of good voice leading within each voice and between the voices.

  • SoundBlocks

    SoundBlocks is a tangible environment where youth connect blocks to describe network dataflow. The environment explores digital sound manipulation as a personal, meaningful and fun artistic endeavor, rather than as a venture into mathematical, electronic or networking relationships.

  • Still Life

    In 2011, as part of Hack.Art.Lab, I collaborated with composer Mary Ellen Childs and percussionist Michael Holland to create live animation triggered by live performance of Mary Ellen Childs’ composition “Still Life.” We divided the piece into 11 sections and created algorithmic video triggered by sound and motion to match each of the 11 sections. The video was projected…