BodyVox – Deep Wading

BodyVox, Portland’s premier modern dance company, opened its 20th anniversary season with Lexicon, a show exploring the intersection of dance and technology. I worked with Jamey Hampton, co-founder of BodyVox, to create a dance number called “Deep Wading”. We used a 3D camera, an Xbox Kinect, to track the dancers, and a high-end graphics computer to generate effects behind them, with the effects reacting to the dancers’ motion on stage.

Like many other software projects, we had a short deadline (only a couple of months!), and the deadline was fixed. The show must go on! In our first meeting we had a difficult time zeroing in on what effects to do. Given the short schedule, I suggested that I go hide in my office and make up a bunch of effects; then we could get together again and use what I put together as something concrete to talk about. I did this and created a core of four graphic backgrounds. And, unknown to me at the time, I also made a psychedelic “bonus” debug effect (more about this below). We then talked through each one: what did Jamey like, not like, and what would be better?

The Background Effects:

Water Drops.

A torrential downpour hits the dancers; lots of drops bounce off their outlines. Countless water drops come from an offscreen firehose, hit the dancers, and bounce off.

The first effect I made was a water drop simulation. Many parameters controlled the appearance of the effect: the number of particles, where they started from, the shape of the source (a hose, the sky), and how fast they fell could all be set, as could how much they bounced off a dancer. With this one effect many different scenes could be created, including rain (light or heavy), a firehose, or a light snow.
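
For the curious, here is a minimal CPU-side sketch of how such a drop update could work. The real version ran in shaders (more on that below); the silhouette callback, the names, and the coordinate conventions here are all illustrative assumptions, not the production code.

```cpp
#include <cstdlib>
#include <vector>

struct DropParams {
    int   count;              // number of particles
    float sourceX0, sourceX1; // horizontal extent of the source (narrow = hose, wide = sky)
    float fallSpeed;          // how strongly drops are pulled downward
    float bounciness;         // 0 = drops stick, 1 = fully elastic bounce
};

struct Drop { float x, y, vx, vy; };

// silhouette(x, y) is assumed to report whether a dancer occupies that
// point, derived from the Kinect distance image.
void stepDrops(std::vector<Drop>& drops, const DropParams& p,
               bool (*silhouette)(float, float), float dt)
{
    for (Drop& d : drops) {
        d.vy -= p.fallSpeed * dt;              // gravity-like pull
        float nx = d.x + d.vx * dt;
        float ny = d.y + d.vy * dt;
        if (silhouette(nx, ny)) {
            d.vx = -d.vx * p.bounciness;       // bounce off the dancer
            d.vy = -d.vy * p.bounciness;
        } else {
            d.x = nx;
            d.y = ny;
        }
        if (d.y < 0.0f) {                      // fell offscreen: respawn at the source
            d.x  = p.sourceX0 + (p.sourceX1 - p.sourceX0)
                                * (std::rand() / float(RAND_MAX));
            d.y  = 1.0f;
            d.vx = d.vy = 0.0f;
        }
    }
}
```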

Spirit Particles.

The background effect shows countless particles: green shows where the dancers are, red where they have been.

The second effect generated a myriad of particles that would continually move around, always attempting to stay inside the shapes of the dancers. Lots of settings were available to change the look of this effect, including colors and particle counts. Adjusting the speeds of the particles when inside and outside of the dancers’ shapes had a dramatic impact on the look. The settings we chose for the scene shown here gave the impression of green animating spirits and red ghostly trails showing where the dancers had been.
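
One simple rule that produces this “stay inside” behavior is to propose a random step and reject it if it would carry an inside particle out of the silhouette. Here is a minimal sketch of that idea, using the same assumed silhouette test as the drops; the production particle logic itself is not reproduced here.

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

struct Spirit { float x, y; };

void stepSpirits(std::vector<Spirit>& spirits,
                 bool (*silhouette)(float, float),
                 float insideSpeed, float outsideSpeed, float dt)
{
    for (Spirit& s : spirits) {
        bool  inside = silhouette(s.x, s.y);
        float speed  = inside ? insideSpeed : outsideSpeed;
        float angle  = 6.2831853f * (std::rand() / float(RAND_MAX));
        float nx = s.x + speed * dt * std::cos(angle);
        float ny = s.y + speed * dt * std::sin(angle);
        if (inside && !silhouette(nx, ny))
            continue;                 // refuse to leave the dancer's shape
        s.x = nx;                     // otherwise wander freely
        s.y = ny;
    }
}
```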

Spider Web.

A dense spider web is shown behind the dancers. The web moves as the dancers move.

The third effect was a wiggling spider web. The parameters for this effect were used to make two different scenes. The first showed a billowing curtain disturbed by a breeze from a passing shape (shown). The second was a sticky web that jiggled as the dancers moved about. The settings adjusted web density, the force a dancer had to exert to disturb the web, how rigidly the web was stitched together, and how forcefully the web tried to restore its shape.
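
Under the hood, this kind of web is naturally modeled as a mass-spring lattice. Here is a minimal sketch along those lines; the structs, the damping term, and the exact force model are illustrative assumptions, not the production code.

```cpp
#include <cmath>
#include <vector>

struct Node   { float x, y, vx, vy, restX, restY; };
struct Spring { int a, b; float restLen; };

void stepWeb(std::vector<Node>& nodes, const std::vector<Spring>& springs,
             float stiffness,   // how rigidly the web is stitched together
             float restore,     // how forcefully the web restores its shape
             float damping, float dt)
{
    for (const Spring& s : springs) {
        Node& a = nodes[s.a];
        Node& b = nodes[s.b];
        float dx  = b.x - a.x, dy = b.y - a.y;
        float len = std::sqrt(dx * dx + dy * dy) + 1e-6f;
        float f   = stiffness * (len - s.restLen) / len;   // Hooke's law
        a.vx += f * dx * dt;  a.vy += f * dy * dt;
        b.vx -= f * dx * dt;  b.vy -= f * dy * dt;
    }
    for (Node& n : nodes) {
        n.vx += restore * (n.restX - n.x) * dt;   // pull back toward rest shape
        n.vy += restore * (n.restY - n.y) * dt;
        n.vx *= damping;  n.vy *= damping;        // bleed off energy
        n.x  += n.vx * dt;
        n.y  += n.vy * dt;
        // A dancer's disturbance would be added here as an external
        // impulse, scaled by the "force to disturb the web" parameter.
    }
}
```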

Image Breakup.

An image is broken up into many small tiles; the tiles get pushed around as the dancers move.

The fourth effect showed an image that could be broken up by the dancers, with the pieces of the image pushed around by motion. The controls for this effect adjusted the size of each block of the image, how forcefully motion pushed the blocks around, and how quickly the blocks would return home. The scene we made used relatively large blocks that were strongly pushed around by the dancers. Other possibilities included tiny pixel-sized blocks that ooze around slowly, giving more of a quicksand effect.
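
Here is a minimal sketch of how such tiles could be pushed and pulled home, assuming a motion() callback that reports local dancer motion derived from the Kinect data; all names are illustrative.

```cpp
#include <vector>

struct Tile { float x, y, vx, vy, homeX, homeY; };

// motion(x, y, &mx, &my) is assumed to return a local motion vector
// derived from frame-to-frame changes in the Kinect distance image.
void stepTiles(std::vector<Tile>& tiles,
               void (*motion)(float, float, float*, float*),
               float pushStrength,  // how forcefully motion pushes blocks
               float homing,        // how quickly blocks return home
               float damping, float dt)
{
    for (Tile& t : tiles) {
        float mx = 0.0f, my = 0.0f;
        motion(t.x, t.y, &mx, &my);
        t.vx += pushStrength * mx * dt;          // shoved by dancer motion
        t.vy += pushStrength * my * dt;
        t.vx += homing * (t.homeX - t.x) * dt;   // spring back to home slot
        t.vy += homing * (t.homeY - t.y) * dt;
        t.vx *= damping;  t.vy *= damping;
        t.x  += t.vx * dt;
        t.y  += t.vy * dt;
    }
}
```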

The Psychedelic “Bonus” Debug Effect.

The fifth and final effect was never intended to be an effect. Early on, while developing the software that read and processed Kinect data, I wrote some code to help me verify things were working correctly. I made a special mode that colored the entire scene, where the colors told me the precise distance from each object to the Kinect camera. The colors would make big changes every meter, moderate changes every 10 centimeters, and minor changes every centimeter. At one point in the BodyVox theater, while trying out the effects with different sets of control settings, something went wrong. I turned on this debug mode to help isolate the problem, and Jamey said “Oh! That’s cool! We have to have that!”
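
The multi-scale banding is easy to sketch: band the depth at three different rates and feed each band into a different color channel. The channel mapping below is an assumption; the real debug palette is not documented here.

```cpp
#include <cmath>

struct Color { float r, g, b; };  // each channel in [0, 1)

// Band the depth at three scales so the color shifts strongly every
// meter, moderately every 10 cm, and subtly every centimeter.
Color debugColor(float depthMeters)
{
    float meters = std::fmod(depthMeters,          1.0f);  // 1 m bands
    float decis  = std::fmod(depthMeters * 10.0f,  1.0f);  // 10 cm bands
    float centis = std::fmod(depthMeters * 100.0f, 1.0f);  // 1 cm bands
    return Color{ meters, decis, centis };
}
```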

The Timeline.

The last part of the project was a timeline editor. The performance was, naturally, done to music, and to have the greatest impact the effects had to be placed on a timeline with precise, down-to-the-second control. The timeline set when each effect was shown, how fast it faded in and out, and how its parameters changed over time. Using a timeline this way made the background effects perfectly repeatable and synchronized precisely with key events in the music.
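
At its core, a parameter track on such a timeline can be sampled with simple keyframe interpolation. Here is a minimal sketch with illustrative names; the actual editor’s data model is not shown here.

```cpp
#include <vector>

struct Keyframe { double timeSec; float value; };

// Linearly interpolate a parameter (e.g. effect opacity, for fades)
// at a given playback time from a time-sorted list of keyframes.
float sampleTrack(const std::vector<Keyframe>& track, double t)
{
    if (track.empty()) return 0.0f;
    if (t <= track.front().timeSec) return track.front().value;
    if (t >= track.back().timeSec)  return track.back().value;
    for (size_t i = 1; i < track.size(); ++i) {
        if (t < track[i].timeSec) {
            const Keyframe& a = track[i - 1];
            const Keyframe& b = track[i];
            float u = float((t - a.timeSec) / (b.timeSec - a.timeSec));
            return a.value + u * (b.value - a.value);
        }
    }
    return track.back().value;
}
```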

Technically, what is going on?

A large portion of the code to render these effects is implemented in OpenGL shaders. Shaders are custom programs that are loaded onto the PC’s graphics card. For each effect I made, three different shaders were used. The first did physics calculations, simulating the effect and how the dancers interacted with it. The second computed where each part of the effect was located on screen. And the third filled in the colors of every dot that comprised the effect.
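
One plausible arrangement of those three stages in modern OpenGL is a compute shader for the physics pass feeding a vertex/fragment pair for placement and coloring. This is a guess at the structure, not the production code; the program handles, the VAO, and the work-group size are illustrative.

```cpp
#include <GL/glew.h>

void renderEffectFrame(GLuint physicsProgram,  // compute shader: step the simulation
                       GLuint drawProgram,     // vertex + fragment: place and color
                       GLuint particleVao, int particleCount)
{
    // Pass 1: physics on the GPU, e.g. a compute shader stepping particle
    // state stored in a shader storage buffer.
    glUseProgram(physicsProgram);
    glDispatchCompute((particleCount + 255) / 256, 1, 1);
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    // Passes 2 and 3: the vertex shader places each particle on screen
    // and the fragment shader fills in its color.
    glUseProgram(drawProgram);
    glBindVertexArray(particleVao);
    glDrawArrays(GL_POINTS, 0, particleCount);
}
```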

For this project, I used an NVIDIA GTX 1080 Ti graphics card, a card used primarily by high-end gamers. It contains thousands of processors, allowing my application to simulate the effects, draw them, and do lots of filtering on the 3D camera data coming from the Kinect. This card was fast enough to do physics calculations on millions of particles in real time.

As an effect ran, the interaction with the dancers had to be computed. To do this, the application first read data from the Kinect, where new data arrives 30 times per second. Each frame of data is basically a “distance image”: each pixel in the frame represents the distance from the camera to the objects in the scene. The application sends each distance-image frame to the graphics card, which then filters the image for noise. The Kinect generates a lot of noisy data, probably par for the course for a device costing only about $80 retail. Additional filtering removed physical objects in view of the camera from the simulation, including the floor, the girders and equipment over the stage, and the curtains.
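
Here is a minimal sketch of that per-pixel cleanup, assuming static objects are removed by comparing each frame against a distance image captured of the empty stage; the thresholds and the background-capture approach are assumptions, not the production pipeline.

```cpp
#include <cstdint>
#include <vector>

// depth and background are millimeter distance images of equal size;
// 0 marks an invalid sample.
void cleanDepth(std::vector<uint16_t>& depth,
                const std::vector<uint16_t>& background,
                uint16_t noiseFloorMm,       // below this, treat as sensor noise
                uint16_t backgroundSlackMm)  // tolerance around the static scene
{
    for (size_t i = 0; i < depth.size(); ++i) {
        bool invalid  = depth[i] < noiseFloorMm;
        bool isStatic = background[i] != 0 &&
                        depth[i] + backgroundSlackMm >= background[i];
        // isStatic catches pixels at or behind the empty-stage surface:
        // the floor, girders and equipment over the stage, and curtains.
        if (invalid || isStatic) depth[i] = 0;   // drop from the simulation
    }
}
```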

Finally, once the Kinect data was cleaned up and filtered and an effect was activated by the timeline, the processors on the graphics card got busy: doing physics, bouncing particles off of dancers, doing 3D calculations to place the pieces of the effect appropriately on the screen, and drawing millions of pixels for each frame.

Summary.

This project was a great proof of concept. It showed that very striking real-time background effects could be created quickly and used to greatly enhance a modern dance performance. Of course, we only scratched the surface of what could be done. One could imagine expanding the coverage of the effects, like drawing puddles on the floor that splash when walked through. Or adding simulated creatures to the scene, like butterflies flitting around the dancers, or sinister ghosts. The backgrounds could even be 3D room interiors, or the faces of buildings scrolling by as the performers appeared to walk down the street. Another possibility would be to draw effects directly on the costumes of the dancers.

The only limitation is imagination. After that it’s a simple matter of software.
