Motion Tracking the Dancers

Before I’d had a chance to book out expensive equipment, I crafted a makeshift rig to experiment with motion tracking the dancers. It consisted of a front-view and a side-view camera recording the stage, and red ribbon tied to a dancer’s leg and arm, which would be tracked in After Effects. In the footage, I isolated the red ribbon to make it easier to motion track. Because there are two video feeds, there are two pairs of X and Y values, which can be combined to simulate X, Y, and Z values in After Effects. Though slightly inaccurate, this captures the three-dimensional movement of the ribbon. These values can, of course, also be applied to other properties of the animation.
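The combination can be sketched in plain JavaScript (the function and axis assignments are my own illustration, not the actual After Effects setup): the front camera supplies X and Y, the side camera’s horizontal axis is read as depth, and the shared vertical axis is averaged to smooth disagreement between the two feeds.

```javascript
// Sketch: merging a front-view and a side-view 2-D track into one
// pseudo-3-D position. Axis assignments are an assumption: the front
// camera gives (x, y), the side camera gives (z, y).
function combineTracks(front, side) {
  return {
    x: front.x,
    y: (front.y + side.y) / 2, // both cameras see height; average them
    z: side.x,                 // the side camera's horizontal axis reads as depth
  };
}

// e.g. front tracker at (120, 300), side tracker at (80, 310):
const p = combineTracks({ x: 120, y: 300 }, { x: 80, y: 310 });
// p → { x: 120, y: 305, z: 80 }
```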

 

A better way of capturing and importing this movement is with a Kinect. The video below explains how a couple created a tool for generating 2.5D skeletal meshes in After Effects from Kinect input. If we captured the dancer this way, we would have a simple means of generating animation linked to dance movements.

Responding to the Dancer’s Response


Today we aimed to build animations that were more responsive to the dancer’s movement. In After Effects, using video of Zoe performing a sequence of repetitive movements, I used motion tracking, along with a path, to create a sprite that moves in accordance with the dance. The path at first follows her hand movements; then, as she throws the sprite, the particles fly off and follow their own path.
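The hand-off from tracked motion to free motion can be sketched like this (the function, release frame, and gravity value are illustrative assumptions, not the actual project setup): the sprite copies the tracked path until a chosen release frame, then continues with the hand’s last velocity plus a little gravity.

```javascript
// Sketch: a sprite glued to the tracked hand until a "throw" frame,
// after which it keeps the hand's release velocity and falls under an
// assumed gravity value. trackedPath is an array of {x, y}, one per frame.
const GRAVITY = 0.8; // px per frame^2, illustrative

function spritePosition(frame, trackedPath, throwFrame) {
  if (frame <= throwFrame) {
    return { ...trackedPath[frame] }; // follow the hand exactly
  }
  const release = trackedPath[throwFrame];
  const prev = trackedPath[throwFrame - 1];
  const vx = release.x - prev.x; // last tracked velocity, px per frame
  const vy = release.y - prev.y;
  const t = frame - throwFrame;  // frames since release
  return {
    x: release.x + vx * t,
    y: release.y + vy * t + 0.5 * GRAVITY * t * t,
  };
}
```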

Because After Effects expressions are written in JavaScript, it is possible to pick-whip these values and simply apply them to the values of other effects and presets, such as size, opacity, and much more. Recently, I’ve been experimenting with creating animation from the audio amplitude values in music; the videos below are examples of this. Expanding on the capabilities of motion tracking to create animation: After Effects provides the tracked X and Y positions as numerical values, meaning they could be used to create a more abstract animation that still moves in accordance with the dancer. Projected onto a stage, this would be a choreographed performance of digital deconstruction and reconstruction of human form alongside its true organic counterpart, each responding to the other.

This was an example I gave to the dancers. I used it to explain how I’d turned audio amplitude into key-frames with numerical values. Here, those values were applied to the scale and opacity of a ring.
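The idea behind this can be sketched as follows. In After Effects itself, this would be an expression pick-whipped to the “Audio Amplitude” layer that Convert Audio to Keyframes produces; the plain-JavaScript function below just reproduces the range re-mapping that After Effects’ linear() performs. All the ranges here are illustrative assumptions.

```javascript
// In After Effects, a Scale or Opacity expression would look roughly like:
//   amp = thisComp.layer("Audio Amplitude").effect("Both Channels")("Slider");
//   linear(amp, 0, 25, 10, 100);
// Below is a plain-JavaScript stand-in for linear(): it re-maps an input
// range onto an output range and clamps at the ends.
function linearMap(t, tMin, tMax, vMin, vMax) {
  if (t <= tMin) return vMin;
  if (t >= tMax) return vMax;
  return vMin + ((t - tMin) / (tMax - tMin)) * (vMax - vMin);
}

// An amplitude halfway through 0–25 gives an opacity halfway through 10–100:
const opacity = linearMap(12.5, 0, 25, 10, 100);
// opacity → 55
```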

 

As a more advanced example, I provided a rendering of particle layers. Each layer has its own unique values, such as velocity, longevity, opacity, and so on. The result is a richer animation with a variety of motions that feel organic and in tune with the music.
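One way to sketch that per-layer variety (the value ranges and the seeding scheme are my own illustration, not the actual project file) is to derive each layer’s parameters from a seeded random generator, so every render stays repeatable:

```javascript
// Sketch: each particle layer gets its own stable set of values.
// mulberry32 is a small seeded PRNG, so a layer's "random" character
// is identical on every render. Ranges are illustrative assumptions.
function mulberry32(seed) {
  return function () {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function particleLayer(index) {
  const rand = mulberry32(index + 1);
  return {
    velocity: 100 + rand() * 300, // px/s, in [100, 400)
    longevity: 1 + rand() * 3,    // seconds, in [1, 4)
    opacity: 40 + rand() * 60,    // percent, in [40, 100)
  };
}

const layers = Array.from({ length: 5 }, (_, i) => particleLayer(i));
```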

 

 

The Duet of Dance and Dynamic Projections

Today was the first day Shane and I met Zoe, the director, and the dance performers. In this meeting, we established that the performance we were going to create would be a 20-minute duet of dance and dynamic projections. The movements of the dancers would influence and be influenced by the projections. Zoe also wanted the performance to be based on the idea of escapism.

 

I showed them the prototypes and examples of animation that I had made prior to this meeting. These concept animations were made to give Zoe and the dancers clear examples of what I could create. They proceeded to produce sequences of movement inspired by the animations, while Shane and I projected them onto a screen. The director gave feedback, which allowed us to begin experimenting with the animations we would later test.

The stage we have decided to work with is triangular and in-the-round, with three screens surrounding it. Shane started to formulate and experiment with small-scale prototypes of this. The different models provided ideas about the space required for front or back projection, and the type of projectors needed to achieve this at large scale. They also gave a basis for modelling the projection mapping: how the animations would work in accordance with the structure of the stage, in practical and artistic terms.

Digitally adapting the performance is something Shane and I discussed. As the stage is in-the-round, the audience would ordinarily watch from the outside in; however, a 360-degree camera placed above the centre of the dancers, but lower than the screens, would allow the audience to watch from the inside out. This would ideally be viewed through a VR headset, placing the audience in the centre of the action. I think this could broaden Zoe’s theme of escapism. Shane suggested live-streaming the performance on Facebook, which has recently introduced the ability to view and stream 360-degree video. Wider audiences could watch the performance live and look around it; this could be especially accessible, yet effective, through a smartphone or tablet.