Clothing Conundrum

When projected onto light-coloured clothing or skin, images show up reasonably well. Black material, however, absorbs most of the projected light. This means the dancers will have to wear light-coloured clothing if they want the images to be visible when projected onto them.

Another theoretical solution would be projectors with a higher lumen output. This would allow the dancers to wear darker clothes (as the dance choreographer originally intended). However, with our current resources this is not an option: three projectors of that calibre would be very costly.

 

Under the right circumstances, it is possible to project a clear image onto a black background. You just need the right material:

Dubbed Black Diamond, this material reflects 85% of light, as opposed to just 10–15% from a traditional white screen. Of course, the company that produces it isn't open about the technology used in the material to achieve this. But from my (limited and speculative) understanding, the material is essentially subtly shiny, so we see the image more clearly because more of the projector's light is reflected back at the viewer.
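Just to get a feel for what those reflectance figures mean in practice, here's a rough back-of-the-envelope comparison. The projector brightness is an illustrative assumption, not one of our actual projectors:

```python
# Rough comparison of reflected light using the reflectance figures above.
# The projector brightness is an assumed, illustrative number.
projector_lumens = 3000  # hypothetical projector output

black_diamond_reflectance = 0.85  # claimed figure for Black Diamond
white_screen_reflectance = 0.15   # upper end of the 10-15% figure

bd_reflected = projector_lumens * black_diamond_reflectance
white_reflected = projector_lumens * white_screen_reflectance

print(f"Black Diamond: {bd_reflected:.0f} lm reflected")   # 2550 lm
print(f"White screen:  {white_reflected:.0f} lm reflected") # 450 lm
print(f"Ratio: {bd_reflected / white_reflected:.1f}x")      # ~5.7x
```

Even if those manufacturer numbers are optimistic, the gap suggests why such a screen can hold an image where ordinary black fabric can't.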

Camera experiments

Today Shane and I booked out a 360-degree camera from media loans… or at least we thought we did. The camera we wanted could record 360-degree video; the camera we got could rotate around on a 360-degree axis (as opposed to capturing all the angles at once).

The Osmo 360 camera is truly a marvel in comparison to most, though. It is stabilised, shoots up to 4K, and offers panoramic photography, but not panoramic video. It can live stream directly to Facebook or YouTube, and can be controlled with a smartphone.

For our installation, this camera may provide good footage, but it doesn't capture the panoramic video we could use for VR. This makes it useful for documentation, but not ideal for the output we desire. However, because it can rotate and pan through 360 degrees, it would allow whoever controls the camera to explore the performance space from the camera's point of view.

 


Extending the animation

This simulation works by using a composition that is the same size as the three screens combined. Three cameras referenced in this composition each cover one third of its width, so the animation can extend across all the screens. Well, it's a little bit like that, but rather more complicated in practice.

When composing the actual animation, this model will make it easy to produce and render each screen's output. The simulation I've made provides sources and outputs which make it straightforward to create content for this set-up.
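The split described above can be sketched as simple arithmetic: one master composition three screens wide, with each camera cropping its own third. The per-screen resolution here is an assumption for illustration, not the project's actual output spec:

```python
# Sketch of how a triple-wide composition maps onto three screen outputs.
# 1920x1080 per screen is an assumed resolution, not a confirmed spec.
SCREEN_W, SCREEN_H = 1920, 1080
NUM_SCREENS = 3

comp_width = SCREEN_W * NUM_SCREENS  # master comp spans all three screens

# Each "camera" crops one third of the master composition.
regions = [(i * SCREEN_W, 0, (i + 1) * SCREEN_W, SCREEN_H)
           for i in range(NUM_SCREENS)]

for i, (x0, y0, x1, y1) in enumerate(regions, start=1):
    print(f"Screen {i}: x {x0}-{x1}, y {y0}-{y1}")
# Screen 1: x 0-1920, Screen 2: x 1920-3840, Screen 3: x 3840-5760
```

The same idea carries over to After Effects: render each third as its own output and send one to each projector.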

triple screen (0;00;10;04)

Stage Simulation

 

In order to get a clearer visualisation when designing the animation, I decided to craft a three-dimensional representation of the stage for the projection. This is built within After Effects: a null object controls the camera's 3D path, and the space is created from layers positioned in 3D. Each screen has its own composition, so that footage can be dropped in easily. This will provide a reasonable simulation to help design the animations.