Optical Flow and LIDAR
(Choreographed Moving Art-Form)
Initially I developed this idea of creating a world of visceral moving paint that you could feel and touch. While I knew what it would look like, I had no idea how to realise it. I approached a range of visual effects houses and supervisors. The most interesting, but riskiest, approach came from visual effects supervisors Nick Brooks and Joel Hynek. I believe Nick knew of a programmer, Pierre Jasmin, who had done some initial work in this area, and also of some original Kodak beta trial software.
My challenge was to hone an aesthetic that would work and harness this technique. Achieving this look was further facilitated when visual effects art director Josh Rosen joined us.
Of all the technical and scientific tools developed during the production of ‘What Dreams May Come,’ the use and adaptation of optical flow analysis has become one of the most influential in the film industry. Originally used for tracking missiles, it has since been adapted as a film industry standard for tracking pixels from frame to frame. This is now a fundamental technique for advanced visual effects.
Optical flow functions by creating a motion vector map for each individual pixel in a frame. For every pixel, a sub-pixel positional value of x and y is created for each frame, which may then be compared with the next frame.
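To make the idea of a per-pixel vector map concrete, here is a minimal sketch in Python with NumPy. It is emphatically not the production software: a brute-force block-matching search only recovers integer displacements, where real solvers (and the film's tools) work to sub-pixel accuracy, but it shows how comparing a frame with the next yields a motion vector for each pixel.

```python
import numpy as np

def block_matching_flow(prev, curr, patch=3, search=4):
    """For each pixel, find the integer (dx, dy) that best matches a small
    patch from `prev` somewhere inside `curr`. Illustrative only: real
    optical flow solvers estimate sub-pixel displacements."""
    h, w = prev.shape
    flow = np.zeros((h, w, 2))  # the per-pixel (dx, dy) vector map
    r = patch // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = prev[y - r:y + r + 1, x - r:x + r + 1]
            best, best_err = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if r <= yy < h - r and r <= xx < w - r:
                        cand = curr[yy - r:yy + r + 1, xx - r:xx + r + 1]
                        err = np.sum((ref - cand) ** 2)
                        if err < best_err:
                            best_err, best = err, (dx, dy)
            flow[y, x] = best
    return flow

# Synthetic frame pair: a bright square moves 2 px right and 1 px down.
prev = np.zeros((20, 20)); prev[5:9, 5:9] = 1.0
curr = np.zeros((20, 20)); curr[6:10, 7:11] = 1.0
flow = block_matching_flow(prev, curr)
print(flow[6, 6])   # → [2. 1.], the vector at a pixel inside the square
```

Note that flat, featureless regions are ambiguous for any matching scheme; real footage supplies the texture, light, and shadow detail that makes per-pixel tracking reliable.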
In effect, this tracks the movement of every element within a composition. Some of the potential applications of this technical breakthrough are tracking, motion blur, retiming, and dirt removal or removal of objects within a shot. However, it was the ability to track elements within the composition, like shadows and light, that interested me the most. If I could attach paint strokes to these moving elements so that they moved like live footage, then I could achieve the desired effect.
Our aim was to find a way that we could use location-based footage with the actors in their real environment, and then deal with the visual effects at a later stage.
Our technical process to achieve motion painting is as follows:
Initially we shot the scenes naturally, with strategic placement of orange markers to assist in the recreation of a 3D camera move later on. The actor was then removed from the footage by rotoscope and the hole filled by merging the surrounding optical flow map.
Now we could vector map the background to track the movement of the individual pixels. These pixels could effectively be tracked by vector movement, light, shadow, tone, and colour.
Simultaneous to the use of optical flow, we also adapted another technology for film use. It had been used for mapping pipelines in three dimensions to ensure there were no cracks or faults in the pipes. But now it was used to map the environment we filmed in.
A secondary crew scanned the filming location using this technology, known as LIDAR (Light Detection and Ranging), which measures the properties of scattered laser light to map an environment.
The LIDAR scan gathered topographical information and presented the location as a 3D space, helping us recreate a believable three-dimensional environment. As an example, if you scanned a tree out on location, you could then move into any part of that tree on the computer, down to fairly small branches. Using this information, you could also move around the tree in three dimensions.
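At its simplest, such a scan can be thought of as a large array of XYZ samples that can be queried at any position, which is what "moving into any part of the tree" amounts to. The following Python/NumPy sketch is a loose illustration of that idea, with invented data and names, not the production LIDAR pipeline:

```python
import numpy as np

def points_near(cloud, centre, radius):
    # A scan, reduced to its simplest form, is an (N, 3) array of XYZ samples.
    # Brute-force distance query; real pipelines would use a spatial index
    # such as a k-d tree over millions of points.
    dist = np.linalg.norm(cloud - centre, axis=1)
    return cloud[dist <= radius]

# Toy "scan": a vertical trunk of samples plus three offset branch samples.
trunk = np.column_stack([np.zeros(200), np.zeros(200),
                         np.linspace(0.0, 5.0, 200)])
branch = np.array([[1.0, 0.0, 3.0], [1.1, 0.1, 3.0], [0.9, -0.1, 3.1]])
cloud = np.vstack([trunk, branch])

# Query around the branch: only its three samples fall within the radius.
near_branch = points_near(cloud, np.array([1.0, 0.0, 3.0]), 0.3)
print(len(near_branch))   # → 3
```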
The natural footage was then combined with the LIDAR information and composited with the background matte paintings.
Now any 3D camera move could be recreated using the orange markers from the live action filming and the three-dimensional digital LIDAR information it had been married to.
From these layers of information it was time to create the illusion of a painting in motion in three dimensions whilst still making use of the original live action footage.
The information gathered in our optical flow analysis of the original footage gave us a complete vector map of the movement of each pixel in our sequence. We used this information to attach individual brushstrokes to the existing natural footage, so that they then moved in sync with our live elements. We continued to paint over the original footage until the entire sequence began to appear as if it were a painting in motion. Finally, the actor was dropped back into the sequence.
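The brushstroke-attachment step can be sketched in the same illustrative spirit, again as hypothetical Python/NumPy rather than the tools used on the film: each stroke's anchor point is carried forward frame by frame by sampling the flow field at its sub-pixel position, so the stroke drifts with whatever the flow is tracking.

```python
import numpy as np

def advect(points, flow):
    """Carry brushstroke anchor points (x, y) forward one frame by
    bilinearly sampling a per-pixel flow field of shape (H, W, 2)."""
    h, w, _ = flow.shape
    out = []
    for x, y in points:
        x0 = min(max(int(np.floor(x)), 0), w - 2)
        y0 = min(max(int(np.floor(y)), 0), h - 2)
        fx, fy = x - x0, y - y0
        # Bilinear interpolation of the four surrounding flow vectors
        # gives a sub-pixel motion estimate at the anchor position.
        v = (flow[y0, x0]         * (1 - fx) * (1 - fy) +
             flow[y0, x0 + 1]     * fx       * (1 - fy) +
             flow[y0 + 1, x0]     * (1 - fx) * fy       +
             flow[y0 + 1, x0 + 1] * fx       * fy)
        out.append((x + v[0], y + v[1]))
    return out

# Uniform flow of (+0.5, +0.25) px per frame: every stroke drifts with it.
flow = np.full((10, 10, 2), (0.5, 0.25))
strokes = [(2.0, 3.0), (4.5, 6.5)]
for _ in range(4):              # advect through four frames of the sequence
    strokes = advect(strokes, flow)
# Each anchor has now moved by (2.0, 1.0) overall.
```

In practice the flow field changes every frame and each stroke carries orientation, size, and colour as well as position, but the principle is the same: the live footage drives the paint.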
Effectively we used our initial live footage like a painter would a pencil sketch (or draft outline), onto which we then layered our digital painting. So, using this analogy, the creative use of optical flow technology allowed us to ‘paint over this live action sketch’ and create an artwork in motion. An artwork that was complicit with, yet not constrained by, the movement of the natural world.
We’re interested in any new developments in this area, so any comments would be most welcome in our guestbook.
SOME RECENT DEVELOPMENTS IN THE USE OF OPTICAL FLOW:
Nick Brooks tells me that he and Pierre have recently completed work on a mini-series called Dreamkeeper (an Emmy winner). They developed a new version of optical flow to create the Native American “Cloud World”. It’s the same idea used in the What Dreams May Come technique, but with some enhancements.
They shot stereo (a single camera with a stereo lens) and used a new version of optical flow to motion-analyse the footage from both lenses, then cast the image’s pixels into 3D and added turbulence and other effects.
The last stage was colour correction (which is a little too intense for my taste), after which it was all combined back with 3D clouds and some of Pierre’s custom image filters. Nick says, “it’s nice stuff to work with and be creative. There is still a ton of interesting work to be done in this area.”
- Written by Vincent Ward, dated 20th January, 2009