
At the most recent POV Hackathon, I had the opportunity to help build a prototype for Camp Century, a multimedia documentary by Anrick Bregman and Nicole Paglia about Project Iceworm, a secret American military project from the 1960s in which an Arctic climate science research outpost was built as a cover for a nuclear launch facility. I was teamed up with Bregman, an interactive and film director, and front-end developer Luigi De Rosa. We had two days to conceive, design and build a working, interactive prototype to be presented and published. Here is how we did it, along with some new technology you can take away from it.

Story and Design

We had two days to create a prototype and presentation, so we limited our design goals to fit the time and resources at hand. The project concept included many documents, but we didn't have time to go through them, nor did we have the equipment or personnel to record video or audio. Anrick decided it would be enough to tease the story and introduce the Arctic setting, so we worked only with still photos as our media assets.

The Arctic location of Camp Century is harsh and dramatic, and we thought it warranted a more immersive experience than a simple slideshow. We used real-time image processing to apply effects to the photos in response to the user's mouse movements: a flickering light bulb, animated fog and lateral tracking with simulated depth.

Technology

We used Seriously.js for most of our image processing, since it has a number of effects that served our purposes without modification. The flickering light bulb was done by adjusting the exposure of the image up and down by about a fifth of a stop. We used simplex noise, a variation on the Perlin noise algorithm, to give the flicker natural-looking timing. To save time, we applied the exposure adjustment to the whole image, but the effect could be improved with an image mask so that only the areas facing the light source would be affected.
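To give a sense of how little code this takes, here is a minimal sketch of the flicker, assuming Seriously.js with its exposure effect loaded, plus any simplex-noise implementation that provides a noise2D(x, y) function. The element IDs, timing constants and amplitude are illustrative placeholders, not the project's actual values.

    // Minimal sketch of the flicker pipeline (element IDs, constants and the
    // noise2D() helper are illustrative, not the project's actual code).
    // Assumes Seriously.js with its exposure effect, plus a simplex-noise
    // implementation that provides noise2D(x, y).
    var seriously = new Seriously();

    var photo = seriously.source('#scene-photo');    // still image source
    var exposure = seriously.effect('exposure');     // exposure adjustment, in stops
    var canvas = seriously.target('#scene-canvas');  // output canvas

    exposure.source = photo;
    canvas.source = exposure;

    // Simplex noise gives a smooth, organic flicker instead of random jitter.
    function flicker(time) {
        var n = noise2D(time * 0.002, 0);  // slowly varying value in [-1, 1]
        exposure.exposure = n * 0.2;       // swing of roughly a fifth of a stop
        requestAnimationFrame(flicker);
    }

    seriously.go();                  // start rendering
    requestAnimationFrame(flicker);  // start animating the exposure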

One of the biggest creative challenges was fine-tuning the intensity of each effect so that it was noticeable without distracting from the content of the image or the text overlay. Mouse movement seemed like an appropriate trigger for the effects, because it's an interaction users already employ intuitively to navigate the site, so there was no need to provide text instructions.

A scene from Camp Century and the corresponding displacement map. The interactive documentary uses grayscale “displacement maps” to create depth in static images.

For the lateral tracking, we wanted to do better than a basic pan-and-scan, which lacks parallax, the way nearby objects appear to move more than objects farther away. We achieved this with a displacement map, a grayscale image in which the brightness of each pixel determines how far that pixel moves as the shot tracks left or right. Pixels with bright values in the displacement map move a lot, and those with dark values move very little or not at all. The same technique has been used to convert 2D films to 3D, with displacements generating separate images for the left and right eyes.
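The per-pixel logic is simple enough to sketch in plain canvas code. The version below is an illustrative CPU approximation, whereas the prototype runs its image processing through Seriously.js on the GPU; the function and parameter names here are ours, not the project's.

    // Illustrative CPU sketch of displacement-map parallax.
    // photo and depthMap are same-sized, already-loaded <img> elements,
    // pan is -1..1 (derived from the mouse position) and maxShift is the
    // largest horizontal offset, in pixels, for the nearest objects.
    function renderParallax(ctx, photo, depthMap, pan, maxShift) {
        var w = photo.naturalWidth;
        var h = photo.naturalHeight;

        // Read pixel data from both images via an offscreen canvas.
        var off = document.createElement('canvas');
        off.width = w;
        off.height = h;
        var offCtx = off.getContext('2d');

        offCtx.drawImage(photo, 0, 0);
        var src = offCtx.getImageData(0, 0, w, h);

        offCtx.drawImage(depthMap, 0, 0);
        var depth = offCtx.getImageData(0, 0, w, h);

        var out = ctx.createImageData(w, h);

        for (var y = 0; y < h; y++) {
            for (var x = 0; x < w; x++) {
                var i = (y * w + x) * 4;

                // Brightness of the displacement map: 0 (far) to 1 (near).
                var d = depth.data[i] / 255;

                // Near (bright) pixels shift the most as the shot tracks sideways.
                var shift = Math.round(pan * maxShift * d);
                var sx = Math.min(w - 1, Math.max(0, x - shift));
                var j = (y * w + sx) * 4;

                out.data[i] = src.data[j];
                out.data[i + 1] = src.data[j + 1];
                out.data[i + 2] = src.data[j + 2];
                out.data[i + 3] = 255;
            }
        }

        ctx.putImageData(out, 0, 0);
    }

In a sketch like this, pan would come from the pointer (something like event.clientX / window.innerWidth * 2 - 1), and maxShift controls how strong the parallax feels; keeping it small also hides the gaps a displacement map can leave behind foreground objects.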

There are also ways to capture images with true depth information, such as a depth camera like the Kinect or computer-vision techniques like the Android camera's lens-blur feature. But those approaches require access to the original location, so we had to paint our displacement maps by hand.

View the project »
View the source code »

Let us know what you think of this or how you might use it. Leave a comment below, use the hashtag #povtech or email us at filmmakers@pov.org.

Get more documentary film news and features: Subscribe to POV’s documentary blog, like POV on Facebook or follow us on Twitter @povdocs!

Published by

Brian Chirls is the Digital Technology Fellow at POV, developing digital tools for documentary filmmakers, journalists and other nonfiction media-makers. The position is a first for POV and is funded as part of a $250,000 grant from the John S. and James L. Knight Foundation. Follow him on Twitter @bchirls and on GitHub.