Ever wondered what it would be like to take photos at zero gravity? This week, cinemagraph expert and space buff Armand Dijcks explains the challenges of capturing timelapses from the International Space Station and why NASA’s footage lends itself perfectly to cinemagraphs.
I’ve recently been working on a series of 4K cinemagraphs that are, quite literally, out of this world. They were created from images captured by the crew of the International Space Station. In recent years I’ve been fortunate to be able to do some work for Dutch astronaut André Kuipers, who visited the ISS twice. During his second mission, he and his crewmates captured almost half a million images of planet Earth.
Having worked with this vast image library to create time lapse segments, I thought they would provide a really great subject for cinemagraphs as well. In contrast to a time lapse film, a cinemagraph allows you to take in the view and gaze at it for as long as you like. To make this happen, I had to overcome a few challenges, but we’ll get to that later. First, let me take you on a virtual trip to the space station and have a look at how these amazing images are captured.
Photography in space
The ISS is equipped with an array of professional DSLR cameras and lenses large enough to stock a small camera store. Fortunately for the astronauts, a pro-level camera body with a hefty telephoto lens attached weighs nothing in space and can easily be used for hand-held shooting. But that’s about the only advantage for space-based photographers.
Outside the atmosphere, the range of light conditions is extreme, from the deep black of space to blinding, unfiltered sunlight. This makes capturing time lapse sequences very tricky, as the light changes constantly during each 90-minute orbit around the Earth.
Furthermore, the cameras, just like the space station crew, are exposed to radiation – high-energy particles that can damage individual pixels in the sensor. After a camera has been in space for a while, its sensor is riddled with hot pixels and dead pixels – pixels that are either permanently “on”, or no longer work at all. The interesting thing is that hot pixels look almost like stars, so they’re not even that obvious in images of the sky – until you start compiling them into a time lapse. Suddenly, half the stars start moving (the real ones) while the other half stay in place (the hot pixels).
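That difference – real stars drift from frame to frame while hot pixels stay put – is exactly what makes them easy to detect automatically. Here’s a rough sketch in Python with NumPy (an illustration of the idea, not part of any actual NASA or Lightroom pipeline): a pixel that stays bright in every single frame of a sequence never went dark, which real stars, sliding across the sensor, always do.

```python
import numpy as np

def find_hot_pixels(frames, threshold=200):
    """Flag pixels that stay bright in every frame of a sequence.

    Real stars drift between frames as the ISS moves, so their light
    never sits on one pixel for long; a hot pixel is stuck "on" at
    the same coordinates throughout.

    frames: array of shape (n_frames, height, width), 8-bit grayscale.
    Returns a boolean mask of shape (height, width).
    """
    frames = np.asarray(frames)
    # A pixel whose *minimum* value over the whole sequence is still
    # above the threshold never went dark -- the telltale sign.
    return frames.min(axis=0) > threshold
```

The same logic explains why hot pixels hide so well in a single still: with only one frame, there is no motion to compare against.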
The ISS traverses the planet at a decidedly zippy 17,000 mph (almost 28,000 km/h), which means you have to use very fast shutter speeds to keep things sharp. On the night side of Earth, that can mean a fine balancing act between shutter speed and ISO, keeping the images both relatively sharp and noise-free.
The images shot by the astronauts are transferred to Earth more or less right away. Even though the ISS has surprisingly decent bandwidth, the crew of mission 30/31, whose images I’m using, were politely requested by NASA to go easy on the time lapse shooting because of the excessive amounts of data they were producing.
Putting it all together
The workflow for creating the cinemagraphs was relatively straightforward, but there were some specific challenges. One of them was wading through hundreds of thousands of raw images to identify usable time lapse sequences. Not every time lapse sequence makes for a good cinemagraph: very obvious changes in light or terrain make it hard to find a good loop, and distract from the experience. Another unexpected challenge was the space station’s moving solar arrays, which sometimes block part of the view and are very hard to loop convincingly.
Once a sequence was selected, my first step was to do some basic raw processing of the original images, mainly adjusting levels, highlights and shadows. I wanted to keep the look as natural as possible. Software like Lightroom automatically recognizes hot or dead pixels and removes most of them, but in some cases extra tweaking of the raw settings was required because of extreme amounts of noise.
The processed images then get compiled into a full resolution time lapse sequence, which I import into Final Cut Pro X for resizing, cropping, and some subtle color correction if need be. The original resolution of the images is almost 5K, which results in a very nice and crisp image when reduced to 4K resolution or, in some cases, leaves room for a bit of creative cropping.
Whoah, not so fast!
The most important step, however, was to slow down the tempo of the time lapses. You may have seen the various time lapse compilations people have created from ISS footage, and usually it looks as if planet Earth is zooming by at warp speed. The images, often captured at a rate of 1 per second, are played back at 24 or 30 frames per second, making things move 24 or 30 times as fast as in reality. For these cinemagraphs I wanted to give the viewer an experience closer to what you would actually observe from the space station. This meant dramatically slowing the time lapses down to a fraction of their original speed. I did this in Final Cut Pro using the “optical flow” setting, which interpolates between existing frames to create new ones. In some cases I managed to get the speed down to real time, which meant the software sometimes had to make up over 30 frames between each pair of existing ones. In other cases this resulted in horrible morphing effects, making the results unusable.
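To make the arithmetic concrete: footage shot at 1 frame per second but played back at 30 fps runs 30× too fast, so real-time playback needs 29 brand-new frames between every captured pair. The crudest way to synthesize them is a straight linear blend, sketched below in Python with NumPy. Final Cut Pro’s optical flow is far more sophisticated – it tracks pixel motion rather than fading between frames – but this shows the basic idea of manufacturing in-between frames, and hints at why the synthesized frames can go wrong.

```python
import numpy as np

def interpolate_frames(a, b, n_new):
    """Synthesize n_new intermediate frames between frames a and b
    by simple linear blending.  (Optical flow instead estimates where
    each pixel is *moving* and warps along that path, which looks far
    better but can produce the morphing artifacts mentioned above.)"""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Sample blend weights strictly between 0 and 1, excluding the
    # endpoints, which are the original frames themselves.
    return [a + (b - a) * t for t in np.linspace(0, 1, n_new + 2)[1:-1]]

# 1 frame/s capture played at 30 frames/s in real time:
# 29 synthesized frames per captured pair.
new_frames_per_pair = 30 - 1
```

Longer capture intervals push that count even higher, which is where the “over 30 frames between each pair” comes from.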
After slowing it down, the footage has a much more zen-like quality, which works very well for cinemagraphs. At this point I exported the various shots into Cinemagraph Pro and tried to find a good loop. This was surprisingly difficult for some footage that contained cloud patterns, so I used a long overlap to blend the beginning and end of the loop. A sequence of the Aurora Australis on the other hand was surprisingly easy to loop. The auroras turn out to form naturally repeating patterns, almost as if they were made for cinemagraphs.
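The long-overlap trick can be sketched in a few lines. This is my own minimal illustration of the general technique, not Cinemagraph Pro’s actual algorithm: the tail of the sequence is cross-faded into its head, so the final frame flows seamlessly back into the (now blended) first frame when the loop restarts.

```python
import numpy as np

def crossfade_loop(frames, overlap):
    """Blend the tail of a sequence into its head so it loops cleanly.

    frames: array of shape (n, height, width); overlap: how many
    frames to blend.  A longer overlap hides slow, unrepeatable
    changes (like drifting cloud patterns) at the cost of some
    ghosting.  Returns a shorter sequence that loops seamlessly.
    """
    frames = np.asarray(frames, dtype=float)
    body = frames[: len(frames) - overlap].copy()
    tail = frames[len(frames) - overlap :]
    # Ramp from pure tail footage at the loop point toward pure head
    # footage over the course of the overlap.
    for i in range(overlap):
        w = i / overlap
        body[i] = w * body[i] + (1 - w) * tail[i]
    return body
```

Naturally periodic motion like the auroras barely needs this blending, which is presumably why those sequences were so easy to loop.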
Check out Armand’s gallereplay profile for more of his cinemagraphs!