Saturday, September 24, 2016

Photos and Anti-Photos

Simiao holds an inverse egg at the Narrows
in Zion National Park.
We don't give taking a photograph much thought these days.  We take our cell phone out of our pocket, point it at what we want to remember, and press a button.  We don't think about what's going on behind the scenes:  light hits the charge-coupled device (CCD) for some amount of time, and at the end the image records how many photons struck each part of the sensor over that period.  The world doesn't actually stand still for the CCD, though.  Time still passes, things move, life goes on.  The CCD, therefore, is actually reporting an average of what it saw while it was counting.  This shows up as motion blur, or as ruined photos if the photographer happened to be moving quickly while taking the picture.  When one takes a typical long exposure, this becomes even more pronounced because the transient things tend to disappear:  you get a picture of a city street with no cars, or a sidewalk with no pedestrians, because the street and the sidewalk spent more time in the frame than the cars or the people.
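To make that averaging concrete, here is a toy sketch in NumPy (the names and numbers are my own, purely for illustration) showing how a bright transient object fades out of a simulated long exposure:

```python
import numpy as np

def long_exposure(frames):
    """Average a stack of frames, as a sensor effectively does over
    its exposure time: transient objects fade, static ones remain."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# A bright "pedestrian" pixel present in 1 of 10 frames is
# averaged down toward the static background (255 / 10 = 25.5).
frames = [np.zeros((2, 2)) for _ in range(10)]
frames[0][0, 0] = 255.0   # transient bright object in the first frame only
result = long_exposure(frames)
```

The longer the transient object is absent from the stack, the closer its pixel value falls to the static background.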

What if we were able to take a picture of just the things that changed, and not the things that didn't?  You can perhaps imagine a landscape where you can see the clouds going by but not the mountains, where you could see the river going by but not the shore, where you could see the leaves blowing in the wind but not the tree trunks, and where you could see the people playing soccer but not the field.

This is what I call an anti-photo.

Part 1:  Solargraphs and Time-Lapse Video


This entire process can be done with a camera and a timer, but at the start of this project I actually just wanted something to do with old Android cell phones.  It turns out that an old cell phone is a great platform for taking long exposures and time lapses:  it's basically disposable because I wasn't doing anything with it anyway, it has a decent camera and a decent amount of storage space, and it is easy to set up for time-lapse shooting.

In order to take time lapses with an Android cell phone, I installed the excellent, free program Time and Tide - Lapse for Camera.  It takes a photograph at a configurable time interval and gives you some essential control over the camera settings:  for instance, if you intend to make a synthetic solargraph or anti-photo, you will probably want to turn off the flash.  You will also want to set the focus to infinity, because otherwise the frame will subtly change over the time-lapse and lead to a blurry finished product once you composite the images.  It is also worthwhile to set the ISO to a constant value (100, say) so that the same brightness in the image always corresponds to the same brightness of light.

Finally, simply set your camera up somewhere that it won't get bumped and let Time and Tide - Lapse run.  Once you have taken the desired series of photos, transfer them to your computer and the fun can begin!  It is straightforward to composite these images into a video in Linux using, for instance:

avconv -i %08d.JPG output.mkv

Next I wanted to make synthetic solargraphs from these collections of short exposures.  The solargraphs I made in previous posts are different from simply averaging the pixels in each photo in the series:  if I were to average each pixel, then a bright pixel in one photo would eventually be darkened if it was dark in subsequent photos.  In a solargraph, a pixel, once exposed, will never darken, only brighten with repeated exposure.  Thus, I wrote a special piece of code to sum up each pixel and then normalize the result to the brightest pixel in the series.  Here, for example, is a solargraph composite of the final video shown above:
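The core of that summing-and-normalizing step fits in a few lines of NumPy.  This is an illustrative sketch, not the exact code from my program:

```python
import numpy as np

def solargraph(frames):
    """Sum every pixel across the series, then normalize so the
    brightest accumulated pixel maps to full scale (255).  Unlike
    an average, a pixel can only brighten with repeated exposure,
    never darken."""
    total = np.zeros_like(frames[0], dtype=np.float64)
    for frame in frames:
        total += frame
    return (total / total.max() * 255.0).astype(np.uint8)
```

A pixel that was bright in even one frame keeps its contribution forever; the normalization just rescales everything relative to the brightest spot in the whole series.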

A synthetic solargraph taken out the window of the old Google Chicago office.  Note how the process tends to sharpen the otherwise low-quality images coming from the repurposed Android phone.

The program that generated these images is available for your use.  It turns out that it is actually the same program that generates the anti-photos in the next section!

This is the point where the original intent of this post went awry.  It turns out that the re-purposed Android phones were great for doing traditional long exposures by taking many pictures and then integrating them with my solargraph program.  However, anti-photos were trickier:  not only were the Android phones' cameras very noisy compared to a modern DSLR or mirrorless camera, but the compression artifacts in the JPEG images they took added noise down to the single-pixel level, which made the resulting anti-photos unacceptable.

Part 2:  Anti-photos


It's now relevant to discuss why the same program generates both solargraphs and anti-photos, and how these two things are actually similar in some ways.

In a solargraph, every single pixel in a series of photos has its value added to a running total, then a normalization happens at the end.  In an anti-photo, we only add pixels that have changed sufficiently from the previous photo to the running total, then normalize at the end.  The definition of "changed sufficiently" is left open to interpretation:  it is chosen high enough that, between consecutive photos, the unchanged parts of the image are thrown out.  If "sufficient change" were set too high, though, every picture would come out black because no pixel could ever change enough to be accepted.  Generally, the higher "sufficient change" is defined to be, the darker the image will become.

Interestingly, if the definition of "changed sufficiently" is set to "no change", an anti-photo actually becomes a solargraph because every pixel is always included!  So, the same program is actually capable of doing both solargraphs and anti-photos.
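That unified algorithm can be sketched in NumPy like so (illustrative only; the function and parameter names are my own, not those from the repository):

```python
import numpy as np

def anti_photo(frames, threshold):
    """Accumulate only pixels that changed by at least `threshold`
    from the previous frame, then normalize to the brightest total.
    With threshold=0 every pixel always qualifies, so the result
    degenerates into a plain solargraph sum."""
    total = np.zeros_like(frames[0], dtype=np.float64)
    prev = frames[0].astype(np.float64)
    for frame in frames[1:]:
        cur = frame.astype(np.float64)
        changed = np.abs(cur - prev) >= threshold   # mask of "moving" pixels
        total[changed] += cur[changed]              # static pixels stay dark
        prev = cur
    if total.max() > 0:
        total = total / total.max() * 255.0
    return total.astype(np.uint8)
```

Static parts of the scene never clear the threshold, so they never accumulate brightness and stay black in the final composite.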

Let's look at some examples!
A very early version of an anti-photo--I like that you can see the people walking down the sidewalk and the cars driving by, but there were a few issues with this series as well.  The compression noise at the edges of objects means that buildings are much more pronounced than they would be if I had shot in RAW mode.  The overall brightness of the frame also changed substantially over time, so you can see bright patches in the reflections on the buildings.  Further, the intended subjects--the people and the cars--were not well-centered in the shot.

Melting ice in three different glasses, illuminated by candlelight.  You can see the shape of the surface of the ice cubes as they shrank and melted.
Little fish swimming around in a stream.  The fish are dark-colored, so the anti-photo revealed the bright stream bed rather than the dark fish.

The Chicago skyline in broad daylight.  During the series of photos, a cloud obscured the sun.  So, only the parts of the city directly illuminated by the sun (as opposed to illuminated by reflected light) survived the anti-photo procedure.  The net result?  Using the sun as the world's largest flash-bulb.

So, how do you take a good anti-photo, and how can you make your own?  The basic technique is simple.  Set your camera up on a tripod with a timer, then set the timer to take a picture periodically--perhaps every minute in the case of melting ice, or every second in the case of swimming fish.  Choose an interval that makes sense based on the rate at which your subject is moving.  Set the camera to have a constant shutter speed, a constant ISO, and set it to manual focus so that you always get the same frame and illumination.  Be sure to shoot in RAW mode, as the compression noise from lossy encoding like JPEG will add a significant amount of unwanted noise to the anti-photo algorithm.  As few as 30 photos can be assembled to make a great anti-photo.

The above techniques will go a long way towards taking an appropriate series of photos to make a good anti-photo, but I picked up a few extra tips from repeated practice:
  • Be sure to select a scene that, as much as possible, is uniformly illuminated over the series of photos.  Otherwise, you will pick up the changes in illumination as opposed to the change in subject.
  • Landscapes don't tend to make great anti-photos (the skyline above being a rare exception).  Try to select a subject that is the only thing changing in the frame, such as the fish or the ice.
  • The normal rules of framing a photo don't make sense for anti-photos.  Try to think about how the subject is moving and frame that motion, not the scene at large.  

Using my program to make an anti-photo is straightforward.  See the GitHub repository for a download link and instructions!

If you're into photography, I hope you will decide to try your hand and your eye at seeing the world's motion instead of seeing the world as a static image and make your own anti-photos.  When you do, please let me know what you come up with--I'll be happy to add some links to your work here!