Here’s Why Your Phone Can’t Capture California’s Apocalypse Sky

And what it says about the problems with trusting your smartphone camera

Photos courtesy the author

If you live on the West Coast and woke up yesterday morning to our aggressively orange, smoke-tinged, apocalypse sky, you may have thought “Wow, this needs to be on Instagram.”

But when you stepped outside with your phone to capture some pics to scare your East Coast friends, you were probably disappointed. A sky that appeared horrifically Martian in reality looked washed out and white-ish on your phone.

Why is that? The reason comes down to how your phone captures images. But even deeper than that, it comes down to something physical and fundamental: the color of white light.

If I asked most people what color white light is, they’d probably look at me funny and say “white.” But in reality, white light is never just “white.” It has a color temperature, measured in Kelvin, which dictates the light’s exact hue.

According to camera maker Olympus, color temperature is defined as the “relationship between the temperature of a theoretical standardized material, called a black body radiator, and the energy distribution of its emitted light as the radiator is brought to increasingly higher temperatures, measured in Kelvin (K).”

If that makes zero sense to you, don’t worry. What you need to know is that white light shifts from looking reddish at lower color temperatures to bluish at higher ones. A deep, reddish candle flame sits around 2000K, whereas light reflected off blinding white snow is closer to 8000K.
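If you want to see that scale for yourself, here’s a short Python sketch using a well-known curve-fit approximation (Tanner Helland’s) that maps a color temperature in Kelvin to an approximate RGB color. The constants are empirical fits, not physics, so treat the output as illustrative:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate the RGB color of light at a given color temperature,
    using Tanner Helland's curve-fit approximation (valid ~1000K-40000K)."""
    t = kelvin / 100.0

    # Red saturates below ~6600K, then falls off
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green rises logarithmically below ~6600K, falls off above
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue is absent below ~2000K and saturates above ~6600K
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    def clamp(v):
        return int(max(0.0, min(255.0, v)))

    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(2000))  # candlelight: deep orange, e.g. (255, 136, 13)
print(kelvin_to_rgb(5500))  # daylight: near-neutral white
print(kelvin_to_rgb(8000))  # snow/shade: noticeably blue-tinged
```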

Color temperature is also why the light in a hospital looks different from the light in your living room. The bright, bluish lights used in commercial facilities are generally around 5000K. The warm, inviting yellow LED lightbulbs in your home (which are made to mimic the light of an old-school incandescent bulb) are around 2700K.

Humans are pretty good at adjusting to changes in color temperature. The color of light impacts us emotionally (try buying a 5000K LED bulb and putting it in your bedroom, and you’ll see what I mean), but normally we barely notice that white light has different colors.

Cameras, though, are a different story. By default, they don’t know how a scene is supposed to look, so color temperature can affect photographs in dramatic ways. If a camera’s color temperature setting is out of sync with the color temperature of the scene, you can end up with weird hues and color casts that make photos look unnatural.


Traditional analog films are “balanced” for a specific color temperature. Popular portrait films like Kodak Portra, for instance, are usually balanced for “daylight,” or around 5500K. But a handful are “tungsten” balanced (around 3200K) so they can be used indoors or at night. With analog film, the color balance doesn’t change — the photographer chooses a film based on what they plan to shoot.

Digital cameras, on the other hand, can adjust their color temperature to match a variety of scenes. This process is called “white balance.” You’ve probably seen references to white balance if you’ve ever edited a photo in Photoshop, set up a webcam, or used a DSLR camera.

Here’s the issue, though. Today’s smartphones make liberal use of automatic image processing to help you get the best image out of your iPhone or Android device. In part, this is to compensate for the limitations of a phone’s tiny lenses and minuscule image sensors. Without you even knowing it, the software on your phone is performing trickery like layering multiple exposures together to create an image that’s neither too dark nor too light, fixing blurriness in low light, and more.
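If you’re curious what that exposure-layering trick looks like outside a phone, here’s a minimal sketch using OpenCV’s built-in Mertens exposure fusion. This isn’t Apple’s or Google’s actual pipeline (those are proprietary), and the filenames are placeholders for three bracketed shots of the same scene:

```python
import cv2
import numpy as np

# Three bracketed exposures of the same scene (placeholder filenames)
exposures = [cv2.imread(p) for p in ("dark.jpg", "mid.jpg", "bright.jpg")]

# Mertens exposure fusion blends the best-exposed parts of each frame
merged = cv2.createMergeMertens().process(exposures)

# The result is float32 in roughly [0, 1]; convert back to 8-bit and save
cv2.imwrite("fused.jpg", np.clip(merged * 255, 0, 255).astype(np.uint8))
```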

In most cases, your phone’s software is also automatically setting the white balance on your photos. The iOS camera, for example, defaults to choosing a white balance setting for you. Normally, that’s great. It means that the photos of your new baby that you took under harsh hospital lighting come out looking natural, as do your dark, moody shots of a friend walking through narrow, neon-lit city streets at night.

But when conditions aren’t normal, these automatic settings can fail in a big way. When your iPhone sees the apocalyptic orange skies currently plaguing the Bay Area, for instance, it assumes you must be shooting indoors under a weird-colored incandescent light. “Here, let me fix that for you!” the software helpfully thinks, adjusting the white balance to even out the colors, removing the orange tinge from the sky, and making the scene look more normal.
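To see how an all-orange scene trips up automatic correction, here’s a toy Python sketch of the “gray world” heuristic, one of the oldest and simplest auto white balance algorithms. Your phone’s real pipeline is far more sophisticated, but the failure mode is the same:

```python
import numpy as np

def gray_world_white_balance(image):
    """Toy auto white balance using the 'gray world' assumption:
    the average color of a typical scene should come out neutral gray.
    `image` is a float array of shape (H, W, 3) with values in [0, 1]."""
    means = image.reshape(-1, 3).mean(axis=0)  # average R, G, B over the frame
    gains = means.mean() / means               # per-channel gains that force gray
    return np.clip(image * gains, 0.0, 1.0)

# A synthetic "apocalypse sky": every pixel the same deep orange
sky = np.tile(np.array([0.9, 0.45, 0.1]), (100, 100, 1))

balanced = gray_world_white_balance(sky)
print(sky[0, 0])       # [0.9  0.45 0.1 ] -- orange
print(balanced[0, 0])  # ~[0.48 0.48 0.48] -- "corrected" to flat gray
```

Because the heuristic assumes the scene averages out to gray, a scene that genuinely is orange wall-to-wall gets scrubbed back to neutral. That is, in caricature, what happened to everyone’s smoke photos.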

The whole point, though, is that the scene is not normal. By trying to normalize it automatically, your phone ruins what would otherwise be a unique, compelling photo.


There are a few ways around this. One is to download a photo app that gives you manual control over the white balance of your phone’s camera, and set it to capture how the sky actually looks. In an SFGate article, Maurice Ramirez, the official photographer for the town of Alameda, recommends Camera Plus or ProCam.

You can also set your phone to take RAW photos, which capture all the detail in a scene without making adjustments, and then edit them later using an app like Adobe’s professional-grade Lightroom. Here, for instance, is a RAW photo from my Android phone taken today, with Auto white balance switched on. As you can see, the Auto setting makes the photo look nice and normal.

Auto white balance in Lightroom

Switching this to “Cloudy” adjusts the white balance so that a cloudy sky would look normal. Since today’s cloudy sky does not look normal, what you end up with instead is a properly balanced photo that captures the way the sky really looks right now — not quite as horrifically orange as yesterday, but still a weird straw-yellow.

Cloudy white balance
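If you’d rather script that comparison than click through Lightroom, the rawpy library (a Python wrapper around LibRaw) can develop the same RAW file both ways. This is a minimal sketch under assumptions: sky.dng is a placeholder filename, and the fixed channel multipliers are illustrative values, not a calibrated “Cloudy” preset:

```python
import rawpy
import imageio.v3 as iio

PATH = "sky.dng"  # placeholder: a RAW file from your phone or camera

# Auto white balance: LibRaw guesses the illuminant, much like the
# phone's default pipeline, and "normalizes" the scene
with rawpy.imread(PATH) as raw:
    auto = raw.postprocess(use_auto_wb=True)

# Fixed white balance: explicit (R, G1, B, G2) channel multipliers;
# these values are illustrative, chosen to keep a warm cast
with rawpy.imread(PATH) as raw:
    fixed = raw.postprocess(user_wb=[2.0, 1.0, 1.5, 1.0])

iio.imwrite("sky_auto.jpg", auto)
iio.imwrite("sky_fixed.jpg", fixed)
```

Opening sky_auto.jpg and sky_fixed.jpg side by side reproduces the Lightroom comparison above: the auto version looks “normal,” while the fixed version keeps the color that was actually there.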

Of course, you could also just use a real camera. Professional cameras almost always offer RAW capture, so they don’t have to read their own interpretations into a scene. And they have good enough optics that they don’t have to cheat by layering exposures, slowing down shutter speeds, or pulling other trickery to get a usable shot.

It might seem crass to worry about something like color temperature in the middle of a major disaster. But to me as a photographer, it brings up important questions about veracity that extend well beyond the current moment.


If we all go to document this disaster, and our phones literally won’t let us do so because the resulting photos look too “abnormal,” what does that say about the veracity of the photos we take every day? How often are our phones “normalizing” our photos without us ever noticing, because the change is not as obvious as turning an orange sky white?

The inability to capture weird Martian skies is a passing annoyance right now, but it reveals something deeper and more important about phone cameras that’s normally invisible. Because they have to work so hard to create usable shots, phone cameras aren’t always the best tool to document the world as it really is. Without you even knowing it, they’re reading all kinds of things into the scene in front of you, and making all kinds of choices about how best to capture it.

For “normal” scenes, this can be fine. You don’t want to have to fiddle with a million settings when you’re snapping shots of your kid’s birthday party. But some of the best photos aren’t normal. By automatically normalizing everything you take, phone cameras can inadvertently push you toward a bland mediocrity in your photography, and away from that strange, unbalanced shot that might be the best one you’ve ever taken.

As a photographer, I can recognize when my phone is accurately capturing a scene, and when it’s not. Yesterday, when I set out to document the Bay Area’s skies, I left my phone in my pocket and used a professional mirrorless camera instead, since I knew that my phone would read too much of its own perspective into my photos. I also know how not to get carried away — professional news photographers follow a strict set of standards to ensure that the choices we make about editing our photos are fair and accurate.

If you have the means, consider trying out a professional camera, so you can make these kinds of photographic decisions for yourself, instead of entrusting them to Google’s algorithms. But even if you don’t, try taking the simple step of switching your phone into RAW mode, downloading an editing app, and experimenting with how tweaking different settings allows you to make your own (probably more interesting) choices in capturing a scene.

Today, you can likely get better, more accurate pictures for the ’gram. But once you see the limitations of your phone’s camera, you can also begin to move beyond them — both when the sky is on fire, and on days when things really are more “normal.”

