
Your smartphone photos are totally fake — and you love it

Night Sight on Google's Pixel, which shoots pictures in the dark, shows how phone cameras have become faketastic.

Google's smartphones now have a special camera feature known as Night Sight, which allows the phone to take photos in very dark situations. The Post's Geoffrey A. Fowler explains how this works, and why this new feature may be the future of photography.

The little camera on this phone has a superpower: It can see things our eyes cannot.

At night for the last few weeks, I've been tramping around dark places taking photos using a new mode on Google's $800 Pixel 3 called Night Sight. Friends in a candlelit bar look as if they brought a lighting crew. Dark streets are flush with reds and greens. A midnight cityscape lights up as though it were late afternoon. It goes way beyond an Instagram filter into you-gotta-see-this territory.

Night Sight is a super step forward for smartphone photography — and an example of how our photos are becoming, well, super fake.

It's true, you don't look like your photos. Photography has never been just about capturing reality, but the latest phones are increasingly taking photos into uncharted territory.

For now, Night Sight is only a mode that pops up in dark shots on Google's Pixel phones. But it's hardly alone: All sorts of phone-makers brag about how awesome their photos look, not how realistic they are. The iPhone's "portrait mode" applies made-up blur to backgrounds and identifies facial features to reduce red eye. Selfies on phones popular in Asia automatically slim heads, brighten eyes, and smooth skin. And most recent phones use a technique called HDR that merges multiple shots to produce a hyper-toned version of reality.
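The HDR technique mentioned above can be sketched in a few lines. This is a toy illustration of the core idea, not Apple's or Google's actual pipeline: the same scene is captured at several exposures, and each pixel of the result is a weighted blend, with mid-tone readings trusted most because they carry the most usable information.

```python
def merge_hdr(exposures):
    """Merge same-sized grayscale frames (pixel values 0.0-1.0).

    Each output pixel is a weighted average of the corresponding
    pixels across exposures; values near pure black or pure white
    get little weight, mid-gray values get the most.
    """
    def weight(v):
        # Triangle weighting: 0 at the extremes, 1 at mid-gray.
        return 1.0 - abs(2.0 * v - 1.0)

    merged = []
    for pixels in zip(*exposures):
        total_w = sum(weight(p) for p in pixels)
        if total_w == 0:
            merged.append(sum(pixels) / len(pixels))
        else:
            merged.append(sum(weight(p) * p for p in pixels) / total_w)
    return merged

# Three "exposures" of a four-pixel scene:
# underexposed, normal, overexposed.
frames = [
    [0.05, 0.10, 0.40, 0.45],
    [0.10, 0.30, 0.70, 0.95],
    [0.30, 0.60, 0.95, 1.00],
]
result = merge_hdr(frames)
```

Real HDR pipelines add tone mapping and per-exposure calibration on top of this, but the blend-by-reliability idea is the heart of it.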

When I recently took the same sunset photo with an iPhone 6 from 2014 and this year's iPhone XR, I was gobsmacked at the difference — the newer iPhone shot looked as though it had been painted with watercolors.

What's happening? Smartphones democratized photography for 2.5 billion people — taking a great photo used to require special hardware and a user manual.

Now, artificial intelligence and other software advances are democratizing the creation of beauty. Yes, beauty. Editing photos no longer requires Photoshop skills. When presented with a scenic vista or smiling face, phone cameras tap into algorithms trained on what humans like to see and churn out tuned images.

Your phone has really high-tech beer goggles. Think of your camera less as a reflection of reality, and more as an AI trying to make you happy. It's faketastic.

Snapping a photo on a phone has become so much more than passing light through a lens onto a sensor. Of course, that hardware still matters and has improved over the last decade.

But increasingly, it's software — not hardware — that's making our photos better. "It is hyperbole, but true," says Marc Levoy, a retired Stanford computer-science professor who once taught Google founders Larry Page and Sergey Brin and now works for them on camera projects including Night Sight.

Levoy's work is rooted in the inherent size limitations of a smartphone. Phones can't fit big lenses (and the sensors underneath them) like traditional cameras, so makers had to find creative ways to compensate. Enter techniques that replace optics with software, such as digitally combining multiple shots into one.

New phones from Apple, Samsung, and Huawei use these software techniques too, but "we bet the ranch on software and AI," Levoy says. This liberated Google to explore making images in new ways.

"Google in terms of software has got an edge," says Nicolas Touchard, the vice president of marketing at DxOMark Image Labs, which produces independent benchmark ratings for cameras. (Whether any of this is enough to help the Pixel win converts from Apple and Samsung is a separate question.)

With Night Sight, Google's software is at its most extreme, capturing up to 15 low-light shots and blending them together to brighten faces, sharpen details, and saturate colors in a way that draws in the eye. No flash goes off — it artificially enhances the light that's already there.
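Why does stacking 15 dim shots help? Each frame's sensor noise is random, so averaging many frames cancels much of it out — the noise shrinks roughly as the square root of the frame count. A toy simulation of a single dim pixel (not Google's actual algorithm, and with made-up brightness and noise numbers) shows the effect:

```python
import random

random.seed(42)

TRUE_BRIGHTNESS = 0.2  # a dim pixel's real value (hypothetical)
NOISE_SIGMA = 0.1      # per-frame sensor noise (hypothetical)

def capture_frame():
    # One noisy reading of the same pixel.
    return TRUE_BRIGHTNESS + random.gauss(0, NOISE_SIGMA)

def stacked_pixel(n_frames):
    # Average n noisy readings; noise falls roughly as 1/sqrt(n).
    return sum(capture_frame() for _ in range(n_frames)) / n_frames

def error(estimate):
    return abs(estimate - TRUE_BRIGHTNESS)

# Average error over many trials: one frame vs. a 15-frame stack.
TRIALS = 200
single_err = sum(error(capture_frame()) for _ in range(TRIALS)) / TRIALS
stacked_err = sum(error(stacked_pixel(15)) for _ in range(TRIALS)) / TRIALS
```

With 15 frames, the stacked estimate lands several times closer to the pixel's true brightness than any single frame does — which is why Night Sight can then safely brighten the result without amplifying grain.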

Anyone who's attempted a low-light shot on a traditional camera knows how hard it is to avoid blurry photos. With Night Sight, before you even press the button, the phone measures the shake of your hand and the motion in the scene to determine how many shots to take and how long to leave the shutter open. When you press the shutter, it warns "hold still" and shoots for up to six seconds.
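That trade-off — steadier hands allow longer individual exposures, shakier hands get more, shorter frames — can be sketched as a simple planner. The thresholds here are invented for illustration; only the up-to-15-frames and roughly-six-second budget come from the article:

```python
def plan_capture(shake, max_total_s=6.0, max_frames=15):
    """Pick a per-frame exposure and frame count from a shake estimate.

    shake: rough hand/scene motion on a 0.0 (tripod-still) to
    1.0 (very shaky) scale. Each frame must be short enough to
    stay sharp, while the whole stack still gathers light within
    the total time budget.
    """
    # Hypothetical limits: 1 s per frame when perfectly still,
    # down to 1/15 s per frame when very shaky.
    per_frame_s = max(1.0 / 15.0, 1.0 * (1.0 - shake))
    frames = min(max_frames, int(max_total_s / per_frame_s))
    return frames, per_frame_s
```

On a tripod this planner takes a few long frames; handheld, it takes many short ones — either way the total capture stays within the budget.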

Over the course of the next second or two, Night Sight divides all its shots into a bunch of tiny tiles, aligning and merging the best bits to make a complete image. Finally, AI and other software analyze the image to pick the colors and tones.
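The tile-based align-and-merge step can be sketched in miniature. This toy version (not Google's implementation) works on one-dimensional "frames": it tiles a reference frame, searches a small window of shifts to line each other frame up with each tile, then averages the aligned tiles:

```python
def best_offset(ref_tile, frame, start, search=2):
    """Find the small shift that best aligns a frame with a reference tile."""
    def cost(offset):
        return sum(
            (ref_tile[i] - frame[start + offset + i]) ** 2
            for i in range(len(ref_tile))
        )
    candidates = [
        o for o in range(-search, search + 1)
        if 0 <= start + o and start + o + len(ref_tile) <= len(frame)
    ]
    return min(candidates, key=cost)

def align_and_merge(frames, tile_size=4):
    """Tile the first frame, align every other frame to each tile,
    and average the aligned tiles into one merged result."""
    ref = frames[0]
    merged = []
    for start in range(0, len(ref), tile_size):
        tile = ref[start:start + tile_size]
        stack = [tile]
        for frame in frames[1:]:
            o = best_offset(tile, frame, start)
            stack.append(frame[start + o:start + o + len(tile)])
        for i in range(len(tile)):
            merged.append(sum(t[i] for t in stack) / len(stack))
    return merged
```

Aligning per tile, rather than per frame, is what lets the real pipeline cope with hand shake and moving subjects: different parts of the scene can shift by different amounts between shots.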

Night Sight had some difficulty with focus and in scenes with almost no light. And you — and your subject — really do have to hold that pose. But in most of my test shots, the product was fantastical. Portraits smoothed out skin while keeping eyes looking sharp. Night landscapes illuminated hidden details and colored them like Willy Wonka's chocolate factory.

The problem is: How does a computer choose the tones and colors of things we experience in the dark? Should it render a starlit sky like dusk?

"If we can't see it, we don't know what it looks like," says Levoy. "There are a lot of aesthetic decisions. We made them one way, you could make them a different way. Maybe eventually these phones will need a 'What I see' vs. 'What is really there' button."

So if our phones are making up colors and lighting to please us, does it really count as photography? Or is it computer-generated artwork?

Some purists argue the latter. "This is always what happens with disruptive technology," says Levoy.

What does "fake" even mean, he asks. Pro photographers have long made adjustments in Photoshop or a darkroom. And before that, makers of film tweaked colors for a certain look. It might be an academic concern if we weren't talking about the hobby — not to mention the memories — of a third of humanity.

How far will phones remove our photos from reality? What might software train us to think looks normal? What parts of images are we letting computers edit out? In a photo I took of the White House (without Night Sight), I noticed the Pixel 3's algorithms, trained to smooth out imperfections, had actually removed architectural details that were still visible in a shot from the iPhone XS.

At DxOMark, the camera measurement firm, the question is how to even judge images when they're being interpreted by software for features like face beauty.

"Sometimes manufacturers are pushing too far. Usually, we say it is OK if they have not destroyed information — if you want to be objective, you have to consider the camera a device that captures information," says Touchard.

For another perspective, I called Kenan Aktulun, the founder of the annual iPhone Photography Awards. Over the last decade, he's examined over a million photos taken with iPhones, which entrants are discouraged from heavily editing.

The line between digital art and photography "gets really blurry at some point," Aktulun says. Yet, he ultimately welcomes technological improvements that make the photo-creating process and tools invisible. The lure of smartphone photography is that it's accessible — one button, and you're there. AI is an evolution of that.

"As the technical quality of images has improved, what we are looking for is the emotional connection," Aktulun says. "The ones that get a lot more attention are not technically perfect. They're photos that provide insight into the person's life or experience."