One of my favorite shots with the new iPhone XS Max shows what the new Smart HDR tech can really do. Stan Horaczek


Learning the art of photography has always involved at least a little math. Whether it’s the relatively straightforward calculations involved in figuring out your exposure settings or the nearly inscrutable jumble of numbers required to navigate the settings on an old-school flash, there were always numbers behind the process. What the hell is a foot-candle anyway?

Modern cameras, however, involve even more mathematics than anything from the past. Apple’s new iPhone XS Max camera is doing “trillions” of computations, but you, the shooter, never really see any of them. In fact, it’s only thanks to a new feature (which has nothing to do with the actual mechanics of photography) that the iPhone camera even mentions a once-fundamental photographic concept like the f-number, which tells you how much light your lens can let in through its aperture.

I’ve been shooting with the iPhone XS Max camera for a few days, and while it’s certainly one of the best smartphone cameras I have ever used, it took a little work to get my brain to abandon its traditional camera habits and embrace the future of computational photography. We’re in a world where taking one photo really involves taking many in rapid succession and letting a computer cram them together into a single shot. It’s a concept that’s changing what it means to look for “good light.”

A street-style photo takes advantage of the iPhone’s wide-angle camera, which still offers lots of depth of field and focuses quickly thanks to hybrid autofocus pixels integrated into the sensor itself. Stan Horaczek

The tech inside

On the spec sheet, the new iPhone camera doesn’t seem profoundly different from the hardware found inside the original iPhone X. It’s still a dual-camera setup with one wide-angle lens and a secondary telephoto lens that gives you a more zoomed-in field of view.

The imaging sensors that actually catch the light for the image are now slightly bigger than they were in the previous model. Apple doesn’t say exactly how much bigger, but the resulting change has made both cameras slightly wider in terms of field of view. For the camera nerds: The wide lens now acts like a 26mm lens, while the telephoto lens is now just 52mm.

The resolution stays at 12 megapixels, but each pixel on the sensor is now deeper to capture light more effectively, which is important when trying to shoot in low light. That’s an area in which these relatively tiny sensors have always struggled.

The most impactful piece of hardware when it comes to overall camera performance, however, is the new image signal processor on the phone’s A12 Bionic chip, which powers its new Smart HDR tech.

The iPhone keynote explained Apple’s new approach to HDR with every photo. Apple

Computational photography

Every time you take a photo with the iPhone XS Max, you’re actually capturing several images. First, it saves several frames from before the moment of the button press, pulled from a buffer that’s constantly running in the background. Then it takes even more photos, including the main reference frame and an image with a longer exposure time to try to capture extra detail in the shadows. It’s computational photography: the processing engine combines all of that raw image data into the final photo.

The processor then takes that mass of data and crunches it together into a single image file. It’s a far cry from the once-mechanical process of opening a little door in front of a piece of film and letting in light for 1/60th of a second.
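To make that idea concrete, here’s a heavily simplified sketch, in Swift, of what merging a burst of bracketed frames can look like. It is not Apple’s Smart HDR pipeline (which also aligns frames, rejects ghosting from moving subjects, and treats regions like faces differently); the frame format, the weighting, and every value here are illustrative assumptions.

```swift
import Foundation

// A minimal sketch of multi-frame merging. Each frame is assumed to be a flat
// array of linear luminance values plus its exposure time. The basic idea is a
// weighted average that favors well-exposed pixels from each frame.
struct Frame {
    let pixels: [Double]        // linear luminance, 0.0...1.0
    let exposureSeconds: Double
}

func mergeFrames(_ frames: [Frame]) -> [Double] {
    guard let first = frames.first else { return [] }
    var merged = [Double](repeating: 0, count: first.pixels.count)

    for i in 0..<merged.count {
        var weightedSum = 0.0
        var weightSum = 0.0
        for frame in frames {
            let value = frame.pixels[i]
            // Weight pixels near middle gray heavily; crushed shadows and
            // blown highlights contribute almost nothing.
            let weight = 1.0 - abs(value - 0.5) * 2.0 + 0.01
            // Normalize by exposure so the long and short frames agree on scene brightness.
            weightedSum += (value / frame.exposureSeconds) * weight
            weightSum += weight
        }
        merged[i] = weightedSum / weightSum
    }
    return merged
}
```

The point is simply that the “photo” you see is a statistical combination of many exposures rather than a single readout of the sensor.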

The iPhone XS Max does a good job keeping some intense colors in check here. That orange is sometimes challenging, especially under direct light, and can look unnatural. Stan Horaczek

Multi-shot HDR isn’t new in and of itself, and it has been the standard way to shoot with the iPhone since the iPhone X debuted, but now the system looks for discrete elements in the frame, like faces, which are always important, or nebulous edges that could indicate blur. Then it tries to fix them.
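Apple hasn’t published how Smart HDR decides that a region looks soft, but a classic heuristic for that kind of scoring is the variance of a Laplacian filter: sharp edges produce strong, varied responses, while blurry areas produce flat ones. The rough Swift sketch below illustrates the idea; the grayscale 2D-array input and any threshold you would apply to the score are assumptions, not Apple’s method.

```swift
import Foundation

// Score an image (or a crop of one) for sharpness using the variance of a
// 4-neighbor Laplacian. Lower scores suggest a blurrier region.
func blurScore(_ pixels: [[Double]]) -> Double {
    let h = pixels.count, w = pixels.first?.count ?? 0
    guard h > 2, w > 2 else { return 0 }

    var responses: [Double] = []
    for y in 1..<(h - 1) {
        for x in 1..<(w - 1) {
            // Strong response at edges, near zero in flat or blurred areas.
            let lap = pixels[y - 1][x] + pixels[y + 1][x]
                    + pixels[y][x - 1] + pixels[y][x + 1]
                    - 4 * pixels[y][x]
            responses.append(lap)
        }
    }
    let mean = responses.reduce(0, +) / Double(responses.count)
    let variance = responses.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(responses.count)
    return variance
}
```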

The results are images that look bright and vibrant, but if you’re used to images from a traditional camera or a DSLR, they take a little getting used to. The shadows have more detail, but since they’re not as dark, they sometimes also lack the impact of a nice, dark section of the image.

A photo with a bright blue, cloudless sky typically means dark, hard shadows and abundant contrast, but the iPhone brightens up those dark areas to the best of its ability. The reward is lots of detail, but the cost is that sometimes things look more like a screenshot from World of Warcraft than a typical photograph.

Portrait mode now lets you adjust the level of blur in your photos after you shoot them. Stan Horaczek

Portrait mode

The upgraded photographic algorithms in the iPhone XS Max have also improved the Portrait Mode feature, which applies blur to the background of an image to mimic the look of a professional lens with a fast aperture.

We first met Portrait Mode back in the iPhone 7 Plus and it has absolutely come a long way. In fact, now the iPhone XS Max allows you to tweak the amount of blur that you add to the background of your photos after you shoot them. Just like you can brighten up a shot in post, you can now make the background sharper or blurrier depending on your tastes.
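Conceptually, that after-the-fact adjustment works because the blur is synthesized rather than captured: the phone keeps a sharp image plus depth information, and the bokeh is rendered on top of it. Here’s a toy Swift sketch of the idea using Core Image; the pre-made background mask and the single gaussian blur are simplifying assumptions, not how Apple’s renderer actually builds its effect.

```swift
import CoreImage

// A toy sketch of adjustable background blur. Assumes you already have the
// color image and a grayscale mask where white marks the background (Portrait
// mode derives this from depth data and subject segmentation). Re-running this
// with a different radius is, conceptually, what dragging the f-number slider
// does: the sharp capture never changes, only the simulated blur.
func simulatedBokeh(image: CIImage, backgroundMask: CIImage, blurRadius: Double) -> CIImage {
    // Blur a copy of the whole frame.
    let blurred = image
        .clampedToExtent()
        .applyingGaussianBlur(sigma: blurRadius)
        .cropped(to: image.extent)

    // Composite: blurred frame where the mask is white, original where it is black.
    return blurred.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: image,
        kCIInputMaskImageKey: backgroundMask
    ])
}
```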

In many cases the effect works rather nicely. Shooting a model on a scenic pier in New York City at golden hour, it looked great. Putting a subject against an otherwise-distracting background is also a good use case for this feature. It has some of the same benefits as a truly fast lens with a wide aperture that lets in lots of light.

Portrait mode works just fine with an equivalent aperture setting like f/4.5 to keep some detail in the background. Going to f/1.8 too often will introduce too much blur and you’ll get tired of the look. Stan Horaczek

But, again, there’s a disconnect. When you put something in front of your subject—a common framing technique used by portrait photographers—portrait mode won’t grab onto it and blur it to match the background. I totally get why, but it’s another adjustment for me when framing a portrait.

I find it particularly interesting that Apple chose to label its adjustable blur modes with an f-number. While it’s true that a smaller f-number typically translates into a shallower depth of field (and more blur), it also has other effects on an image, like darkened corners and shorter exposure times. So, while it’s fun to say you can “change your focus” after you shoot, that’s not really the case. You can add or subtract blur, but you can’t create a sharp image if you missed focus, no matter how far you push the slider.
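For the curious, the optics behind that f-number convention can be sketched with the standard depth-of-field formulas. The Swift snippet below assumes a full-frame 52mm lens (what the telephoto camera imitates), a subject two meters away, and a 0.03 mm circle of confusion, all illustrative values, to show how much thinner the zone of sharpness gets at f/1.8 than at f/4.5.

```swift
import Foundation

// Back-of-the-envelope depth-of-field calculation: smaller f-number,
// thinner zone of acceptable sharpness.
func depthOfField(focalLengthMM f: Double,
                  fNumber n: Double,
                  subjectDistanceMM s: Double,
                  circleOfConfusionMM c: Double = 0.03) -> Double {
    let hyperfocal = (f * f) / (n * c) + f
    let near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    let far  = s * (hyperfocal - f) / (hyperfocal - s)
    return far - near   // total depth of field in millimeters
}

let wideOpen    = depthOfField(focalLengthMM: 52, fNumber: 1.8, subjectDistanceMM: 2000)
let stoppedDown = depthOfField(focalLengthMM: 52, fNumber: 4.5, subjectDistanceMM: 2000)
// Roughly 160 mm of sharpness at f/1.8 versus roughly 390 mm at f/4.5.
print(String(format: "f/1.8: ~%.0f mm in focus, f/4.5: ~%.0f mm in focus", wideOpen, stoppedDown))
```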

It’s worth noting that you get the same portrait blur effects from the 7-megapixel front-facing camera as well, so your selfies will look classier than they did before.

When you start pushing the portrait mode to its limits, you start to see the faults. This is an admittedly tricky situation for it to figure out, since the flowers to the left are on the same focal plane as the subject and ended up sharp. It also had some trouble clipping her hair in a believable way. Stan Horaczek

So, does the iPhone XS have a good camera or what?

The short answer is yes. The new iPhone camera is great. It focuses and shoots quickly, which is great for street photography. The extra exposures and the redesigned lens seem to translate into extremely sharp photos. And with the Smart HDR, you do get more details in your images, even if that does sometimes sap the impact of an image that would otherwise benefit from some truly dark shadows.

This scene had some very hard direct light. The shadows are strong, but the scene isn’t as contrasty as I’d expect with a typical camera thanks to the Smart HDR. I would finish an image like this by darkening the shadows even more. Stan Horaczek

But there’s also a learning curve. Portrait mode is really fun, but if you’re going to use it a lot, you’ll find that focusing and shooting in that mode is decidedly slower than taking a typical shot. The system has to find the subject, then figure out how to apply depth to the scene before it snaps the photo. Also, now that we’re introducing all of this blur, simulated or otherwise, into an image, you’re likely to miss focus more often and end up with a blurry photo you can’t blame on camera shake.

And if you’re a camera purist, then you probably have some work to do getting used to this new era of computational photography. Google has been doing the same kind of work with its Pixel Visual Core, and other smartphone makers are adding even more camera modules so they can crunch more photographic data every time you push the button.

This hot dog stand was in very hard light, which is good for something with bright, blocky colors like this. You can see tons of detail, which looks great. Stan Horaczek

Last note

Whether you love or hate the look of the iPhone camera’s images, you’ll be glad to know that smartphone camera flashes are still hilarious. The iPhone XS Max does an OK job with its LED-based lighting solution, but flash photography is still well and truly a discipline for the dedicated camera photographers out there. Here are two pictures of a dumpster to remind you of that fact.

A picture of a dumpster with no flash. Stan Horaczek
A picture of a dumpster with flash on. Stan Horaczek