A photo shared by NFL player Prince Amukamara (below) shows that photography still has a long way to go when it comes to accurately portraying people with darker skin tones. The rise of computational photography means that, especially when it comes to smartphone snapshots, the technology to enable more accurate skin tone representation is there. 

It should also be said that photography has a racist and exploitative past (and present) in many other ways as well, from inappropriate portrayals of African children being used to win photo competitions to Black photographers being discriminated against professionally—it wasn’t until 2018 that an African-American photographer, Tyler Mitchell, shot a Vogue cover. For today, though, we’re just going to limit ourselves to one of photography’s racist facets: skin color reproduction.

What’s up with the photo

In the photo he shared, Amukamara is seen (or rather isn’t seen) standing between Aaron Rodgers and Brett Favre, the current Green Bay Packers quarterback and his predecessor. While it’s a stretch to say Rodgers and Favre are well exposed, they are at least visible.

In this situation, the smartphone has automatically metered off the highlights, rather than the shadows or the mid-tones. The lamp in the background is the best-exposed thing in the image, at the expense of Amukamara. 

Now, this is a challenging scene to capture accurately with any camera, let alone a smartphone. It’s a poorly lit room, flash is necessary, and the photographer’s finger is half-covering the lens. But that doesn’t make it okay. 

We can recognize that the bad image is a result of technical decisions made mostly by the smartphone (not the photographer), while still saying that it’s not good enough and that manufacturers need to do better. Rodgers and Favre could handle being a bit brighter if it meant Amukamara was visible. 

A problematic past

Unfortunately, this is not a new thing. Humans obviously have a wide range of skin tones. To get some idea of the variety, check out the fantastic Humanae project by Brazilian photographer Angélica Dass. For it, Dass has shot more than 4,000 portraits in an attempt to “document humanity’s true colors” as matched to the Pantone palette. 

But when it comes to accurate images, the photography industry has traditionally confined itself to the lighter end of the spectrum. Take Kodak’s “Shirley cards”.

These were the color calibration cards sent out to print labs so they could ensure their setup was printing “accurate” colors. At least, accurate as far as the model on the front was concerned. 

As the video above from Vox explains, things got better in the 70s and 80s as furniture and chocolate manufacturers complained to Kodak about how their products were being rendered. But, as Amukamara’s image shows, we’re still dealing with variations of the same problem today. 

What’s to be done about it?

If you’re a photographer and you’re taking photographs of people with darker skin tones, you owe it to your subjects to correct your camera’s inherent exposure bias. Seriously, your camera’s meter will—by default—tend to underexpose black, brown, and other darker shades of skin.

You can adjust for this in post, or better yet, avoid it in-camera by exposing to the right (ETTR): dial in positive exposure compensation so the histogram sits as far right as possible without clipping the highlights. You will get better images that way anyway. 

Things are a bit trickier with smartphone snapshots and other casual photos where you aren’t planning to do much editing to begin with. The photographer who took the photo of Amukamara, Rodgers, and Favre clearly intended for all three to be visible. Their mistake was trusting their smartphone to pull it off. 

In general, if you find yourself in a similar situation where the lighting is bad, shoot a few shots at your smartphone’s suggested metering and a few with the exposure bumped up a bit. One of them is likely to be more usable than the viral photo above. 
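That bracketing habit can be expressed as a tiny helper. The step size and frame count here are arbitrary assumptions for illustration, not a standard; the idea is simply to bias the bracket toward brighter frames, since the failure mode in question is underexposed darker skin.

```python
# Hypothetical bracketing helper (illustrative assumptions throughout):
# start at the camera's suggested metering (0 EV) and add a few
# progressively brighter frames.

def exposure_bracket(base_ev=0.0, step=0.7, extra_bright=2):
    """Return the EV offsets to shoot: the metered value plus
    `extra_bright` frames, each `step` stops brighter."""
    return [base_ev + i * step for i in range(extra_bright + 1)]

print(exposure_bracket())  # [0.0, 0.7, 1.4]
```

Shooting all three takes seconds, and one of the brighter frames will usually hold the subject’s face without blowing out the rest of the scene.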

Ask manufacturers to do better

As camera and smartphone manufacturers are automating more features, it’s their duty to make sure the whole human spectrum of skin tones is taken into account. Google is actually leading the charge here. 

Its Real Tone feature, introduced with the Pixel 6 phone, uses face detection, improved auto-white balance and auto-exposure, stray light reduction, and face sharpening to get better photos of people with darker skin tones. An ad highlighting it (ironically) ran at the Super Bowl this year. 

We want to see Apple add similar features. 

And for camera manufacturers like Canon, Sony, and Nikon? While photographers are always going to demand manual control of many aspects of the image creation process, there’s no reason we can’t demand that automatic modes, guided modes, and other similar features do a better job of capturing portraits—regardless of skin tone.