For decades now, Popular Photography has tested cameras and lenses so that you can make a more informed decision about the equipment you use to capture images. Whether you’re documenting your life or creating art, you need to know not only which camera or lens is best suited to your shooting style, but also how that camera or lens performs so you can use it more effectively.

Our tests are designed to measure not only the quality of the images the camera or lens produces, but also to report on its usability, design, and standing against comparable models. We perform our lab testing in the Popular Photography Test Lab, located in our New York City office. A temperature- and humidity-controlled environment, it ensures the same conditions for every camera or lens tested there. Our lens test bench rests on a set of pneumatic legs, so it floats on a cushion of air to ensure that any vibrations in the building won’t taint our test results.

We perform the lab-based portion of our testing in the same way (or in precisely equivalent ways) for all makes and models, so that each item is subjected to the same rigors under the same conditions. This way, our results can be compared from camera to camera and lens to lens. We trust that, armed with this information, people can better decide which model might be best for them, and gain some insight into the best way to use what they eventually choose.

Lighting
We illuminate our resolution, color-accuracy, and noise test targets with Dedolight DLH400D daylight-balanced HMI lights. These lights offer the best approximation of daylight we know of, and are powered by flicker-free ballasts. While many photographers are using daylight-balanced fluorescent lights for portrait and studio work, these don’t reproduce the spectrum of light as well as HMI lights do and are not a true continuous light source, since they flicker. And with the extremely short shutter speeds often required for photographic testing, that flicker would create too much fluctuation for us to accept them as a light source in our tests.

Common Test Settings
We shoot our resolution, color accuracy, and noise tests in RAW, with the camera on a tripod and set to the AdobeRGB color space. We use the camera manufacturer’s 50mm f/1.4 lens at f/8, with shutter speed varying based on ISO. We use single-shot autofocus and center-weighted metering, and we bracket to ensure that we obtain a proper exposure, checking against patches in the test targets to select the best frame.

We convert these images into uncompressed TIFFs for analysis using the RAW conversion software that comes with the camera. We apply only minimal changes from the manufacturer’s default settings in its software. Our reasoning here: Camera makers are selling their customers a package that includes software, and the default settings should be a statement of how they think the camera should perform. In the case of noise reduction, for example, we expect that the manufacturer will set the defaults at settings that make the best tradeoff between retention of fine detail and adequate reduction of noise for that model. Furthermore, any other attempt to adjust that setting would prevent us from having a standardized test procedure.

We should mention that in the case of Nikon, we use one of two different 50mm f/1.4 lenses. For those Nikon bodies that accept only lenses with built-in focusing motors, such as the D5100 and D3100, we use the AF-S Nikkor 50mm f/1.4G. If the DSLR under test can accommodate lenses without their own focusing motor, we use the older AF Nikkor 50mm f/1.4D.

Also of note here: Any DSLR body will behave differently with different lenses. Our AF test (and other tests) provides a way to limit that variable by always using the same lens for a given system. The only exception to this rule is when testing interchangeable-lens compacts (ILCs), since there are currently no 50mm f/1.4 lenses for these systems. Because the lens selection for these systems is still so limited, we have been using the wide-to-normal kit lenses for each. We also have not implemented rigorous AF testing of these systems yet.

Below, we detail each of the individual lab tests we perform here at Popular Photography. We also always use every test camera extensively in the field to see how it performs in everyday shooting conditions and to assess any functions—such as focus tracking of subjects moving toward or away from the camera—that are impractical to test in our lab.

Color Accuracy
To measure a camera’s color accuracy, we shoot the Macbeth Color Checker DC test target using that brand’s 50mm f/1.4 lens at f/8, and at every ISO the camera offers, in whole-stop increments. We do two passes, one with the camera’s automatic white-balance setting and one with a custom white balance set using a gray card. Typically, the automatic white balance setting garners the best results with our lights, but if that’s not the case we make a point to note it in the test.

After shooting the test target, we convert the resulting image into 16-bit and 8-bit TIFFs using the RAW conversion software provided by the manufacturer. We allow fine-tuning of the white balance with a dropper on an 18-percent gray patch of the target if the RAW software allows this, as most do. Other than that, we leave the manufacturer’s default RAW conversion settings intact.

Once the images are converted, we crop down to the center of the target area, within the alternating gray, white, and black patches, so that those repeating patches are not overly weighted in the measurement. We then use Chromix ColorThink software to compare the images of the target we shot against a reference scan of the test target generated on an Epson Expression 10000XL flatbed scanner, which has a color gamut well beyond that of the cameras we test. We update this scan several times a year to ensure that any subtle changes in the color patches of our chart are reflected in the reference we use for the test. As color test targets are exposed to light during the tests, it is natural for the patches in the target to change very slightly. Whenever the target is not in use, we keep it in a protective, light-tight sleeve to minimize its exposure to light.

The ColorThink software generates an average Delta E value for the images we shot, which is what we report in our test results. Delta E is a standard unit of measure for the difference between specific colors, governed by the International Commission on Illumination (CIE); a Delta E score of 1 represents the smallest color difference that is discernible by a trained observer when two color patches are placed side by side. Most people would require a Delta E of 2–3 before noticing the difference between two colors, and perhaps even more if the two patches are not side by side.
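For readers who want to see the arithmetic behind a Delta E value, here is a minimal sketch of the simplest formulation of the metric (CIE76), which is just the straight-line distance between two colors in Lab space. The Lab values are invented for illustration, and ColorThink may use a more recent variant of the formula, so treat this as a model of the concept rather than a reproduction of our analysis.

```python
# Minimal sketch of the CIE76 Delta E formula: the Euclidean distance between
# two colors in CIE Lab space. The Lab triples below are hypothetical.
import math

def delta_e_76(lab1, lab2):
    """Return the CIE76 color difference between two (L, a, b) triples."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

reference_patch = (52.0, -24.0, 18.0)   # hypothetical patch from the reference scan
captured_patch  = (53.1, -22.5, 19.2)   # hypothetical patch from the camera's image

print(round(delta_e_76(reference_patch, captured_patch), 2))  # ~2.21, a just-noticeable difference
```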

(Interestingly, most color films are not necessarily color-accurate. Manufacturers of film often accentuate certain colors over others, or boost saturation overall, in order to create a color palette considered more pleasing by their customers. So digital cameras offer an ability to capture colors in a much more accurate way than film ever did. Given the ease and flexibility of modifying digital images after the fact, the range of colors available to photographers has never been more extensive or customizable than it is today.)

In the near future, we will migrate to the new Macbeth Color Checker SG, which has more skin-tone patches and has taken the place of the Color Checker DC target in X-Rite’s line of Macbeth color charts.

Noise
When you turn up the sensitivity of the sensor in a digital camera, you get noise—either off-color splotches or white (or overly bright) dots. Manufacturers use a variety of means to minimize the effect of noise, though this process often results in a loss of resolving power, as the noise is either blurred away or otherwise masked.

Our noise test isn’t concerned with how the noise is generated or how the manufacturer attempts to minimize it. Rather, it measures how much the pixel values vary within the grayscale patches of our test target. We report this as the standard deviation measured in each of the various patches on our X-Rite Gretag Macbeth Color Checker target.

We shoot that target at all the camera’s ISOs, unless there is a sensitivity setting that reduces the pixel count. We typically do not test those settings, which are found primarily in cameras without interchangeable lenses. After shooting, we convert the RAW images into TIFF files using the software provided by the manufacturer, and with the manufacturer’s default noise-reduction settings. We then process the resulting image with the most recent version of DxO Analyzer software from DxO Labs. The software generates the average standard deviation, which we report.
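For the technically inclined, the sketch below shows the underlying measurement: the standard deviation of pixel values within a single gray patch. The file name and patch coordinates are hypothetical, and DxO Analyzer’s actual analysis is far more sophisticated; this is only meant to show what the reported number represents.

```python
# Rough sketch of a noise measurement: the standard deviation of luminance
# within one gray patch of a converted test image. File name and patch
# coordinates are hypothetical.
import numpy as np
from PIL import Image

# Load the converted TIFF as grayscale (a simplification; real analysis
# typically works on the full-bit-depth data and individual channels).
img = np.asarray(Image.open("iso1600_target.tif").convert("L"), dtype=float)

# Hypothetical bounding box (rows, columns) of a single gray patch.
patch = img[1200:1400, 800:1000]

noise = patch.std()   # standard deviation of pixel values within the patch
print(f"Patch noise (std dev): {noise:.2f}")
```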

In the future, we will migrate to DxO’s 15-patch noise target. This is a transmissive rather than reflective target, illuminated from behind by daylight-balanced fluorescent bulbs within a lightbox. The benefit of the new target is that the patches are glass, and thus have very smooth surfaces. DxO Labs, the maker of the DxO Analyzer software, has determined that in cameras with very-high-megapixel sensors, the texture of the surface of the paper used for a test target may show up as noise in the test. Since pixel counts continue to increase, we will be shifting to this target this year and will make note of it here when we do.

When we process the images for the noise test, we apply the default amount of noise reduction in a given manufacturer’s RAW conversion software. Not only does this allow us to set a standard for the way we process our test images, it also encourages manufacturers to provide a reasonable starting point for end users when processing their images. Since manufacturers should know their cameras best, especially when first introduced, we expect that they will apply the amount of noise reduction that provides the best tradeoff between noise reduction and resolution preservation for a given camera model at a given ISO. Some camera makers apply very little noise reduction, or they don’t vary the amount of reduction based on the ISO, in which case we say so in the test. This way our readers know to pay attention to how much noise reduction is applied to their images, and why a given camera’s noise numbers might look odd compared to what they normally see for its class of camera.

Resolution
People often associate resolution with the number of pixels found on a camera’s imaging sensor. While it is true that the potential for increased resolution does come with an increase in the pixel count, the resolving power of a camera system relies on more than just the sensor (or film, in the case of film cameras). That’s why we rely on actual images of resolution targets in our resolution test. As with some of our other tests, we shoot the resolution test with the camera brand’s 50mm f/1.4 lens at f/8, and at every ISO the camera offers, in whole-stop increments. If a camera’s sensitivity range starts or stops at a nonstandard ISO, we also test that ISO as well as every standard whole-stop ISO above or below it.

We list resolution only at the camera’s lowest ISO setting in the test results, mainly because space in the magazine is limited. Instead of listing resolution at every ISO, we typically mention the point at which resolution falls below a certain threshold, or we note how well the camera preserved resolving power as ISO increased. We do this to underscore how much resolving power varies with processing, and to provide more useful commentary to our readers.

We use Applied Image Inc.’s QA-77 target chart, an update of the ISO-12233 chart we had used prior to this one. The main difference between the two is that the QA-77 chart allows us to measure up to 4000 lines per picture height, while the older chart extended to only 2000 lines.

Autofocus Speed
A unique piece of equipment in the Popular Photography Test Lab is our custom-built AF-speed test apparatus. It determines how long it takes for an SLR to focus and capture an image. We set up the tripod-mounted camera, with 50mm f/1.4 lens attached, facing a liquid-crystal shutter that is opaque in its natural state but becomes clear when electricity is applied. On the other side of the liquid-crystal shutter is a very simple target: a bright white board with its left half covered in black velvet to create a vertical border of high contrast. A switch turns on the electrical current to the liquid crystal shutter while simultaneously triggering a timer embedded in the upper corner of the focusing target.

When the camera has locked focus and captured a picture, the time it took to do so (to the nearest hundredth of a second) is shown on the timer display, visible in the image.

We test the camera’s focusing speed at various light levels to show how it slows as the light dims. Specifically, we test at EV 12, 10, 8, 6, 4, 2, 1, 0, –1, and –2. At every light level at which a camera can reliably and consistently focus, we calculate the average focusing time and plot it on a graph in the test results. Sometimes a camera will become less consistent in dim light—if it does, we mention it in the text. We perform the AF speed test at least 20 times for each light level, though often many more times than that. Once a camera fails to focus at a particular light level, we end the test and tabulate the results.
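To make that bookkeeping concrete, the sketch below averages a set of hypothetical focusing times by light level and flags any level at which the camera failed to focus. It is not the software we use, and the numbers are invented purely for illustration.

```python
# Illustration only: group hypothetical focus times (in seconds) by EV level
# and average the levels where the camera locked focus on every attempt.
from statistics import mean

trials = {
    12: [0.21, 0.22, 0.20],    # EV level -> focus times; None marks a failure to focus
    2:  [0.48, 0.51, 0.47],
    -2: [1.90, None, 2.10],
}

for ev, times in sorted(trials.items(), reverse=True):
    if None in times:          # camera failed to focus at this level
        print(f"EV {ev}: failed to focus reliably")
        continue
    print(f"EV {ev}: average {mean(times):.2f} s over {len(times)} trials")
```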

The AF test is one of only two sets of tests performed in the Popular Photography Test Lab that don’t use the HMI lights described earlier. (The other set is performed on our lens bench, which has its own light source.) The AF test uses an array of six 1000-watt Altman tungsten lights controlled by a sliding dimmer from Lighting Methods Inc. and powered by a Theater Technologies Inc. power supply.

The test is performed using only the center AF point, which is typically a high-sensitivity cross-type sensor or, in some cases, a dual cross-type point (essentially two cross points overlaid and working in tandem). This presents a best-case scenario for a given camera and should be interpreted with that in mind. The great advantage of this method is that we are able to compare any DSLR to another, as they are all on a level playing field.

Ultimately, the AF testing is a way to track a manufacturer’s products over time. Because we use the same lens (whenever possible) on all the different bodies from a given manufacturer, readers can make informed decisions about which body might be best for them within a given line. Plus, it serves another purpose of product testing in general—to keep manufacturers on a path of innovation and improvement, with the hope that photographers will ultimately have better tools at their disposal. With a general sense of how a given body will focus, you can also make loose comparisons between brands. While it doesn’t represent an absolutely perfect way to compare one DSLR’s focusing capabilities to another’s, we think it’s the best way possible given the number of variables that go into AF.

Viewfinder Tests
Whenever a camera being tested has an optical viewfinder, we test that finder’s accuracy and magnification. The accuracy test compares what is framed in the finder with the actual resulting image recorded by the camera. Most finders do not show you everything that will be captured, though some offer 100% accurate framing.

To perform our accuracy test, we line up the framing guidelines of our resolution target precisely with the edges of the camera’s finder and capture an image. We then use the ruler tool in Photoshop to compare the size of the framed area to the size of the whole captured frame, and calculate a percentage.

To measure magnification, we shoot an image of the resolution target through the finder with a 50mm f/1.4 lens on the camera being tested, using another camera also outfitted with a 50mm f/1.4 lens. We then remove the camera being tested and shoot another image of the resolution target from the exact same position that we shot the image through the finder. Then we use the ruler tool in Photoshop to compare the size of the image shot through the finder with that of the other image to get the magnification factor.
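Both viewfinder figures come down to simple ratios of the ruler-tool measurements. The pixel values in the sketch below are hypothetical and are only meant to show the arithmetic, not our exact measurement convention.

```python
# Hypothetical ruler-tool measurements, in pixels, to show the arithmetic.

# Accuracy: how much of the captured frame the finder's framing covered.
framed_height = 3800     # image height matching what was framed in the finder
full_height = 4000       # height of the whole captured image
print(f"Viewfinder accuracy: {framed_height / full_height * 100:.0f}%")   # 95%

# Magnification: target size photographed through the finder versus photographed directly.
height_through_finder = 1470   # measured in the shot taken through the test camera's finder
height_direct = 2100           # measured in the shot taken from the same position without the camera
print(f"Finder magnification: {height_through_finder / height_direct:.2f}x")  # 0.70x
```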

Field Testing
Our testing doesn’t end with the lab. We also take every camera out into the field and shoot with it to see how the meter responds to various scenarios, how well the continuous AF functions, whether the claimed number of images per burst holds true, how the camera’s ergonomics feel in the hand, how easily you can access necessary functions in the menu system in a real-world scenario, and much more. The only way to fully understand a camera is to use it extensively, and that is what we do.