DxOMark Mobile rating

How DxOMark Mobile scores smartphone cameras

DxOMark’s Mobile scores and rankings are based on testing every camera using an identical, extensive process that includes shooting nearly 1,500 images and dozens of clips totaling more than two hours of video. The tests take place in both lab and real-world situations, using a wide variety of subjects. Our scores rely on purely objective tests for which the results are calculated directly by the test equipment, and on perceptual tests for which we use a sophisticated set of metrics to allow a panel of image experts to compare various aspects of image quality that require human judgment. Testing a single mobile device involves a team of several people for a week. Once we have tested the device, we score the Photo and Video quality separately, and then combine them into an Overall score for quick comparison among the cameras housed in different devices.

We test every device in exactly the same way, in identically configured lab setups, using the same test procedures, the same scenes, the same types of image crop rankings, and the same software. This means DxOMark results are both reliable and repeatable.

Outdoor test scenes take place in a variety of locations throughout metropolitan and suburban Paris.

DxOMark smartphone camera reviews, which include scores, sample images, and analyses, are fundamentally different from most camera reviews. Instead of being driven by a reviewer’s personal experience with and feedback about a camera, they are driven by the scores and analyses that come out of our extensive tests. We get a lot of questions about our scores, as well as about our testing and review process, so we’re sharing with you what goes into a DxOMark Mobile score, as well as the process we use for testing and for writing up the subsequent review.

DxOMark Overall score

The most frequently cited score for a mobile device camera is the Overall score. It is created by mapping the dozen or so sub-scores into a single number that gives a sense of the device’s total image quality performance. An overall score is important, since we need some way to rank results and to provide a simple answer for those not wishing to investigate further.

This graphic illustrates the overall process of creating a DxOMark Mobile score. The Overall score is created from a combination of the Photo and Video scores. Each of those scores is in turn a function—illustrated as f(x)—of a set of attribute sub-scores. As an example of what goes into each of these sub-scores, we have broken out the Color sub-score for Photo. You can see how it is based on tests performed under a variety of lighting conditions, including 20, 300, and 1000 Lux light levels. For each light level, the tests include both objective and perceptual scores that measure a variety of attributes. Similarly, we’ve broken out the Exposure sub-score as an example of how the Video sub-scores are also created from tests that we conduct under a variety of lighting conditions. These sub-scores likewise utilize both objective and perceptual tests that measure a variety of attributes, and incorporate both static and temporal (changing) scenes.

We weight the various results in a way that most closely matches their importance in real-world applications as judged by mainstream users — people typically interested in capturing family memories or sports — and by those who care about image quality above all else (whom we refer to as photo enthusiasts). We work with many different types of smartphone photographers to ensure that both the DxOMark Mobile Overall score and its sub-scores reflect what is important to them in their photography. Following from this, we test results — finished images — and not technologies. So, for example, autofocus and bokeh performance are tested independent of the method a particular camera uses to accomplish them.

This means that a device with consistently good scores across all attributes may receive a higher rating than one with a few excellent strengths and some noticeable weaknesses. However, every user is different, and each has differing priorities. For that reason, we encourage anyone reading our reviews to dig in past the Overall score into the sub-scores and written analysis.

We also get asked how a device’s Overall score can be higher than its sub-scores. The Overall score is not a weighted sum of the sub-scores. It is a proprietary and confidential mapping of sub-scores into a combined score. The Overall score is also not capped at 100. That just happens to be where some of the best devices are currently.
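DxOMark’s actual mapping is proprietary and confidential, so the numbers below are purely a toy illustration. The sketch only shows how a monotone, non-averaging mapping can produce an Overall score that exceeds every individual sub-score and is not capped at 100:

```python
# Toy illustration only -- NOT DxOMark's confidential formula.
# Shows how a non-averaging mapping can exceed every sub-score.

def overall_score(sub_scores):
    """Map sub-scores to an uncapped Overall score.

    Hypothetical mapping: reward consistently strong sub-scores by
    stretching the upper end of the scale rather than averaging.
    """
    mean = sum(sub_scores) / len(sub_scores)
    # A gentle nonlinear stretch: consistently high sub-scores can
    # map above 100, so the Overall score is not capped there.
    return round(mean * 1.15)

photo_subs = [90, 88, 92, 85]
print(overall_score(photo_subs))  # 102 -- above every sub-score
```

Any monotone mapping of this kind preserves rankings while freeing the combined score from the range of its inputs, which is all the Overall score needs to do.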

Photo and Video sub-scores

To help evaluate how well a smartphone camera will perform in specific use cases, we provide sub-scores for both Photo and Video, covering Exposure and Contrast, Color, Autofocus, Texture, Noise, Artifacts, Flash, and Stabilization (for Video). In 2017, we added Photo sub-scores for Zoom and Bokeh to reflect the advanced capabilities of many current mobile devices.

Which sub-scores are most important to you will depend on the types of photography you do with your mobile device. We compute sub-scores from the detailed results of tests that we perform under a variety of lighting conditions, and which include both scientifically-designed lab scenes, and carefully-planned indoor and outdoor scenes.

Sub-score categories

For each of our sub-score categories, we’ll explain what we are testing, something about the tools and techniques we use to test it, and some interesting details relevant to the testing process. In many cases, you can learn more about some of the hardware and software we use by looking at our DxO Analyzer product website.

Exposure and Contrast

Exposure measures how well the camera properly adjusts to and captures the brightness of the subject and the background. Dynamic range is the ability of the camera to capture detail from the brightest to the darkest portions of a scene.
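Dynamic range is commonly expressed in EV (stops), where each stop represents a doubling of luminance. A minimal sketch, with illustrative luminance values:

```python
import math

def dynamic_range_ev(brightest, darkest):
    """Dynamic range in EV (stops) between the brightest and darkest
    scene luminances captured with usable detail; one stop = 2x light."""
    return math.log2(brightest / darkest)

# A sunlit scene spanning roughly 10,000:2.5 in luminance covers ~12 stops
print(round(dynamic_range_ev(10000, 2.5), 1))  # 12.0
```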

Conditions & challenges: This sub-score matters most to those who shoot in difficult lighting conditions (such as backlit scenes), or in conditions with a large variation between light and dark areas (for example, sunlight and shade). Overall, it is the most important sub-score, as it affects all types of photography.

One of several outdoor scenes we use for testing dynamic range. DxOMark’s test charts include both industry-standard and proprietary targets. Charts for measuring dynamic range include this proprietary HDR target that combines both reflective and emissive sections.

Details: We assess and measure target exposure and contrast under varying light levels and types of lighting. We also calculate the maximum contrast values for each image. We conduct similar tests using carefully-designed indoor and outdoor scenes.


Color

The Color sub-score is a measure of how accurately the camera reproduces color under a variety of lighting conditions, as well as how pleasing its color rendering is to viewers.

Conditions & challenges: As with exposure, good color is important to nearly everyone. In particular, landscape and travel photographs depend on pleasing renderings of scenes and scenery. Pictures of people benefit greatly from good skin tones, too.

A technician prepares to take test shots with a smartphone of a carefully-calibrated DxOMark custom lab scene.

We measure color rendering by shooting a combination of industry-standard color charts, carefully-calibrated custom lab scenes, and outdoor scenes under a variety of conditions. A closed-loop system controls the lighting to ensure accurate light levels and color temperatures.

Details: As with Exposure and Contrast, we test Color using a variety of light levels and types of light, but this time with the purpose of measuring how well and how repeatably the camera calculates white balance, and then how accurately it renders colors in the scene. We also measure uniformity of color. We do not penalize cameras for slight differences in color rendering that are deliberately introduced to create a particular look desired by the manufacturer. For example, some cameras maintain slightly warm colors to convey the atmosphere created by low tungsten light.


Autofocus

The Autofocus sub-score measures how quickly and accurately the camera can focus on a subject in varying lighting conditions.

Conditions & challenges:  Anyone who photographs action, whether it is children playing or a sporting event, knows that it can be difficult to get the subject in focus in time to capture the image you want. We measure a camera’s autofocus accuracy and speed performance in a variety of lighting conditions. The importance of this sub-score is directly related to how much activity there is in the photographs you tend to take.

DxOMark’s comprehensive autofocus setup is designed to accurately test the speed, accuracy, and repeatability of a camera’s AF system. The proprietary LED timing system on the right is essential to measuring both the delay before the camera focuses and the actual shutter speed used.

Details: When measuring autofocus performance, the camera needs to be defocused before each shot, and then the exact interval between when the shutter is pressed and when the image is captured has to be recorded accurately, along with the exact shutter time. In addition to our custom LED timer, we have created a system that uses an artificial shutter trigger and multiple beams of light to ensure the accuracy of our autofocus measurements.
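To illustrate the principle (a simplified sketch; the actual DxO Analyzer hardware and its readouts differ), suppose the LED timer encodes elapsed milliseconds, and the captured photo shows which LEDs were lit during the exposure:

```python
def shutter_lag_ms(trigger_time_ms, first_lit_led_ms):
    """Delay between the shutter press and the start of the exposure,
    read from the earliest timer LED that appears lit in the photo."""
    return first_lit_led_ms - trigger_time_ms

def exposure_time_ms(first_lit_led_ms, last_lit_led_ms):
    """With a fast-cycling LED strip, the span of LEDs that appear lit
    in the photo reveals how long the shutter stayed open."""
    return last_lit_led_ms - first_lit_led_ms

# Hypothetical readout: trigger fired at t=0 ms; LEDs for
# t=350..370 ms appear lit in the captured frame.
print(shutter_lag_ms(0, 350))      # 350 ms focus-and-capture delay
print(exposure_time_ms(350, 370))  # 20 ms actual shutter time
```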

Texture and Noise

The Texture sub-score measures how well the camera can preserve small details, such as those found on object surfaces. This has become particularly important as camera vendors have introduced noise reduction techniques (for example, longer shutter times and post-processing) that can have the side effect of decreasing detail because of motion blur and softening.

Conditions & challenges: For many types of photography, especially casual shooting, preservation of tiny details is not important. But those who expect to make large prints from their photographs, or who are documenting art work, will appreciate a good Texture sub-score. Expansive outdoor scenes are also best captured when details are accurately preserved.

The Noise sub-score measures how much noise is present in an image. Noise can come from the light in the scene itself, or from the camera’s sensor and electronics.

Conditions & challenges: In low light, the amount of noise in an image increases rapidly. If you shoot a lot in the evening, or indoors, then finding a phone camera with a good score — meaning that it keeps noise to a minimum — is very important. If you shoot primarily outside in good light, it won’t matter as much.

Details: Texture and noise are two sides of the same coin. Image processing to remove noise also tends to decrease detail and smooth out texture in the image. That is one reason we test using a wide range of lighting conditions, from 10,000 Lux all the way down to 1 Lux. We also use custom-designed scenes that include moving objects in addition to industry-standard static charts, so that we can take note of any side effects of excessively-long shutter times or excessive image processing. DxOMark’s texture charts include those that conform to various industry standards, including CPIQ (Camera Phone Image Quality standard). We carefully print and profile all charts before putting them to use.


Artifacts

The Artifacts sub-score measures how much distortion and how many other flaws a camera’s lens and digital processing introduce into images.

Conditions & challenges: While noise is primarily related to a camera’s sensor, artifacts are distortions created by its lens and image processing. These can range from straight lines looking curved to strange multi-colored areas. In addition, lenses tend to be sharpest at the center and less sharp at the edges, which is also measured as part of this sub-score. A high Artifacts score (based on awarding points for keeping artifacts to a minimum) is most important for those who care about the overall artistic quality of their images.

We use both lab and natural scenes that include motion to look for artifacts caused by image processing algorithms that fuse multiple frames or use other techniques that can create image quality problems.

Details: Motion in test scenes is essential when evaluating artifacts, as many modern smartphone cameras fuse multiple images or images from multiple cameras to create a final result. If not executed correctly, these techniques can result in visible and distracting artifacts in complex moving scenes.

Flash (Photo only)

The Flash sub-score measures the effectiveness of the built-in flash (if any) in accurately illuminating the subject.

Conditions & challenges: Smartphone flashes are only useful at very short distances, and even at short distances, they may not be able to fully illuminate a subject. They can also cause unpleasant color shifts in the scene, since the color of the flash LED may not be the same as the other light in the scene. This sub-score is most important to those who capture a lot of close-up indoor photos, such as those of gatherings of family and friends.

Details: Modern smartphone flashes have become very sophisticated, sometimes using multi-color LEDs to attempt to match the color of the flash to the color of the ambient light. Our tests measure how well the flash performs both as a standalone light source, and when it is used as a flash fill along with natural or other light sources. Our flash performance tests take into consideration how well colors are preserved, how accurate the white balance is, and the characteristics of light falloff towards the edges of the image.

Zoom (Photo only)

Until recently, the only zoom capability in smartphone cameras was either a digital zoom that simply upscaled the image, or a crop if the original image came from a very high-resolution sensor. Now, however, an increasing number of mobile devices feature multiple cameras with different focal lengths. This gives them the ability to perform some types of optical or blended zooming, which means improved image quality over simple digital resizing.

Our new DxOMark Mobile protocol now includes tests to evaluate image quality when zoomed in at magnifications ranging from about 2x to 10x.

Conditions & challenges: Zoom is very helpful for achieving a natural perspective in portrait photos, as well as for capturing events at a distance (including most sports). Similarly, travel photography often benefits from zooming in on distant landmarks.

Different implementations of zoom (telephoto) vary in effectiveness based on the specific characteristics of the scene. Our tests include a variety of outdoor scenes as well as lab scenes.

Details: Since there are a variety of ways a camera can implement zoom (for example, using a telephoto lens, blending the images from two cameras, cropping a high-resolution sensor, or simple scaling), we test using a number of different focal lengths so as to highlight the strengths and weaknesses of each implementation. A telephoto lens is often not as bright as the main lens, so in low light some cameras switch to the main lens even at their zoom setting; for that reason, we include not just outdoor and well-lit indoor scenes, but low-light tests as well.

Bokeh (Photo only)

Multiple cameras also permit depth estimation in mobile devices. This allows some of them to feature a “depth effect” or portrait mode that simulates the optical background blurring (bokeh) of traditional standalone cameras. (You can find a detailed description on how we test computational bokeh in this article.)

Our new DxOMark Mobile protocol evaluates bokeh by judging how accurately blurring is applied, as well as the quality and smoothness of the blurring effect. We use the bokeh of a wide-open, professional-quality prime lens on a DSLR as a reference standard — for example, an 85mm f/1.8 lens on a full-frame DSLR (which is in turn similar to a 50mm f/1.8 lens on an APS-C format sensor).
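The “similar” lenses above are a matter of field of view: multiplying a focal length by the sensor’s crop factor gives the full-frame focal length that frames the same scene. A minimal sketch (the 1.6x crop factor is an illustrative APS-C value):

```python
def full_frame_equivalent(focal_mm, crop_factor):
    """Focal length giving the same field of view on a full-frame
    sensor as focal_mm does on a sensor with the given crop factor."""
    return focal_mm * crop_factor

# 50mm on a typical APS-C sensor (crop factor ~1.6) frames roughly
# like an 80mm lens on full frame, close to the 85mm reference above.
print(full_frame_equivalent(50, 1.6))  # 80.0
```

Note that this equivalence covers framing only; depth of field at the same f-number still differs between sensor sizes.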

Conditions & challenges: Bokeh is an important element of many artistic compositions favored by photo enthusiasts. It can also reinforce the impact of portrait photographs.

As complex as this test scene for bokeh looks, it is only one of many configurations we use with each camera. Before our technicians are finished, they will replace the sharp-edged crown with a wig, then with a person captured in various poses. The changing configurations test the bokeh algorithm’s ability to distinguish foreground from background, and to estimate how distant objects are and how much they should be blurred.

Details: Testing for bokeh required entirely new test scenes and tests. We need to measure not only how well cameras blur the background, but also how well they deal with a variety of tricky situations, such as when the subject is holding another object, which can make it hard for the camera to separate the subject from the background. Some cameras use face detection to blur everything in the image that isn’t a face, but that approach is quite error-prone, and often leads to an aesthetically displeasing result.

Stabilization (Video only)

The Stabilization sub-score measures how well the camera eliminates motions that occur while capturing video.

Conditions & challenges: Unless you mount your phone on a steady tripod when shooting video, it will wobble a bit – no matter how carefully you hold it. In addition, you may even be shooting from a moving vehicle such as a bus or boat. For these and other reasons, handheld video often appears shaky. To minimize that effect, many smartphones offer either electronic or optical image stabilization when capturing video. A higher score here means steadier and more pleasing videos.

Details: Cameras may include optical stabilization, electronic stabilization, or both. Each type has pros and cons depending on the nature of the camera motion. Our protocol enables us to imitate different kinds of motion, including the camera shake that can occur during handheld shots, thus enabling us to test how successfully the camera’s stabilization system can compensate.

If you’re interested in more information about our DxO Analyzer testing solution, click here.

We’ve listed some of the most important use cases in the sections above for each sub-score, but for easy reference, here are the sub-scores that are most important for various types of smartphone photographers.

Travel and vacation photographers

Everyone likes to chronicle their adventures. Correct exposure is particularly important here, given the challenging high-contrast scenes typical of most outdoor locations. Color accuracy is also critical for creating pleasing images of landscapes and famous landmarks. Zoom matters for capturing landmarks or distant objects, as well as for capturing portraits of your traveling companions or locals.

Family memory-makers

In addition to vacation photos, family memories often require capturing events indoors, under poor or uneven lighting. Here, flash becomes critical, as well as low noise. Autofocus is key to capturing emotional moments, along with stabilization for on-the-fly videos of important events. Bokeh scores will help you determine how effectively your portraits will “pop.”

Recording sports

Action photography requires excellent autofocus, in addition to good exposure, for both still and video capture. Stabilization is also very helpful for video. Zoom is essential for most sports, as the action is often too far away to be captured with a mobile device’s default wide-angle lens.

Photo enthusiasts

While most photos never make it much past social media, photo enthusiasts often want to make larger versions of their photos as screensavers or prints. For them, in addition to the requirements for the type of image they are capturing, it’s important to have a camera with low noise and good texture preservation, as well as good contrast, and few artifacts. Excellent bokeh is also a key element for creating portraits or other artistic shots that highlight the subject.

These outdoor scenes are used to test repeatability and other criteria.

Each camera captures over 30 outdoor scenes, each several times, to measure repeatability. Each scene contains attributes that help us evaluate the perceived quality of the images for nearly every type of smartphone photography. We evaluate images using our unique perceptual scoring, which is based on our implementation of an image quality ruler methodology. Image quality rulers rely on a set of known “anchor images” against which a test image is compared. The process works best when trained experts perform the analysis and when the anchor images contain content similar to the tested image. DxOMark’s team of image experts and large library of existing test images from many devices make it possible to get consistent and accurate results using this process.
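As a rough illustration of the quality-ruler idea (a deliberately simplified model: real evaluations are side-by-side comparisons by trained experts, and the anchor scale and values here are invented), a test image inherits the score of the calibrated anchor it most closely matches:

```python
def ruler_score(test_image_quality, anchors):
    """Assign a test image the score of the nearest calibrated anchor.

    anchors: list of (score, quality_estimate) pairs for the ruler's
    known anchor images; test_image_quality is an expert's estimate of
    the test image on the same perceptual scale.
    """
    nearest_score, _ = min(anchors, key=lambda a: abs(a[1] - test_image_quality))
    return nearest_score

# Hypothetical five-anchor ruler on an arbitrary 0-1 perceptual scale
ruler = [(60, 0.35), (70, 0.50), (80, 0.65), (90, 0.80), (100, 0.95)]
print(ruler_score(0.62, ruler))  # 80
```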

Additional real-life scenes are designed to test performance in low light and under artificial lighting.

An additional 15 indoor scenes are designed to be particularly helpful in analyzing the suitability of the smartphone camera for family memory-makers and others working in artificial and low-light conditions. As with the outdoor scenes, we evaluate these indoor scenes using our perceptual methodology that employs image quality rulers.

DxOMark Mobile reviews

Our reviews are designed to provide you with both the scores and sub-scores for a device, along with analyses that may be useful in helping you compare multiple devices. We also often include comparison images that show how different cameras perform in the same situation, although in many cases those images are simply for illustration, and were not part of the actual tests. Similarly, we often include images to illustrate a particular point, or for you to evaluate, that may not have been used in testing. Since launching the new version of DxOMark Mobile, however — and this is important — we have been able to publish some of the full-resolution original images we used in testing so that you can make your own comparisons.

What camera modes do you use for testing?

We do all testing using the default mode of the smartphone and using its default camera application. The exceptions are the manual activation of flash, zoom, and portrait (depth or bokeh) mode when evaluating those capabilities. Aside from the difficulty in testing an arbitrary number of camera apps and modes, this approach reflects the experience of the vast majority of smartphone photographers, who use their devices in the default setting, and also of the phone vendor, who presumably makes an effort to have the default setting present the camera at its best.  

Do you review pre-production cameras?

All our published mobile device reviews are written based on tests performed using commercially-available devices with current production firmware, the same as a retail buyer would receive.

Why are some devices tested more quickly than others?

First, we need to be able to acquire a production unit and have time to test it. We also have to prioritize our efforts, given the large amount of time and resources each test and review requires, so devices with a large market impact tend to be tested more quickly after launch.

How much time does it take to test and review a mobile device camera?

The test process involves a team of engineers for about ten days, capturing photos and video both in the lab and outdoors, analyzing data, and providing perceptual evaluations. Then our writers need additional time to create a review from the test results and to publish it along with sample and comparison images.

What are you going to do now that devices are already getting scores of 100?

Our scores are not capped at 100. We’ve already had a couple of devices score over 100 in certain categories.

How come the Overall score is sometimes higher than the sub-scores?

The Overall score is not a weighted average of sub-scores, it is a mapping based on the sub-scores.

Why does some of the sample image metadata say that the image was created on a computer?

Smartphone vendors have started experimenting with additional options for rendering richer color and wider tonal ranges. In particular, some are using the new HEIF image format instead of JPEG, and the DCI-P3 colorspace instead of sRGB. In the case of HEIF, we need to convert the images to JPEG for you to see them in your browser. We do not do that with the images used in testing, so this conversion only affects the copies of the images we upload for your reference. Similarly, we need to convert DCI-P3 images to sRGB for them to look good on many consumer monitors, but this doesn’t impact the tests nor the test results, only the image versions we provide with the reviews.

Why should zoom and bokeh affect the Overall score?

As smartphone buyers come to expect additional advanced performance capabilities, it makes sense to include them in the DxOMark scores and rankings. Advanced zoom capability and portrait mode are two of the most popular new features that have been added to smartphones in recent years. If they are not important to you, then of course don’t worry about those sub-scores and instead rely on our other sub-scores for each device.


DxOMark Mobile test protocol and scores

The DxOMark Mobile test protocol takes a different approach from our sensor and lens test protocols. Rather than looking at a discrete component (sensor or lens), the Mobile protocol assesses the performance of the imaging pipeline in its entirety, including lens, sensor, camera control, and image processing. This means that still image quality evaluation in the Mobile protocol is based on the camera’s JPG output rather than on RAW image data, as for sensors and lenses. The protocol also covers the test device’s performance in video mode, and scoring is calculated both from objective measurements and from perceptual evaluation of more than 50 challenging and realistic indoor and outdoor scenes by a panel of imaging experts.

Testing for most image quality criteria is undertaken with the smartphone camera in its default (auto) mode. We intervene manually only to activate the bokeh (portrait) mode, the flash, and the zoom feature.

The new protocol includes the evaluation of camera performance in a wide range of real-life situations.

We designed a dedicated studio setup for the new bokeh test.

DxOMark Mobile: What’s new for 2017?

With the mobile sector being a major driver of innovation in imaging, and smartphone camera technology rapidly evolving since the original DxOMark Mobile test protocol launched in 2012, the time has come for an update. The new version of DxOMark Mobile takes into account the latest developments in mobile imaging technology and adds the following components:

  • The new zoom test analyzes image output at three subject distances and in several test situations (outdoors, indoors, and in the lab). Key criteria for judging the results are resolution and image detail, noise, color reproduction, and image artifacts.
  • A new bokeh test analyzes the camera’s ability to create a pleasing background blur while keeping the image’s main subject in focus. The test evaluates the strength, smoothness, and transition of the blur effect. It also looks at distortion on portrait subjects, artifacts, and the image processing’s depth estimation capabilities.
  • When testing the autofocus and sharpness, the updated test protocol now takes into account motion in the scene and the speed of image capture. Sharpness on moving subjects with a moving photographer is measured in a simulation setup in the lab. A new autofocus test evaluates not only accuracy but also speed and texture, and we now undertake noise tests shooting handheld as well as shooting with the device mounted on a tripod.
  • For the updated low-light analysis, we take still image samples at very low light levels between 1 and 20 Lux. For our video quality analysis, we record all low-light footage at between 5 and 20 Lux. Overall light levels for most photo tests now range from 1 to 1000 Lux, and color temperatures from 2300 to 6504 Kelvin.
  • We have expanded our DxOMark Mobile video analysis to include objective measurements in addition to perceptual evaluation for most test criteria. For many tests — for example, white balance, exposure, and noise — we now also measure how quickly and smoothly the camera adapts to changing light conditions during recording.

Perceptual analysis is undertaken by a panel of experts using image quality rulers.

A new studio test measures the camera’s speed of capture — that is, the time between pressing the shutter and image capture.

DxOMark Mobile Scores

The DxOMark Mobile Photo and Video sub-scores and the overall score are generated from a combination of objective lab measurements and perceptual analyses. The following test criteria feed into the Photo sub-score:

  • Exposure and contrast, including dynamic range, exposure repeatability, and contrast
  • Color, including saturation and hue, white balance, white balance repeatability, and color shading
  • Texture and noise
  • Autofocus, including AF speed and repeatability
  • Artifacts, including softness in the frame, distortion, vignetting, chromatic aberrations, ringing, flare, ghosting, aliasing, moiré patterns, and more
  • Flash
  • Zoom at several subject distances
  • Bokeh

The Video sub-score is calculated from the following criteria sub-scores:

  • Exposure
  • Color
  • Texture and noise
  • Autofocus
  • Artifacts
  • Stabilization

Scores are presented in the format shown below.

For a better idea of how the updated protocol impacts device scoring in practice, please have a look at our 2012 vs. 2017 score comparisons for a range of devices (including the Google Pixel, Nokia 808, and iPhone 7 Plus) that we tested using both test protocols.


2017, the year of the new mobile protocol

Not only has 2017 been an exciting year for smartphone cameras, with all the big-name manufacturers launching new products, embracing innovative technologies, and delivering a better-than-ever mobile imaging experience to consumers; it has also been an extremely busy time for DxOMark, especially during the second half of the year. We launched a new website design and an updated version of our DxOMark Mobile test protocol that takes into account improved device performance and new technologies that weren’t around when we first started testing smartphone cameras in 2012. Among other new criteria, the new protocol incorporates zoom performance, bokeh quality, capture of moving scenes, and image quality in very low light conditions.

When we announced the new protocol in September, we also published the results of our retests of several iconic smartphone cameras that had performed very well in our tests using the original protocol — for example, the Apple iPhone 7, the Samsung Galaxy S6 Edge, and the Nokia 808 PureView. In this article, we want to give you an overview of all the smartphone camera reviews based on the new DxOMark Mobile test protocol that we have published since then.

Overall results

In the graph below you can see all smartphones that we have tested with the new DxOMark Mobile protocol. The ones marked in yellow are the older models that we’d previously tested using the original DxOMark Mobile, and which we retested for the launch of the update. As you can see, with a score of 98 points, Google’s Pixel 2 follows in the footsteps of its predecessor, which was last year’s top smartphone. Its overall best performance puts it at the top of our ranking this year. Apple’s new anniversary model iPhone X and the Huawei Mate 10 Pro with its Leica camera follow very closely behind the Pixel 2.

Other high-end models, such as the Samsung Galaxy Note 8 and the iPhone 8 Plus, put in very good performances as well. The Xiaomi Mi Note 3, a mid-range device, came impressively close to considerably more expensive flagship models. Unsurprisingly, at the other end of the rankings we find entry-level smartphones that cost only a fraction of the latest Apple, Google, and Samsung flagships. That said, the Indian Lava Z25 and the Chinese Gionee S10L performed remarkably well, considering their price points.

Some of the older retested models were capable of holding their own as well. That’s not much of a surprise in the case of the HTC U11, which is still the Taiwanese manufacturer’s flagship and which occupied a top position in our old ranking just before the switch to the new protocol. It’s quite impressive, however, that the Nokia 808 PureView, a device launched in 2012, can still compete with some current budget phones. Without a dual-camera setup or any computational imaging features, though, it falls well short of current high-end models, despite having the largest image sensor of any camera in this comparison.

Photo performance

Looking at still image performance alone, the situation is very similar to the Overall score. However, some positions have swapped, and the podium is entirely occupied by dual-camera phones, which can produce better zoom quality and a more natural bokeh effect than the single-lens competition. That said, the scores of the leading devices are extremely close.

Exposure, color, and dynamic range

All of the high-end devices we tested with our updated protocol deliver good exposure, color, and contrast in pretty much all light conditions. Auto HDR modes are activated by default on all flagship models, allowing for the capture of good shadow and highlight detail in difficult high-contrast scenes. In the sample scene below, you can see how the Google Pixel 2 and the iPhone X are capable of recording a very wide dynamic range. The Samsung Galaxy Note 8 also does a good job, just avoiding highlight clipping in the brighter parts, but it captures noticeably less detail in the shadows and shows stronger contrast than the other two cameras.

At the other end of the budget scale, entry-level devices often don’t have the processing power to make Auto HDR a default option, or they simply rely on less-efficient HDR processing than flagship models. As a result, photographing high-contrast scenes is much more of a challenge, with a lot of detail lost in the brightest and darkest parts of the frame. In the sample below, both cameras record almost no detail in the brightest areas. The Meizu Pro 7 Plus image shows a little more tonal range in the shadows than the Micromax, however.
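
The core idea behind these Auto HDR modes is exposure fusion: merge several bracketed frames, weighting each pixel toward whichever exposure renders it best. The sketch below is purely illustrative (a simple Gaussian mid-tone weighting, not any manufacturer's actual pipeline):

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Merge bracketed exposures (values in 0..1), favoring pixels
    that sit near mid-grey, i.e. neither clipped nor crushed."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) for f in frames]
    total = np.sum(weights, axis=0)
    return np.sum([w * f for w, f in zip(weights, frames)], axis=0) / total

# Two pixels: a deep shadow and a bright highlight
under = np.array([0.01, 0.45])  # short exposure: holds the highlight
over = np.array([0.40, 1.00])   # long exposure: opens up the shadow
hdr = fuse_exposures([under, over])
```

The fused result takes the shadow pixel mostly from the long exposure and the highlight pixel mostly from the short one, which is roughly what lets a flagship phone hold detail at both ends of a high-contrast scene.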

Exposure in very low light is another area where spending more money on a device currently still gets you noticeably better results. Within our updated DxOMark Mobile protocol, we test exposure at light levels as low as 1 Lux. Thanks to fast apertures and multi-frame image stacking, most high-end models, such as the iPhone X in the sample below on the left, are capable of producing slightly underexposed but usable images in such light conditions. Budget devices such as the Gionee S10L generally have to make do without those advanced features, and thus have much more difficulty recording good levels of detail in very low light.
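
Multi-frame stacking works because averaging N captures of the same scene reduces random sensor noise by roughly a factor of the square root of N. A minimal simulation of the idea (the noise level and frame count are illustrative, not measured values):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.2)  # a dim, flat test patch

def capture_stack(n_frames, read_noise=0.05):
    """Average n noisy captures of the same static scene."""
    frames = scene + rng.normal(0.0, read_noise, size=(n_frames,) + scene.shape)
    return frames.mean(axis=0)

single = capture_stack(1)
stacked = capture_stack(8)  # an 8-frame stack: noise drops by ~1/sqrt(8)
noise_single = float(np.std(single - scene))
noise_stacked = float(np.std(stacked - scene))
```

This is why fast apertures (more light per frame) and multi-frame processing (more frames to average) together make 1 Lux shots usable on high-end hardware.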

Texture and noise

Detail and noise are two image quality parameters that are strongly interlinked, and improving one of them typically has a negative impact on the other. All of the high-end models in our testing are capable of recording images with detailed textures and low noise levels in all light situations. However, different manufacturers take different approaches to texture and noise, prioritizing one over the other. In our testing, the Google Pixel 2 achieved a class-leading texture score of 73, but a slightly lower noise score of 59. With a score of 65 for texture and 75 for noise, the Samsung Galaxy Note 8 appears to take the reverse approach. Apple’s iPhone 8 Plus is the most balanced in this comparison, scoring 65 for texture and 69 for noise.
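
The tradeoff shows up even with the simplest possible denoiser: smoothing more aggressively removes more noise, but it erases fine texture at the same time. A toy 1-D illustration (a box filter standing in for a real denoiser):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 8 * np.pi, 512)
texture = 0.1 * np.sin(5 * x)           # fine repeating detail to preserve
noise = rng.normal(0.0, 0.05, x.size)   # random sensor noise to remove

def box_smooth(signal, width):
    """Box filter: 'width' plays the role of denoising strength."""
    return np.convolve(signal, np.ones(width) / width, mode="same")

# How much of each component survives mild vs. strong denoising?
texture_mild = np.std(box_smooth(texture, 3)) / np.std(texture)
texture_strong = np.std(box_smooth(texture, 31)) / np.std(texture)
noise_mild = np.std(box_smooth(noise, 3)) / np.std(noise)
noise_strong = np.std(box_smooth(noise, 31)) / np.std(noise)
```

Mild smoothing keeps nearly all of the texture but much of the noise; strong smoothing crushes the noise and flattens the detail with it. Tuning that knob differently is essentially what separates a texture-first result like the Pixel 2's from a noise-first one like the Note 8's.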

In the samples below, you can see what those numbers mean in terms of real-life image quality. In bright light (1000 Lux), the Pixel 2 delivers excellent detail with fine textures and precise edge definition. The Samsung image is noticeably softer when viewed at a 100% magnification, and the iPhone 8 Plus image output falls pretty much between the other two.

Crop of Google Pixel 2 at 1000 Lux | Crop of Samsung Galaxy Note 8 at 1000 Lux | Crop of Apple iPhone 8 Plus at 1000 Lux

The advantages of Samsung’s low-noise approach become more obvious in low-light shooting, however. Compared to the Google and Apple devices, the Note 8 produces a noticeably smoother image with less luminance noise (which is particularly noticeable on the skin tones) at a light level of 5 Lux.

Crop of Google Pixel 2 at 5 Lux | Crop of Samsung Galaxy Note 8 at 5 Lux | Crop of Apple iPhone 8 Plus at 5 Lux

In comparison, budget-oriented smartphones, such as the Micromax Canvas Infinity (which scores 46 points for texture and 45 for noise), are capable of producing decent detail in bright light, but struggle as the light gets dimmer. A loss of detail and increased noise levels are very noticeable in the 20 and 5 Lux samples below, and the camera also has trouble capturing bright enough exposures at those low light levels.

Crop of Micromax Canvas Infinity at 1000 Lux | Crop of Micromax Canvas Infinity at 20 Lux | Crop of Micromax Canvas Infinity at 5 Lux

Image artifacts

The imaging pipeline in a smartphone camera is a very complex construct comprising sensor, lens, and image processing, and image artifacts can be caused by any of these components. Thankfully, on most devices we have tested with the new protocol, artifacts are well controlled and can be spotted only when examining an image at full size. Below you can see a few examples of the more intrusive artifacts we have found during our testing. For example, the Google Pixel 2, like its predecessor, has a tendency to produce flare when shooting against a strong light source. We found an unusual blocking artifact in some outdoor images that we captured during our test of the Huawei Mate 10 Pro. You can spot it in the sample below in the blue sky to the left of the building.

Flare in the Google Pixel 2 image | Blocking artifacts in the Huawei Mate 10 Pro image

While high-end smartphone cameras mostly offer good sharpness across the frame, more affordable devices frequently struggle in this area. The Meizu Pro 7 Plus in particular suffered from inconsistent sharpness across the frame. Although many mobile devices in its class show a loss of edge sharpness, the Meizu can lose sharpness in random parts of the picture, including the center, which seriously affects image quality.

A significant artifact problem for the Meizu Pro 7 Plus is inconsistent sharpness across the frame.

Autofocus performance

All high-end devices we have tested since our switch to the new test protocol come with on-sensor phase-detection autofocus and/or laser-assisted systems. As a result, the AF systems in almost all flagship devices worked swiftly and produced sharp and repeatable image results. Below you can see our test results for the Huawei Mate 10 Pro at 20 Lux. As you can see, even in these fairly low-light conditions, its autofocus system is capable of producing consistently good results at high speed. At 96 points, the Huawei earned our second-highest autofocus score under the new protocol, surpassed only by the Google Pixel 2 at 98.

Huawei Mate 10 Pro autofocus performance at 20 Lux

At 73 points, the Meizu Pro 7 Plus was one of the weaker devices in this category. Shooting 30 consecutive frames in low light (20 Lux) in the lab, defocusing between each shot, it was quick to fire after both short and long delays, but sharpness was inconsistent, with some shots slightly or completely out of focus, particularly after a short delay.

Meizu Pro 7 Plus autofocus performance at 20 Lux

Zoom and bokeh

Zoom performance is one of the new components in the updated DxOMark Mobile test protocol. Until fairly recently, most smartphone cameras were limited to a simple digital zoom that had a very deleterious effect on image quality. Now, however, some current flagship models come with new technologies for improved zoom performance. The Apple iPhone 8 Plus, for example, comes with a secondary tele lens that doubles the focal reach of its main camera. The Huawei Mate 10 Pro uses the high 20Mp resolution of its secondary monochrome sensor and computational imaging to capture a higher-quality digital zoom image. In the comparison below, you can see that the iPhone 8 Plus’s 2x zoom image shows better detail than the equivalent shot captured with the Huawei Mate 10 Pro. However, the latter’s system is still a noticeable improvement over a more conventional single-lens digital zoom, such as the one used in the Google Pixel 2.

Apple iPhone 8 Plus, 2x zoom | Huawei Mate 10 Pro, 2x zoom
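
The quality gap follows directly from how a conventional digital zoom works: it crops the central portion of the frame and upsamples it, so a 2x zoom image is built from only a quarter of the sensor's pixels. A schematic sketch (nearest-neighbour upsampling for simplicity; real pipelines interpolate more cleverly):

```python
import numpy as np

def digital_zoom_2x(image):
    """Crop the central quarter of the frame, then upsample
    back to the original resolution (nearest-neighbour)."""
    h, w = image.shape
    crop = image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    return crop.repeat(2, axis=0).repeat(2, axis=1)

frame = np.arange(16 * 16, dtype=float).reshape(16, 16)
zoomed = digital_zoom_2x(frame)
```

The output keeps the original resolution but contains only a quarter of the original pixel values, each duplicated four times; this is why a dedicated tele lens or a high-resolution secondary sensor beats plain digital zoom.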

Bokeh quality is another new test criterion, and again an area in which dual-camera devices generally have an advantage over single-lens systems, thanks to better depth-sensing capability. In the samples below, you can see how the iPhone X’s bokeh simulation in Portrait mode produces a natural and pleasant background blur effect, with a realistic shape to circular specular highlights that is similar to optical rendering. Slight depth-estimation failures result in parts of the portrait subject being slightly blurred (although not as blurred as the background), but overall the bokeh effect is pretty good, earning the iPhone X the best score of all tested devices in this category at 55 points. The Samsung Galaxy Note 8 achieved a lower score of 45 points, but its image results are still decent, with a strong background blur effect, reasonably good subject masking, and excellent depth estimation.

Samsung Galaxy Note 8 bokeh effect

It takes more than just two lenses to generate a natural-looking bokeh effect, however. This becomes obvious when looking at some of the results from more affordable devices, such as the Meizu Pro 7 Plus or the Gionee S10L. The blur applied by the Meizu’s bokeh simulation mode is very strong, which can make for a striking effect, but it’s not very natural. The shape of the bokeh is pleasing in some areas, but depth-estimation failures can result in visible artifacts around the subject. The Gionee’s bokeh results leave even more room for improvement: subject isolation is poor, and despite the camera’s face-detection feature, even the portrait subject’s face is sometimes partly blurred. Background blur tends to be unnaturally strong, and the transition between sharp and blurred areas is very abrupt, producing quite unnatural-looking images overall.

Meizu Pro 7 Plus bokeh effect
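
Simulated bokeh stands or falls with the per-pixel depth estimate: pixels at the subject's depth are kept sharp, everything else is blurred, and any error in the depth map blurs the wrong pixels, producing exactly the edge artifacts described above. A toy 1-D sketch of the mechanism (real pipelines blur in 2-D with disc-shaped kernels to mimic lens bokeh):

```python
import numpy as np

def simulate_bokeh(image, depth, subject_depth, kernel=9):
    """Keep pixels near the subject's estimated depth sharp;
    replace everything else with a blurred version."""
    blurred = np.convolve(image, np.ones(kernel) / kernel, mode="same")
    in_focus = np.abs(depth - subject_depth) < 0.5
    return np.where(in_focus, image, blurred)

image = np.tile([0.0, 1.0], 32)                 # high-contrast 1-D scene
depth = np.where(np.arange(64) < 32, 1.0, 4.0)  # subject near, background far
result = simulate_bokeh(image, depth, subject_depth=1.0)
```

If the depth map wrongly marks part of the subject as background, that region gets blurred along with it, which is what poor subject masking looks like in practice.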

Video performance

When looking at video performance in isolation, the picture is slightly different from the still image rankings. The Google Pixel 2 has a 5-point lead at the top, and even its predecessor still occupies third place in our ranking. The Samsung Galaxy Note 8 and iPhone X drop a few places, while Apple’s iPhone 8 models perform at a similar level as for stills. The same is true for the Huawei Mate 10 Pro, which achieves excellent results in both categories. Most high-end models we tested in recent months offer reliable autofocus and smooth exposure and white balance transitions in changing light conditions. However, dynamic range and stabilization when walking or panning can still be a challenge.


It’s easy to see that current flagship smartphones are better than ever, and now that smartphone zoom performance has radically improved, there are even fewer reasons to carry a digital compact camera. It’ll still take some time before smartphones can capture DSLR-like image quality, but it is encouraging to see the massive steps that mobile imaging engineers have taken in terms of low light performance and bokeh quality in very little time.

There is still a considerable image quality gap between flagship devices and entry-level smartphones, however. The latter are generally capable of recording nice images in good light, but struggle in low light and with advanced features such as zoom or bokeh simulation. So while smartphone imaging performance is better than ever, there is still plenty of room for improvement across all device categories. We look forward to taking a closer look at, and testing, whatever the manufacturers surprise us with in 2018.


Introducing the new DxOMark Mobile test protocol

When we launched our original DxOMark Mobile test protocol in 2012, among the first smartphone cameras we tested were the Apple iPhone 4 and 5, the Samsung Galaxy S2 and S3, the Nokia 808, and the HTC 8X. Back in the day, those were all high-end devices with very capable cameras.

Apple’s iPhone 4 was the first smartphone we tested on DxOMark Mobile. Its 5-Megapixel camera featured a 1/3.2-inch sensor with a 1.75 µm pixel size and an f/2.8 aperture.

A lot has happened in the five years since then. Smartphone photography has pretty much eradicated the compact camera market segment, and the mobile sector has become the number one driver of innovation in imaging. The latest smartphone models come with camera technology and features that have slowly but surely pushed our original test protocol to its limits.

One of the last devices that we tested using the old DxOMark protocol, the HTC U11 combines a large 12-Megapixel 1/2.55-inch sensor with a fast f/1.75 aperture, optical image stabilization, and phase-detection AF.

  • Advanced CMOS image sensor technology and more powerful chipsets allow for real-time multi-frame processing, resulting in substantial improvements in low-light performance.
  • On-sensor phase detection and laser-based time-of-flight sensors have drastically improved autofocus speed and precision, especially in low-light conditions.
  • Dual-camera setups have brought optical zooming capabilities and bokeh-like features that simulate the shallow depth of field of a fast DSLR lens on smartphone cameras. Other dual-cams combine the image data from an RGB sensor with the information captured by a secondary monochrome chip for improved detail, reduced noise levels, and increased dynamic range.
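
The RGB-plus-monochrome idea can be sketched very simply: take the clean, detailed luminance from the mono sensor and keep only the colour ratios from the RGB sensor. This is a deliberately simplified toy (real fusion also involves alignment and per-region blending), using a crude mean-of-channels luminance:

```python
import numpy as np

def fuse_rgb_mono(rgb, mono):
    """Rescale each RGB pixel so its luminance matches the
    monochrome capture, preserving the RGB colour ratios."""
    luma = rgb.mean(axis=-1, keepdims=True)
    return rgb * (mono[..., None] / np.maximum(luma, 1e-6))

rgb = np.full((4, 4, 3), [0.2, 0.4, 0.6])  # colourful but noisier capture
mono = np.full((4, 4), 0.5)                # cleaner, more detailed luminance
fused = fuse_rgb_mono(rgb, mono)
```

Because the mono sensor has no colour filter array, it collects more light per pixel, so borrowing its luminance is what buys the extra detail and lower noise mentioned above.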

Apple’s iPhone 7 Plus was one of the first smartphones to offer optical zoom in a dual-cam setup.

To take these developments into account and to be prepared for the future, we have redesigned the DxOMark Mobile test protocol to include new and expanded outdoor and lab test scenes and evaluation methodologies. With the new protocol, we capture and analyze more than 1,500 images and two hours of video for each device. Compared to the previous protocol, it provides the following new elements:

  • A new zoom sub-score that is based on extensive testing at multiple focal lengths
  • A new bokeh sub-score based on lab and outdoor testing
  • Low-light testing down to 1 Lux
  • Motion-based test scenes for more accurate evaluation of camera performance and processing techniques in real-world situations

Many smartphone components, such as displays and chipsets, have become commoditized in recent years, but camera quality is more than ever a key differentiator for manufacturers and a major decision criterion for smartphone buyers. The new DxOMark Mobile test protocol enables consumers to make informed buying decisions by evaluating the true camera performance of the latest mobile devices and by challenging OEM marketing claims.

You can find a detailed overview of the new DxOMark Mobile test protocol here. Click here for a comparison of the original and new protocols, looking at the scores of some of the best performing smartphone cameras. The smartphone reviews that followed the previous DxOMark Mobile protocol are still valid for evaluation purposes, but please keep in mind that you cannot compare their test results with those obtained with the new protocol.

