
5 features that make awful smartphone cameras a thing of the past

[Header image: iPhone 8 Plus vs. Note 8 camera shootout]

Smartphone cameras no longer produce the low-megapixel blurs of the past, but how exactly did they cross the line from simply being the most convenient cameras to actually being good enough to shoot magazine covers? The camera-testing wizards at DxOMark have now been testing smartphone cameras for five years, and with that milestone comes five years' worth of data on the tech inside our smartphone cameras. So what makes the smartphone camera of today so capable? DxOMark recently shared five technologies that have driven the dramatic improvement in smartphone cameras over the last five years.


Better processors

An image sensor is nothing without the processor connected to it. This is the mini computer that turns the signal from the sensor into actual recorded data. Processors can do all sorts of things, but in general, the faster they are, the less noise (visual distortion) they will add to the image. The difference between the iPhone 5s and iPhone 6, hardware-wise, was only a change in the image signal processor — the sensor remained exactly the same — but that was enough for the iPhone 6 to capture images with less noise.

Noise is most apparent in low-light settings, but less noise also means more detail, particularly once digital noise reduction comes into play. Noise reduction is another job for the processor, but blurring away noise has the unfortunate side effect of blurring away detail as well. If a phone camera produces less noise to begin with, noise reduction can be dialed back, leaving more detail intact. Not everyone agrees on whether less noise or more detail is better — for example, DxO says the Google Pixel 2 errs on the side of more detail with more grain, while the Samsung Galaxy Note 8 favors less grain but loses more detail in the process.
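To make that trade-off concrete, here's a minimal sketch using OpenCV's non-local means denoiser. The file name and the two strength settings are illustrative assumptions, not DxOMark's measurements; the point is simply that a stronger pass removes more grain and more fine texture at the same time.

```python
import cv2

# Hypothetical noisy low-light capture.
img = cv2.imread("low_light_shot.jpg")

# A gentle pass keeps fine texture along with some grain,
# roughly the Pixel 2 philosophy described above.
gentle = cv2.fastNlMeansDenoisingColored(
    img, None, h=3, hColor=3, templateWindowSize=7, searchWindowSize=21)

# A heavy pass scrubs the grain away but smears fine detail,
# closer to the Note 8 tuning.
heavy = cv2.fastNlMeansDenoisingColored(
    img, None, h=15, hColor=15, templateWindowSize=7, searchWindowSize=21)

cv2.imwrite("gentle.jpg", gentle)
cv2.imwrite("heavy.jpg", heavy)
```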


Multi-shot HDR images

Phone cameras simply can’t fit the large sensors that DSLRs and mirrorless cameras use. Instead, they have to rely on software tricks to produce higher-quality images.

High dynamic range (HDR) imaging is a prime example of this. HDR requires multiple images to be shot at different exposure values and combined into one. For example, a camera may take three photos — one exposed properly for the shadows, one for the midtones, and one for the highlights — and then merge them into one photo that holds detail across a wider range from dark to light. Once limited to heavy-hitting desktop image-editing programs, the process now happens automatically, in the blink of an eye, on many smartphone cameras.
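As a rough illustration of the merging step, here's a minimal sketch using OpenCV's Mertens exposure fusion, which blends bracketed frames without needing their exact exposure times. The three file names stand in for the under-, mid-, and over-exposed shots described above.

```python
import cv2
import numpy as np

# Load three bracketed exposures (hypothetical file names).
frames = [cv2.imread(p) for p in ("under.jpg", "mid.jpg", "over.jpg")]

# Mertens exposure fusion picks the best-exposed parts of each frame.
merge = cv2.createMergeMertens()
fused = merge.process(frames)  # float32 output, values roughly in [0, 1]

# Scale back to 8-bit for saving.
cv2.imwrite("hdr.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```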

While HDR has been around in smartphones since 2010, DxOMark says the technology has accelerated over the last five years, leading to dramatic improvements. Facial detection is another feature that helps with exposure, as the camera now knows which part of the image to expose for. This feature was responsible for a big perceived jump in quality from the iPhone 5s to newer models.
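As a sketch of how face-priority exposure can work, the snippet below meters only inside a detected face box and computes a gain that steers the face toward mid-gray. The target value and gain logic are illustrative assumptions, not any phone maker's actual pipeline.

```python
import cv2

# OpenCV's bundled Haar cascade face detector.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_exposure_gain(frame_bgr, target=118):
    """Return a brightness gain that exposes for the face, not the scene."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        region = gray                  # no face found: meter the whole frame
    else:
        x, y, w, h = faces[0]          # meter on the first detected face
        region = gray[y:y + h, x:x + w]
    # Gain > 1 brightens, < 1 darkens, pulling the face toward mid-gray.
    return target / max(float(region.mean()), 1.0)
```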

Improved stabilization

Stabilization in a smartphone isn’t exactly new — but it has drastically changed over the last five years by integrating the smartphone’s gyroscope data into the feature. With that information, the stabilization algorithms require less processing and guesswork than visual motion analysis alone. Another advancement, DxOMark says, buffers an extra second of video so the algorithm can anticipate the motion that comes next.
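Here's a toy sketch of the gyro-assisted idea: integrate the gyroscope's angular velocity over a frame interval to estimate how the phone rotated, then counter-shift the frame. The field of view, frame size, and small-angle shortcut (treating rotation as a simple translation) are all simplifying assumptions.

```python
import numpy as np
import cv2

FOV_DEG = 70.0                   # assumed horizontal field of view
WIDTH, HEIGHT = 1920, 1080       # assumed frame size
PIX_PER_DEG = WIDTH / FOV_DEG    # pixels of shift per degree of rotation

def stabilize(frame, gyro_rates_dps, dt):
    """Counter-shift one frame given gyro yaw/pitch rates in deg/s."""
    # Integrate angular velocity over the frame interval to get rotation.
    yaw_deg, pitch_deg = np.asarray(gyro_rates_dps) * dt
    dx = -yaw_deg * PIX_PER_DEG      # shift opposite to the rotation
    dy = -pitch_deg * PIX_PER_DEG
    M = np.float32([[1, 0, dx], [0, 1, dy]])
    return cv2.warpAffine(frame, M, (WIDTH, HEIGHT))
```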

More recent phones also employ optical image stabilization, in which the lens or the sensor actually moves counter to the movement of the phone. This helps reduce shake from holding the phone, resulting in smoother video and sharper stills, particularly in low light where slow shutter speeds can otherwise lead to blur.

Faster autofocus

When DxOMark first started testing smartphones, the iPhone 5s wouldn’t adjust focus at all after a video started. Now, thanks to on-chip phase-detection autofocus — a more advanced focusing method that works without hunting back and forth — phone cameras can keep up with moving subjects much better and focus continuously.
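Conceptually, phase detection compares the scene as seen from two sides of the lens; the offset between the two signals tells the camera how far out of focus it is and in which direction, so the lens can jump straight to focus. A toy one-dimensional illustration, using cross-correlation to recover that offset:

```python
import numpy as np

def phase_shift(left_signal, right_signal):
    """Signed shift (in pixels) between two sub-aperture signals."""
    corr = np.correlate(left_signal - left_signal.mean(),
                        right_signal - right_signal.mean(), mode="full")
    # Index of the correlation peak gives the displacement.
    return int(np.argmax(corr)) - (len(right_signal) - 1)

# Example: a synthetic edge seen 3 pixels apart by the two pixel groups.
edge = np.concatenate([np.zeros(20), np.ones(20)])
print(phase_shift(edge, np.roll(edge, 3)))  # -3: signed defocus estimate
```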

The Samsung Galaxy S7 uses what’s known as a dual-pixel autofocus system, a form of phase detection that performs better in low light. (Most phones revert to the older contrast-detection autofocus when there isn’t sufficient light for phase detection.)
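Contrast detection, by comparison, has to hunt: step the lens through its range, score each frame's sharpness, and settle on the peak. Here's a minimal sketch; the set_lens_position and capture_frame hooks are hypothetical stand-ins for a real camera interface.

```python
import cv2
import numpy as np

def sharpness(frame_gray):
    # Higher Laplacian variance means more edge contrast, i.e. better focus.
    return cv2.Laplacian(frame_gray, cv2.CV_64F).var()

def contrast_autofocus(set_lens_position, capture_frame, steps=20):
    """Sweep the lens and return the position with peak sharpness."""
    best_pos, best_score = 0.0, -1.0
    for pos in np.linspace(0.0, 1.0, steps):   # normalized lens travel
        set_lens_position(pos)                 # hypothetical lens driver
        gray = cv2.cvtColor(capture_frame(), cv2.COLOR_BGR2GRAY)
        score = sharpness(gray)
        if score > best_score:
            best_pos, best_score = pos, score
    set_lens_position(best_pos)
    return best_pos
```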

Google tried something even more unusual in the first Pixel smartphone. That phone shines a beam of light on the subject and measures how long it takes for the light to return. This tells the camera how far away the subject is, and focus is set accordingly. However, a common complaint about this time-of-flight autofocus is that it doesn’t work well in bright light, so Google added phase detection as a second autofocus system in the Pixel 2.
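The math behind time-of-flight is a one-liner: the light travels out and back, so the distance is half the round trip at the speed of light. A worked example with an illustrative round-trip time:

```python
C = 299_792_458.0              # speed of light, m/s

def tof_distance(round_trip_s):
    """Subject distance from a round-trip light pulse time."""
    return C * round_trip_s / 2.0

# A subject about 1.5 m away returns the pulse in roughly 10 nanoseconds.
print(tof_distance(10e-9))     # ~1.499 m
```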


Dual lenses and computational photography

Many phones in recent years use not one but two cameras — that is, two different lens-and-sensor pairs placed side by side. Using data from the offset lenses allows software to fake an effect known as shallow depth of field, whereby the background is blurred while the subject stays tack-sharp. While early attempts at this were decent, DxOMark says current-generation cameras do even better because they produce more accurate depth maps, reducing the number of errors.
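As a simplified sketch of that final step, the snippet below uses a depth map (which a dual-camera phone would estimate from the offset between its two lenses) to blend far pixels toward a blurred copy while keeping near pixels sharp. The file names and the simple linear blend are illustrative assumptions, not any phone's actual pipeline.

```python
import cv2
import numpy as np

img = cv2.imread("portrait.jpg")                        # hypothetical photo
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE)   # 0 = near, 255 = far

# A heavily blurred copy stands in for the out-of-focus background.
blurred = cv2.GaussianBlur(img, (51, 51), 0)

# Blend toward the blurred copy in proportion to depth.
mask = (depth.astype(np.float32) / 255.0)[..., None]    # per-pixel 0..1
bokeh = (img * (1 - mask) + blurred * mask).astype(np.uint8)

cv2.imwrite("bokeh.jpg", bokeh)
```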

While the pace of mobile imaging advancement is impressive, DxO says manufacturers are far from done improving the cameras in their phones. As phones grow faster and more capable, computational photography will likely improve, making phone cameras more powerful and giving users more control. We’re not there yet, but maybe one day a smartphone really will be able to replace your DSLR or mirrorless camera.
