5 features that make awful smartphone cameras a thing of the past

Smartphone cameras no longer produce the low-megapixel blurs of the past, but how exactly did they cross the line from simply being the most convenient to actually being good enough to shoot magazine covers? The camera-testing wizards at DxOMark have now been testing smartphone cameras for five years, and with that milestone comes five years' worth of data on the tech inside our smartphone cameras. So what makes the smartphone camera of today so capable? DxOMark recently shared five technologies that have caused smartphone camera capabilities to grow exponentially in the last five years.

Better processors

An image sensor is nothing without the processor connected to it. This is the mini computer that turns the signal from the sensor into actual recorded data. Processors can do all sorts of things, but in general, the faster they are, the less noise (visual distortion) they will add to the image. The difference between the iPhone 5s and iPhone 6, hardware-wise, was only a change in the image signal processor — the sensor remained exactly the same — but that was enough for the iPhone 6 to capture images with less noise.

Noise is most apparent in low-light settings, but less noise also means more detail, particularly when digital noise reduction comes into play. Noise reduction is another job for the processor, but blurring away noise has the unfortunate side effect of blurring away detail as well. If a phone camera produces less noise to begin with, noise reduction can be dialed back, leaving more detail intact. Not everyone agrees on whether less noise or more detail is better. DxO says the Google Pixel 2 errs on the side of more detail with more grain, for example, while the Samsung Galaxy Note 8 favors less grain but loses more detail in the process.
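
To make that tradeoff concrete, here is a minimal Python sketch (assuming NumPy and SciPy; the noise levels and blur strengths are invented for illustration). A noisier capture needs a stronger denoising blur, and the stronger blur visibly softens a sharp edge standing in for fine detail:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# A synthetic "scene": a sharp edge (stand-in for fine detail).
scene = np.zeros((100, 100))
scene[:, 50:] = 1.0

# Simulate sensor noise; a better processor effectively lowers sigma.
noisy_low = scene + rng.normal(0, 0.05, scene.shape)   # quieter pipeline
noisy_high = scene + rng.normal(0, 0.20, scene.shape)  # noisier pipeline

# Noise reduction is a blur: the noisier image needs a stronger blur,
# which also smears the edge (the detail) more.
denoised_low = gaussian_filter(noisy_low, sigma=0.8)
denoised_high = gaussian_filter(noisy_high, sigma=2.5)

# Edge steepness after denoising: higher means more detail survived.
print(np.abs(np.diff(denoised_low.mean(axis=0))).max())
print(np.abs(np.diff(denoised_high.mean(axis=0))).max())
```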

Multi-shot HDR images

Phone cameras simply can’t fit the large sensors that DSLRs and mirrorless cameras use. Instead, they have to rely on software tricks to produce higher-quality images.

High dynamic range (HDR) imaging is a prime example of this. HDR requires multiple images to be shot at different exposure values and combined into one. For example, a camera may take three photos, one exposed properly for the shadows, one for the midtones, and one for the highlights, and then merge them into a single photo that holds detail across a wider range from dark to light. Once a process limited to heavy-hitting desktop image-editing programs, HDR merging now happens automatically on many smartphone cameras in the blink of an eye.
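
A toy version of the merge step fits in a few lines of Python (NumPy only; the Gaussian weighting below is a common exposure-fusion heuristic, not any particular manufacturer's pipeline):

```python
import numpy as np

def fuse_exposures(images):
    """Merge differently exposed frames (values in 0..1) into one image.

    Each pixel is weighted by how well-exposed it is in each frame:
    values near mid-gray get high weight, clipped shadows and blown
    highlights get low weight.
    """
    stack = np.stack(images)                        # (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)  # favor well-exposed pixels
    weights /= weights.sum(axis=0, keepdims=True)   # normalize per pixel
    return (weights * stack).sum(axis=0)

# Three simulated exposures of the same scene: dark, mid, bright.
scene = np.linspace(0, 2.0, 256).reshape(1, -1).repeat(8, axis=0)
exposures = [np.clip(scene * ev, 0, 1) for ev in (0.25, 1.0, 4.0)]
merged = fuse_exposures(exposures)
```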

While HDR has been around in smartphones since 2010, DxOMark says the technology has accelerated over the last five years, leading to dramatic improvements. Facial detection is another feature that helps with exposure, as the camera now knows which part of the image to expose for. This feature was responsible for a big perceived jump in quality from the iPhone 5s to newer models.
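
The exposure side of face detection reduces to simple metering math. This hypothetical Python sketch assumes a detector has already returned a face bounding box, then computes the gain needed to bring the face to a target brightness (the function name, box format, and target value are all illustrative):

```python
import numpy as np

def exposure_gain_for_face(image, face_box, target=0.45):
    """Gain that brings a detected face to a target brightness.

    image: grayscale array in 0..1; face_box: (top, left, height, width),
    as a hypothetical face detector might return it.
    """
    t, l, h, w = face_box
    face_mean = image[t:t + h, l:l + w].mean()
    return target / max(face_mean, 1e-6)

frame = np.random.default_rng(1).uniform(0, 1, (480, 640))
gain = exposure_gain_for_face(frame, face_box=(200, 280, 80, 60))
```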

Improved stabilization

Stabilization in a smartphone isn’t exactly new, but it has changed drastically over the last five years by integrating the smartphone’s gyroscope data into the feature. With that information, the stabilization algorithms require less processing and guesswork than visual motion analysis alone. Another advancement, DxOMark says, uses an extra second of video as a buffer to anticipate the type of motion that will come next.
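
Here is a heavily simplified sketch of the gyro-assisted step in Python: convert the gyro's angular rate into a pixel shift (assuming pure rotation and small angles) and crop a counter-shifted window from an over-scanned frame. All of the numbers are invented:

```python
import numpy as np

def stabilize_frame(frame, gyro_rate, dt, focal_px, margin=32):
    """Counter-shift one video frame using gyroscope data (a toy EIS step).

    gyro_rate: (pitch, yaw) angular velocity in radians/sec from the gyro;
    focal_px: lens focal length expressed in pixels. For small angles, the
    camera's rotation maps to a pixel shift of roughly focal_px * angle.
    """
    dy, dx = (np.asarray(gyro_rate) * dt * focal_px).round().astype(int)
    dy = int(np.clip(-dy, -margin, margin))  # shift opposite the motion
    dx = int(np.clip(-dx, -margin, margin))
    h, w = frame.shape[:2]
    # Crop a window from the over-scanned frame, offset by the correction.
    return frame[margin + dy : h - margin + dy, margin + dx : w - margin + dx]

frame = np.zeros((1080 + 64, 1920 + 64))  # frame with stabilization margin
steady = stabilize_frame(frame, gyro_rate=(0.02, -0.05), dt=1/30, focal_px=1500)
```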

More recent phones also employ optical image stabilization, in which the lens or the sensor actually moves counter to the movement of the phone. This helps reduce shake from holding the phone, resulting in smoother video and sharper stills, particularly in low light where slow shutter speeds can otherwise lead to blur.

Faster autofocus

When DxOMark first started testing smartphones, the iPhone 5s wouldn’t adjust focus at all after a video started. Now, thanks to on-chip phase-detection autofocus — a more advanced focusing method that works without hunting back and forth — phone cameras can keep up with moving subjects much better and focus continuously.

The Samsung Galaxy S7 uses what’s known as a dual-pixel autofocus system, a form of phase detection that performs better in low light. (Most phones revert to the older contrast-detection autofocus when there isn’t sufficient light for phase detection.)
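
To see why contrast detection "hunts," consider this minimal Python sketch (NumPy and SciPy; the simulated lens is hypothetical). It has to sweep the lens through a range of positions and score each frame for sharpness, whereas phase detection computes the direction and distance to focus from a single measurement:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_score(image):
    # Sharpness metric: mean gradient magnitude (higher = more in focus).
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gy, gx).mean()

def contrast_detect_af(capture_at, positions):
    # Sweep the lens, score every frame, return the sharpest position.
    # This exhaustive back-and-forth is the "hunting" phase detection avoids.
    scores = [contrast_score(capture_at(p)) for p in positions]
    return positions[int(np.argmax(scores))]

# Simulated lens: frames get blurrier the farther we are from true focus (0.6).
scene = np.zeros((64, 64))
scene[:, 32:] = 1.0
capture_at = lambda p: gaussian_filter(scene, sigma=abs(p - 0.6) * 5 + 0.1)

print(contrast_detect_af(capture_at, positions=np.linspace(0, 1, 11)))  # ~0.6
```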

Google tried something different in the first Pixel smartphone. That phone shines a beam of light on the subject and measures how long it takes for the light to return. This tells the camera how far away the subject is, and autofocus is set accordingly. However, a common complaint about this time-of-flight autofocus is that it doesn’t work well in bright light, so Google added phase detection as a second autofocus system in the Pixel 2.
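
The distance math behind time-of-flight is straightforward; the engineering challenge is timing round trips measured in nanoseconds. A quick sketch in Python:

```python
C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds):
    # The emitted light travels to the subject and back, so halve the path.
    return C * round_trip_seconds / 2

print(tof_distance(8.0e-9))  # light returning after ~8 ns: subject ~1.2 m away
```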

Dual lenses and computational photography

Many phones in recent years use not one, but two cameras: that is, two different lens and sensor pairs placed side by side. Using data from the offset lenses allows software to fake an effect known as shallow depth of field, in which the background is blurred but the subject stays tack-sharp. While early attempts at this were decent, DxOMark says current-generation cameras do even better because they produce more accurate depth maps, reducing the number of errors.
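
Both steps, recovering depth from the two offset lenses and applying the synthetic blur, can be sketched in a few lines of Python (NumPy and SciPy; real pipelines refine depth maps with segmentation and edge-aware filtering, and every parameter below is illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    # Stereo geometry: nearby subjects shift more between the two lenses,
    # so depth is inversely proportional to that per-pixel shift (disparity).
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)

def fake_bokeh(image, depth_map, subject_depth, max_sigma=4.0):
    # Blend a blurred copy back in, weighted by distance from the subject
    # plane: 0 near the subject (kept sharp), up to 1 far away (fully blurred).
    blurred = gaussian_filter(image, sigma=max_sigma)
    alpha = np.clip(np.abs(depth_map - subject_depth) / subject_depth, 0, 1)
    return alpha * blurred + (1 - alpha) * image

# Example: a hypothetical 4 mm-baseline dual camera, 1,500 px focal length.
disparity = np.full((240, 320), 12.0)  # made-up disparity map
depth = depth_from_disparity(disparity, focal_px=1500, baseline_m=0.004)
```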

While the pace of mobile imaging advancement is impressive, DxO says manufacturers are far from done adding better cameras to their phones. As phones grow faster and more capable, computational photography will likely improve, making phone cameras more powerful and giving users more control. We’re not there yet, but maybe one day, a smartphone really will be able to replace your DSLR or mirrorless camera.

Hillary K. Grigonis