Samsung introduced the Galaxy S7 and S7 Edge yesterday at Mobile World Congress with a tremendous amount of fanfare. While notable improvements such as water resistance, expandable storage, and a bigger battery are sure to excite, the camera is where the real innovation lies.
After showing off its camera prowess with last year's Galaxy S6, Samsung doubled down on the Galaxy S7 and S7 Edge, improving the camera even more than we had imagined.
Both phones feature a dual pixel image sensor; we have seen this technology in DSLRs before, but this is the first time it has appeared in a smartphone.
In a nutshell, dual pixel technology speeds up autofocus by using 100 percent of the sensor's pixels for focusing, whereas traditional smartphone cameras dedicate less than 5 percent of their pixels to the job. That kind of jump almost seems absurd, but Samsung's dual pixel sensor splits every single pixel into two photodiodes that can be compared for fast phase-detection autofocus.
So when you capture a photo on the Galaxy S7 or S7 Edge, the light from the lens hits each pixel's two photodiodes independently, and the camera compares the two signals to adjust focus, which is very similar to how our two eyes work together to judge distance.
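If you're curious how that comparison actually works, here's a rough, purely illustrative Python sketch of split-pixel phase detection. None of this is Samsung's code; the signals, the correlation search, and the lens-adjustment gain are all stand-in values meant only to show the idea: the two half-pixel views of an out-of-focus subject are shifted relative to each other, and the size and sign of that shift tell the camera how far, and in which direction, to drive the lens.

```python
# Conceptual sketch of phase-detection autofocus with split ("dual") pixels.
# Every value here is made up for illustration purposes.
import numpy as np

def phase_shift(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Find the shift (in pixels) that best lines up the right-photodiode
    signal with the left one, using a simple correlation search."""
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        score = np.dot(left, np.roll(right, shift))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

def focus_adjustment(left: np.ndarray, right: np.ndarray, gain: float = 0.05) -> float:
    """Turn the measured phase shift into a hypothetical lens movement.
    The gain is an arbitrary placeholder for a real calibration value."""
    return gain * phase_shift(left, right)

# Toy example: a blurred edge as seen by the left and right photodiodes.
# The 3-pixel offset stands in for the disparity caused by defocus.
edge = np.convolve(np.concatenate([np.zeros(20), np.ones(20)]),
                   np.ones(5) / 5, mode="same")
left_signal = edge
right_signal = np.roll(edge, 3)

print(phase_shift(left_signal, right_signal))                 # -> -3: size and sign of the defocus
print(round(focus_adjustment(left_signal, right_signal), 2))  # -> -0.15: illustrative lens correction
```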
No pun intended here, but the dual pixel image sensor really shines in low-light situations. The Galaxy S6 already had fantastic low-light performance, but the Galaxy S7 kicks it up more than a few notches. It captures 95 percent more light than the Galaxy S6 thanks to pixels that are 56 percent larger (1.4 microns) and a brighter f/1.7 aperture.
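If you're wondering how those two figures add up to 95 percent, a quick back-of-the-envelope calculation gets you there. The comparison specs below (the Galaxy S6's 1.12-micron pixels and f/1.9 aperture) aren't in this article, so treat this as our own rough math rather than an official breakdown:

```python
# Rough check of the "95 percent more light" claim.
# Assumes the Galaxy S6's 1.12-micron pixels and f/1.9 aperture for comparison.

s7_pixel, s6_pixel = 1.4, 1.12   # pixel pitch in microns
s7_fstop, s6_fstop = 1.7, 1.9    # lens aperture (f-number)

pixel_gain = (s7_pixel / s6_pixel) ** 2     # light per pixel scales with pixel area
aperture_gain = (s6_fstop / s7_fstop) ** 2  # light through the lens scales with 1/f-number^2

print(f"Pixel area:  {pixel_gain:.2f}x  (the '56 percent larger' pixels)")
print(f"Aperture:    {aperture_gain:.2f}x  (f/1.7 vs f/1.9)")
print(f"Combined:    {pixel_gain * aperture_gain:.2f}x, i.e. roughly 95 percent more light")
```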
Combine the dual pixel image sensor with the additional light, and the result is a smartphone camera that can autofocus super fast in any lighting environment.
Take a look at the two videos below to get a good idea of how big an improvement the Galaxy S7 is over the Galaxy S6.
We’re looking forward to getting more hands-on time with both phones to find out if they live up to the hype. Stay tuned.