Images from smartphone cameras seem very hit-or-miss right now. So, what’s going on?
YouTube sensation Marques Brownlee wants to know what is wrong with the iPhone camera lately. Even though he gave the iPhone 14 Pro an award for the best smartphone camera system of 2022, Brownlee says the iPhone has consistently placed in the middle of the pack in blind tests against its competitors.
Sound confusing? That’s because color science is hard.
“A concern some of my people have had lately is that the iPhone camera feels like it’s been taking a few steps back lately,” Brownlee said recently. “And I agree with them.”
What does Brownlee mean by this? Let's break it down.
Smartphone Photos Aren't Really Photos
Brownlee points out that digital cameras have come a long way since they went mainstream in the '90s. He believes that cameras are not just cameras anymore but computers, and we all need to recognize that. To quote Brownlee, “[T]hese days, it’s turned into a computational event.”
Where a traditional camera opens and closes a shutter for a set amount of time and records a single exposure, a smartphone camera does more with software than with the actual hardware. The sensor no longer collects light once and processes it into an image. Instead, the camera takes the picture several times in rapid succession and merges those multiple light samplings into a master data set. The software then performs a series of tweaks, including tone-mapping, noise reduction, and HDR processing, before combining everything into a final image.
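To make the multi-frame idea concrete, here is a minimal Python sketch (an illustration only, not Apple's or Google's actual pipeline): it averages a burst of noisy frames into a master data set, then applies a simple gamma tone curve. Real pipelines also align frames, denoise locally, and blend exposures, but the principle is the same.

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of frames into a low-noise 'master' data set.

    Real pipelines align frames first and merge robustly; a plain
    mean is a simplified stand-in for illustration.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

def tone_map(image, gamma=1 / 2.2):
    """Compress the merged data's dynamic range with a simple gamma curve."""
    normalized = np.clip(image, 0, None) / image.max()
    return (normalized ** gamma * 255).astype(np.uint8)

# Simulate a burst: the same scene captured eight times with sensor noise.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 1023, size=(4, 6))      # idealized 10-bit scene
burst = [scene + rng.normal(0, 30, scene.shape) for _ in range(8)]

merged = merge_burst(burst)   # noise shrinks roughly with sqrt(frame count)
final = tone_map(merged)      # tone-mapped 8-bit output
```

Averaging eight frames cuts random sensor noise by roughly a factor of the square root of eight, which is why night and HDR modes lean so heavily on burst capture.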
Brownlee argues that whenever a new phone comes out, we should pay closer attention to the new features its camera offers, since they are often software-driven rather than hardware-driven. With the latest smartphones, the same techniques are being applied to video, where Cinematic modes are the latest "upgrade."
If It's Not Broken, Don't Fix It
The trouble starts when it does break and you don't fix it. According to Brownlee, the Google Pixel offers a sound example of how this can go awry.
Google had the ideal combination of a 12MP image sensor, processor, and software to make great images. The combination was so good that the company kept the same camera hardware from the Pixel 3 through the Pixel 6a, winning awards for four years with aging hardware.
Then Google decided to up its game and moved to a massive 50MP image sensor on the Pixel 6. The result was over-sharpened, over-saturated (what some people call "HDR-y") image quality. Brownlee says Google was still running software that had been tuned specifically for the original 12MP camera, which produced images that were out of balance and over-processed.
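As a toy illustration of that kind of mismatch (my own simplified example, not Google's actual processing), here is a 1-D unsharp mask in Python. A sharpening strength tuned for one sensor's softer output can overshoot on another's, pushing pixel values past the original range and creating the halo-like, over-processed look Brownlee describes:

```python
import numpy as np

def sharpen(signal, strength):
    """1-D unsharp mask: add back the detail a small box blur removes."""
    padded = np.pad(signal, 2, mode="edge")
    blurred = np.convolve(padded, np.ones(5) / 5, mode="valid")
    return signal + strength * (signal - blurred)

# A hard edge stands in for fine image detail (dark region meets bright one).
edge = np.repeat([10.0, 200.0], 20)

# A strength tuned for a softer sensor, reused on crisper input, overshoots.
oversharpened = sharpen(edge, strength=1.5)

print(edge.min(), edge.max())                    # 10.0 200.0
print(oversharpened.min(), oversharpened.max())  # well below 10, above 200
```

The overshoot and undershoot around the edge are exactly the halos that read as "over-sharpened" in a photo.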
It took Google a year to make the adjustments necessary to dial in the new sensor with the Pixel 7 Pro. Brownlee sees Apple making the same mistakes with the iPhone 14 Pro and its 48MP sensor, which, like the Pixel 6's, produces images that look too processed. He expects Apple to correct course with several software updates and the iPhone 15 Pro, bringing the iPhone's image quality back on track.
A Reflection Of Ourselves
But Brownlee also sees this situation as central to why smartphone cameras handle darker skin tones so poorly. With the majority of his camera tests done with himself as the subject, Brownlee has intimate knowledge of how well cameras across the board render skin tones, sharpness, light fall-off, and other factors. The results are inconsistent from camera to camera.
“Using my face as a subject revealed a lot of how smartphones take a picture of the human face,” Brownlee says. “Smartphone cameras are so much software now that the image you get isn’t so much reality as this computer’s interpretation of what you want reality to look like.”
He believes that camera software pumps up the greens in landscapes and the blues in sky shots, and makes similar adjustments in other situations. Apple, he says, is obsessed with evenly lighting any face its software can identify. Often the result looks just fine, but at other times the iPhone's adjustments look a little off, as though an artificial bounce fill had been added, and sometimes the effect is off-putting.
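For a sense of what that kind of selective tuning looks like under the hood, here is a hypothetical Python sketch (my illustration, not any vendor's actual code) that boosts saturation only for green-to-blue hues, leaving skin-adjacent reds alone:

```python
import colorsys

def boost_landscape_colors(pixel, amount=0.25):
    """Hypothetical example of selective color tuning: bump saturation
    for green/blue hues only. Not any smartphone vendor's real code.
    """
    r, g, b = (c / 255 for c in pixel)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Hues from roughly 0.2 to 0.7 cover greens through blues.
    if 0.2 <= h <= 0.7:
        s = min(1.0, s * (1 + amount))
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(c * 255) for c in (r, g, b))

print(boost_landscape_colors((40, 120, 60)))   # foliage green gets more vivid
print(boost_landscape_colors((200, 60, 50)))   # skin-adjacent red is untouched
```

A real pipeline applies such adjustments per region with far more sophistication, but the takeaway is the same: the output is a tuned interpretation, not a neutral record.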
As we stated in a previous article, smartphone companies trained their camera algorithms predominantly on lighter skin tones in the early days. Now, companies like Google add calibration data covering a range of skin tones so their image adjustments work for everyone. Google calls its process "Real Tone," and that is a good thing.
One of the reasons Brownlee says he is attracted to RED cinema cameras is their color science, which the YouTuber believes renders multiple skin tones more accurately. Do smartphone makers have a ways to go? Sure, but Brownlee says they are making strides, and that's worth acknowledging.
Let us know what you think in the comments below!