
I revisited the iPhone’s Portrait mode, and I don’t like what I see


Apple is gearing up to announce a bunch of products at its September 7 event, and chief among them will be the iPhone 14 series. We’re expecting a refined design, a new chipset, and — of course — battery improvements.

Portrait mode is one of the most used camera features on smartphones, and it's something Apple brought into the spotlight with the iPhone 7 Plus. With the iPhone 14 right around the corner, I thought it was a good time to pit the iPhone 13 Pro Max against the Galaxy S22 Ultra to see how their Portrait modes perform — and what Apple needs to get right on the next flagship.

Here’s how the current Apple offering performed against the best of Samsung.

Humans

Judging the first two shots, the iPhone 13 Pro Max got the tonality right. It is closer to real-life colors, whereas the Galaxy S22 Ultra overexposes the image. But the second set of shots shows that the iPhone image is more contrasty. It's a tricky shot: light from the window on the left falls on half the face, while the other half is underexposed.

Samsung tackles this by overexposing the overall image, which, in my opinion, works better for social media. On the other hand, the iPhone shot has a more natural bokeh. If I had to pick a shot to post on social media, I would go with the Galaxy S22 Ultra, but the iPhone 13 Pro Max gets the colors right. The latter also does better with edge detection, which is apparent from the hair in the first two shots.

Overall, for the iPhone 14 Pro series, if Apple could improve how it exposes portrait shots in tough lighting, that would be a home run, as everything else is on point.

Food

When comparing the next four shots, you will notice two major things. First, the iPhone can't get Portrait mode right on food. The edge detection is all over the place, blurring out the bowl and plate in the third image. Moreover, if you look closely, the first image is soft and lacks the detail present in the shot from the Galaxy S22 Ultra.

Second, the Galaxy S22 Ultra can take a few seconds to get Portrait mode right on food. And if you are in a hurry to grab the shot, the device just won't apply the bokeh, which you can see in the second image.

In both images from the iPhone, you'll notice that it can't get the subject right, as it blurs out the top-right corner of the items on the tray. By contrast, when the Galaxy S22 Ultra gets the bokeh right (within a few seconds), the edge detection is noticeably better than on the iPhone 13 Pro Max. It's as if the iPhone is struggling to identify the subject while the Samsung phone figures it out.

The iPhone 14 needs to get food portraits right. As it currently stands, this is a department where the Galaxy S22 Ultra has a solid lead.

Objects

The above four shots suggest that you can shoot objects in Portrait mode on either the Apple or the Samsung flagship and get eye-pleasing results. That may also be because the object in the third and fourth photos sits in a single plane, making it easier for both devices to identify the subject.

But in the first two shots, the 'Best Seller' sign isn't in a single plane, and the phones still get it right. In all the pictures, the bokeh looks natural, the edge detection is on point, and the object in focus is sharp. Everything you could want from a portrait image, both devices deliver when shooting objects.

While the iPhone struggled to identify the food items in the previous section (probably because there were several items on the plate), it clearly has no trouble identifying the subject when it's a simpler object.

The iPhone 14’s Portrait mode needs improvement

iPhone 13 Pro Max and Galaxy S22 Ultra rear panels.
Prakhar Khanna/Digital Trends

In my experience using both phones, the iPhone loses out to Samsung on edge detection in Portrait shots. After looking through the 12 images above, it's clear that Apple needs to work on image processing to improve Portrait shots of humans and food, and edge detection on food needs to improve by miles. While Portrait shots of humans are mostly true to the environment, they can get a bit contrasty, with heavy shadows. I hope the iPhone 14 Pro delivers better results than the Galaxy S22 Ultra and other flagship smartphones on the market.

It'll be interesting to see the difference in image processing between the standard iPhone 14 lineup and the iPhone 14 Pro series, especially because the former is said to use last year's processor. Because of that, you might miss out on camera algorithm tweaks if you're planning to get the iPhone 14 or iPhone 14 Max, while we expect the major camera improvements to land on the iPhone 14 Pro and iPhone 14 Pro Max.

There's talk of a new 48MP primary camera, in addition to an improved ultrawide lens. Combined with a next-generation chipset and better image processing, the iPhone 14 Pro could be Apple's biggest leap yet in taking Portrait mode photos to the next level. And considering how lackluster the iPhone 13 Pro can be in that regard, I certainly hope it is.

