Comparing Portrait mode on the iPhone 11 and iPhone 13 Pro. Night photography works wonders

The iPhone 11 has long held the title of the most popular smartphone in Russia. Many of the iPhone's fundamental photography features debuted with it: Deep Fusion, Night mode, and the ultra-wide camera.

But they all worked with restrictions: not on all cameras and not under all conditions. Most likely this is because they were first-generation features, and the A13 Bionic processor was not fully tailored for them.

The iPhone 13 Pro takes better pictures than any other iPhone. Thanks to the flagship A15 Bionic chip, all of its smart camera features and extras kick in under any conditions.

There is a reliable way to check how much this coordinated work of software and hardware has improved over two years.

Take a lot of shots in Portrait mode, which you reach by swiping from right to left in the Camera app.


Portrait mode on iPhone 13 Pro

For anyone unfamiliar with it: algorithms find the subject, build a depth map, and blur the background, just like large full-frame camera systems do.
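To make that pipeline a bit more concrete, here is a rough Swift sketch (not Apple's actual implementation): Portrait photos embed an auxiliary portrait effects matte, white on the subject and black elsewhere, and Core Image can use it to blur only the background. The function name, the radius value, and the assumption that `photoURL` points to a Portrait-mode HEIC are all illustrative.

```swift
import Foundation
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch of "find the subject, then blur the rest" using the matte that
// Portrait photos already carry. Not the on-device Portrait pipeline.
func blurredBackground(photoURL: URL) -> CIImage? {
    guard
        let photo = CIImage(contentsOf: photoURL),
        let matte = CIImage(contentsOf: photoURL,
                            options: [.auxiliaryPortraitEffectsMatte: true])
    else { return nil }

    // The matte is stored at a lower resolution than the photo itself.
    let scale = CGAffineTransform(
        scaleX: photo.extent.width / matte.extent.width,
        y: photo.extent.height / matte.extent.height)

    // Invert the matte so the background (not the subject) becomes bright.
    let invert = CIFilter.colorInvert()
    invert.inputImage = matte.transformed(by: scale)

    // CIMaskedVariableBlur blurs harder where the mask is brighter.
    let blur = CIFilter.maskedVariableBlur()
    blur.inputImage = photo
    blur.mask = invert.outputImage
    blur.radius = 12 // purely illustrative strength, roughly an "f/4.5" look
    return blur.outputImage
}
```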

For the experiment, I covered as many scenarios as possible (9 in total) in which people reach for this mode: in a bar, against a backdrop of bright signs or branches, under artificial lighting, with the bright sun shining from behind, in pitch darkness, and so on.

In some photos the difference is so huge that I have to give a warning.

⚠️ Any adjustments to color, light, and blur were made only in the Photos app and duplicated across both shots with identical numerical settings. Everything you see below is the raw difference, with no attempt to make either photo look better or worse.

Original photos are here.

What smartphone cameras can do

The iPhone 11 was released in September 2019. It costs from 49,990 rubles.

Its main camera supports an intelligent Deep Fusion mode, which, in dark conditions, pulls out details by stitching together multiple photos. Night mode is only available on the main camera; it does not work when shooting portraits.
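There is no public switch labeled Deep Fusion; the closest thing a third-party app gets is AVFoundation's photo quality prioritization (iOS 13 and later), which lets the system apply its multi-frame processing when it decides the scene needs it. A minimal sketch, assuming the capture session and `photoOutput` are already configured:

```swift
import AVFoundation

// Let the system apply its heaviest multi-frame processing (Deep Fusion,
// Smart HDR) when it sees fit, and request the depth data Portrait mode
// relies on if the active camera provides it. Sketch only.
func makePortraitSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    photoOutput.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .quality

    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
        settings.isDepthDataDeliveryEnabled = true
    }
    return settings
}
```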

Specifications of the iPhone 11's two cameras:

  • Ultra-wide 0.5x: 13 mm equivalent, ƒ/2.4, 12 MP, 1/3.4-inch sensor
  • Main (wide) 1x: 26 mm equivalent, ƒ/1.8, 12 MP, 1/2.55-inch sensor

The iPhone 13 Pro was released in September 2021. It costs from 99,990 rubles.

All cameras support Deep Fusion and Night photography, including Portrait mode.

Specifications of the iPhone 13 Pro's three cameras:

  • Ultra-wide 0.5x: 13 mm equivalent, ƒ/1.8, 12 MP, 1/3.4-inch sensor
  • Main (wide) 1x: 26 mm equivalent, ƒ/1.5, 12 MP, 1/1.65-inch sensor
  • Telephoto 3x: 77 mm equivalent, ƒ/2.8, 12 MP, 1/3.4-inch sensor

To keep the comparison fair, I used only the main wide-angle camera on both smartphones. It reflects the photo capabilities of the standard iPhone 13 as well, and shows the gap from the entire iPhone 11 line, not just the Pro models.
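For reference, pinning the comparison to that single 1x module is also how it looks in code: AVFoundation exposes it as the built-in wide-angle camera, separate from the virtual dual- and triple-camera devices. A small hedged sketch:

```swift
import AVFoundation

// Select only the 1x wide-angle module on the back, ignoring the
// virtual multi-camera devices that switch lenses automatically.
func wideAngleCameraInput() throws -> AVCaptureDeviceInput? {
    guard let wide = AVCaptureDevice.default(.builtInWideAngleCamera,
                                             for: .video,
                                             position: .back) else {
        return nil // should not happen: every iPhone has this module
    }
    return try AVCaptureDeviceInput(device: wide)
}
```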

Contrasting light in the morning

On the left is the iPhone 13 Pro, on the right is the iPhone 11. The second photo looks flatter. Pictured is Rodion

Both smartphones easily separated the model from the background, although the black hood was "cut out" better by the iPhone 13 Pro. It still has a small error on the right side, but it is not noticeable, unlike the result from the iPhone 11, which blurred the left side of the jacket. That slip is masked by the sun glare.

But there is one difference between the pictures.

In mobile photography, algorithms stitch several shots of different brightness into one, so that both the face and the background are visible in the frame. The technology is called HDR, but on the iPhone it carries the Smart HDR name (in different generations), because it puts extra emphasis on faces, adding natural volume to them. If even that is not enough, Portrait mode has a built-in "Studio Light" effect that fills in the missing shadows.
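Smart HDR and the Portrait Lighting effects are applied by Apple's own pipeline and cannot be toggled directly, but the raw ingredient they start from, several frames at different brightness, can be captured by hand with AVFoundation's bracketed capture. A hedged sketch; the bias values and the assumption that `photoOutput` belongs to a running session are illustrative:

```swift
import AVFoundation

// Capture an under-, normally and over-exposed frame: the kind of
// material an HDR merge works from. Not Apple's Smart HDR itself.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2.0, 0.0, 2.0] // exposure bias in EV stops
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // processed (non-RAW) frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```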


Shot on iPhone 13 Pro. I circled a bright area that is almost absent in the photo from the iPhone 11

The new generation of Smart HDR 4 in the iPhone 13 Pro laid a light-and-shadow pattern over the face so that it appears more voluminous and textured. The backlight was bright on the right side of the head, and the A15 Bionic rendered this correctly, while the A13 Bionic decided to smooth out the bright area of the face.

This small difference turned a flat photograph into a dynamic story and brought the image closer to the quality produced by full-frame cameras.

Don't: Shoot against a light background in direct light.


Mirror, overhead lighting and long hair?
Not my best decision. This is the biggest mistake I've seen people make when taking Stage Light portraits: light backgrounds are incredibly difficult for Portrait mode to interpret. That includes windows, mirrors, white walls, and so on: anything that reflects light onto you will also reflect light into your iPhone's camera sensor and confuse it. For friends with long hair, a light background also tends to bleed through the hair strands in Stage Light mode, creating strange white flecks around an otherwise nice matte.

Instead, consider a background without direct lighting, as mentioned above. And if you have any overhead lighting, make sure you're shooting in a room with dark walls or background.

Branches: the main problem for Portrait mode

Left iPhone 13 Pro, right iPhone 11

But the iPhone 13 Pro cut them out more neatly

Portraits taken against a backdrop of trees have one rule: bring the camera as close to your face as possible.

In the photo above, both smartphones did a poor job, and in the iPhone 11 this is more noticeable if you crop the photo to a close-up frame.

The depth map is correct. The branches above are not blurred as much as the building, and the grass fades into blur gradually.

If you don’t want the same problems, then come closer to the person.

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/4.5. In close shots, the separation of hair from trees is much better


Shot on the telephoto (3x) camera of the iPhone 13 Pro. Blur F/4.5. For a similar composition, it was necessary to move further and sit lower

It's immediately clear that both devices made the right separation, although on the iPhone 11 the hair on the right disappeared along with the wall. And the iPhone 13 Pro took brighter photos.

However, even with telephoto, you most likely will not be able to accurately isolate the subject.

Do: Shoot straight


Slight angles (left and middle) are okay, but be careful when shooting in profile or with too much of the body at an odd angle. While most other Portrait Lighting modes make it easy to shoot subjects at an angle, Stage Light needs a single point of focus on the subject to work most effectively. You can angle your body a little, but if your arm is extended behind you, don't expect it to make it into the final frame.

Selfies: the comparison comes down to the processor


Shot on the front camera of the iPhone 13 Pro. Blur F/4.5, Contour Light [25]


Shot on the front camera of the iPhone 11. Blur F/4.5, Contour Light [25]

The front camera is the only one Apple has not technically changed in four years.

But progress is still being made, because the algorithms and the chips on which they work are constantly being improved.

Strangely, the photo from the iPhone 13 Pro, which should have Deep Fusion, looks softer than the photo from the iPhone 11's front camera, which does not. And that turns out to be a good thing: the skin does not look as harsh and contrasty as on the iPhone 11, and large pores are less noticeable.

Smart HDR 4 in the iPhone 13 Pro “pulled out” more of the sky, without reducing it to a white canvas.

Both smartphones handled the clipping (cutting out the background) well until it reached the hair. Here both missed some strands on the left.

There is an interesting detail around the glasses, where the distortion in the left lens takes in part of the building. The iPhone 11 decided to blur it, the iPhone 13 Pro left it untouched. The optically correct reality is somewhere in between, although in this case the photo from the older model looks more natural.
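That hair clipping comes from segmentation mattes the capture pipeline computes alongside the photo; on supported devices (iOS 13 and later) an app can request them and inspect exactly which strands were kept. A sketch under the assumption that the capture session is already set up:

```swift
import AVFoundation

// Ask for the hair and skin mattes when the current camera can produce them,
// so the photo delegate can later inspect how the hair was separated.
func requestSegmentationMattes(output: AVCapturePhotoOutput,
                               settings: AVCapturePhotoSettings) {
    let wanted: [AVSemanticSegmentationMatte.MatteType] = [.hair, .skin]
    let available = output.availableSemanticSegmentationMatteTypes
    output.enabledSemanticSegmentationMatteTypes = wanted.filter(available.contains)
    settings.enabledSemanticSegmentationMatteTypes = output.enabledSemanticSegmentationMatteTypes
}

// Later, inside the AVCapturePhotoCaptureDelegate callback:
// let hairMatte = photo.semanticSegmentationMatte(for: .hair)?.mattingImage
```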

You can: play with Stage Light aberrations


How else could you achieve the greatness of the Cat, Prince of Darkness (right)?
Yes, I know: isn't the point of this guide to avoid weird lighting? But despite Stage Light's true purpose, the effect (especially Stage Light Mono) can create eerie works of beauty when you least expect it. And what fun is a design quirk if you don't take advantage of it? In this case, the silliness of Stage Light Mono places the vignette on the surrounding world, with the central light illuminating whatever it thinks might be the subject. This can appear whenever you shoot with Stage Light, but I've had the best luck reproducing it with images that don't clearly show a face.

Indoors during the day, the iPhone 11 fails

The iPhone 13 Pro consistently shoots brighter, richer, and slightly more optically correct photos.

The window behind the model is rendered correctly only on the left: it should be slightly out of focus, but blurred evenly.

Unlike the current flagship, the iPhone 11 left part of the frame sharp, and because of this a halo can be seen around the head.

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/4.5. The busy multi-colored backdrop makes it difficult to cut out the background correctly in the second photo

iPhone 11 has jagged edges and much more noise due to half as much light entering the lens

In this example everything is the same as with the branches, but only the older model suffers. The many small objects leave ugly artifacts along the outline of the head. I suspect LiDAR helped the newer phone here.

Added to this problem on the iPhone 11 is the slower lens, which is why we see strong grain and lost detail. The face looks flat, unlike in the shot from the iPhone 13 Pro.

Is it possible to remove the blurred background effect from a photo?

Apple lets you remove the blurred-background effect from a photo if it turns out badly. The corresponding "Portrait" toggle appears when you tap the Edit icon.

Indoors in the evening, the iPhone 11 sometimes holds up


Pictured is Masha

After sunset, objects outside the window look more like strange silhouettes, and artificial light can deceive even the human eye.

Both smartphones correctly handled the fact that the T-shirt should be slightly blurred, but the jeans and face would remain equally sharp because they were on the same focusing plane.

It is also correct that the window frame is almost in focus, because it sits just behind the model, while the distant branches in blue light need to be blurred much more.

Although both made mistakes: the iPhone 13 Pro took part of the bench for background in the window, while the iPhone 11 essentially treated the frame as part of the window and blurred it slightly.

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/4.5. iPhone 13 Pro better detects hair


The Portrait mode preview is usually worse than the output version, so you don't have to worry about crooked edges as you shoot.

Left iPhone 13 Pro, right iPhone 11. Blur F/1.4

Above you can see the only photo that I liked better on the iPhone 11. The contrast on Masha’s face and the amount of light look more solid in the example on the right.

It may seem that the blur at F/1.4 is too weak, but this is exactly how an "adult" camera with a wide-angle lens would behave: when the subject in focus is far away, there is almost no strong blur anywhere in the frame.

The example with the iPhone 13 Pro is especially realistic, because in the photo from it the yellow light in the distance is more blurred.

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/4.5. Pictured is Polina

We shot this one in a completely dark bar and got two very different images.

Even without activating Night mode, the iPhone 13 Pro collected more light thanks to its larger sensor and brighter lens. This resulted in less noise.

Most likely, LiDAR correctly recognized the objects and allowed the newer model to create more realistic blur. The picture from the iPhone 11 looks downright bad, whereas I would happily publish the one taken on the iPhone 13 Pro.

Needed: have an overhead light (if you have a dark background)


The original (left) and its Stage Light Mono version (right). Perhaps unsurprisingly, Stage Light works best in environments with good overhead lighting and a dark background. I shot portraits upstairs in a black stairwell with overhead lighting; while both look great, the Stage Light effect adds an extra layer of contextual lighting to the subject's face, giving it a warmer and less washed-out look.

iPhone 13 Pro's gradual blurring of railings and tables is better

The amount of noise on the jacket and scarf in the photo taken with the iPhone 11 is striking. The example on the left is more contrasting, and it is clear that the iPhone 13 Pro felt more confident in these conditions.

But that comes down to aperture; here another part of the frame is much more important.

Portrait mode existed in smartphones before the iPhone 7 Plus, where it debuted in Apple devices. Competitors did it poorly because they overlooked the key detail that creates all the realism.

Natural blur in a photograph is gradual: along a longitudinal object (a table, wall, or board) the focus weakens and strengthens in stages, and its peak falls on a short section of the image.
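The "gradual" part can be made concrete with the standard thin-lens approximation for the blur-circle diameter. The Swift sketch below uses purely illustrative numbers (the 26 mm equivalent focal length and the simulated f/4.5, not the iPhone's real optics) and shows how blur grows smoothly as an object moves away from the focus distance:

```swift
import Foundation

// Blur-circle diameter for an object at distance x when a lens of focal
// length f (mm) and f-number n is focused at distance s (mm). Thin-lens
// approximation; illustrative only.
func blurCircleDiameter(f: Double, n: Double, s: Double, x: Double) -> Double {
    (f * f / (n * (s - f))) * abs(x - s) / x
}

// A tabletop running from 0.8 m to 3 m with the subject in focus at 1.5 m:
for x in stride(from: 800.0, through: 3000.0, by: 200.0) {
    let c = blurCircleDiameter(f: 26, n: 4.5, s: 1500, x: x)
    print(String(format: "%.1f m -> blur circle %.3f mm", x / 1000, c))
}
```

The blur is zero at the focus distance and ramps up smoothly on both sides of it, which is exactly the staged weakening and strengthening of focus described above.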

Natural blur obtained from the main camera of the iPhone 13 Pro

In this aspect, Apple showed everyone how to do it, but there is a flaw.

Look at an example of natural blur. The photo below was taken with the main camera of the iPhone 13 Pro, and it shows that not only the background is blurred, but also the near edge of the tabletop.

Now take a look at the Portrait mode images captured on the iPhone 11 and iPhone 13 Pro:

Left iPhone 13 Pro, right iPhone 11. Blur F/4.5

In the photo from the older smartphone, two things give away the artificiality.

First, the table is blurred across the entire frame, although unevenly. At the depth where the model's head sits, it should naturally be sharp.

Second, perspective makes it clear that the tree falls out of focus, yet the hand at the same distance from the camera stays sharp, as if it were cut out and pasted on top of the background. The iPhone 13 Pro handles this more realistically, blurring the fingers slightly at values closer to F/1.4.

On the iPhone 13 Pro, the tree at F/16 retains its original sharpness, as it should


The texture on the iPhone 11 immediately “dissolves” even at F/16

The difference here, of course, is the presence of LiDAR, which helps create a more accurate depth map.
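On LiDAR-equipped iPhones you can request a dense depth map of this kind directly through ARKit's scene depth API; the sketch below shows roughly the sort of data that informs such decisions (Apple's actual Portrait pipeline is private, so treat this as an approximation):

```swift
import ARKit

// Run a world-tracking session that streams LiDAR scene depth; the delegate
// receives a per-frame depth map (metres per pixel) in frame.sceneDepth.
func startDepthSession(delegate: ARSessionDelegate) -> ARSession? {
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        return nil // no LiDAR on this device, e.g. an iPhone 11
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.frameSemantics.insert(.sceneDepth)

    let session = ARSession()
    session.delegate = delegate
    session.run(configuration)
    return session
}

// In the ARSessionDelegate:
// func session(_ session: ARSession, didUpdate frame: ARFrame) {
//     let depthMap = frame.sceneDepth?.depthMap // CVPixelBuffer of distances
// }
```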

The examples show that this is a small detail, but it is precisely details like this that make people sense something is off and that the photo was clearly taken on a phone.

Portrait Capabilities

The modern Portrait feature on the iPhone 7 Plus and newer models can:

  • enhance the picture with lighting effects;
  • take a bright selfie without outside help;
  • create a depth-of-field (blurred background) effect;
  • "catch" focus quickly and accurately.

Using these options for the perfect shot is easy; just follow the simple instructions below.

LiDAR helps with complex objects

On the left is the iPhone 13 Pro, on the right is the iPhone 11. On the left, the steel frame blurs gently and gradually; on the right it immediately turns into a gray blob

In the example above, Masha stepped inside a door frame with no glass in it. These are difficult conditions for Portrait mode, because for a realistic result the algorithms must build a depth map that keeps in focus not only the person but also a large part of the scene.

The result was two completely different photographs, each with its own problems.

The iPhone 11 recognized that the photo was taken from above, so it blurred the legs, but it sent the entire length of the gray beams into the blur along with them.

The iPhone 13 Pro realized that almost the entire door should be left in focus, but at the same time, for some reason, it left the left side of the floor and the right side of the wall sharp. Both should have been blurred.

In situations where objects are far apart, the result looks natural on iPhone 13 Pro thanks to the additional sensor.

Notice the pillar on the right. In the photo from the iPhone 13 Pro, it is not as blurry as on the iPhone 11. As befits a subject standing closer to the camera, it retains the texture better.

The iPhone 13 Pro also sees large objects better.

The car stays crisply sharp where it lies on the same plane as the head, while the iPhone 11 even blurs the handle above the door and the side glass.

And the shot from the newer smartphone came out bright and rich, as if those parameters had been boosted in Photoshop compared to the example on the right.

Artificial lighting saves the iPhone 11 in the late evening

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/5.6, Contour Light [31]. Pictured are Ksyusha and Polina


Shot on the telephoto (3x) camera of the iPhone 13 Pro. Blur F/4.5

Again we see a different color cast, but here the culprit is the red traffic light, which shifted the iPhone 11's white balance, so its photo came out a little colder.

You can tell by the sign on the right, which on the newer smartphone is closer to yellow than to blue. There is also less noise here, and more detail.

By the way, it's better not to shoot with the telephoto in such conditions. The result comes out too soft and mushy, even compared to the iPhone 11.

Can: Shoot in low light

Darker backgrounds with ambient light (top and bottom right) make the Stage Light effect look much better than brighter backgrounds and overhead lights (top and bottom left). One of the strangest (and best) tips I have for shooting Stage Light portraits is to shoot in medium-to-low-light areas. Ideally, you want areas that aren't lit themselves but have ambient light coming in: the best spot I've found for shooting Stage Light portraits is my hallway, opposite the lit kitchen. The kitchen light provides enough contextual light to keep my face bright, but the hallway remains mostly dark; this lets the Stage Light effect easily separate the subject from the background.

In complete darkness, the iPhone 13 Pro gets a second wind

On the left is iPhone 13 Pro, on the right is iPhone 11. Blur F/1.4. Pictured is Masha

Starting with the iPhone 12, Portrait mode supports night photography: the smartphone holds the exposure for a few seconds to capture more light.

Throughout the previous sections we saw that in the dark the iPhone 13 Pro consistently produced more realistic colors, but here the difference is simply enormous.

The iPhone 11 can't cope at all, producing a noisy green result, while the newer model paints individual areas in natural shades, delivers a clear image, and correctly understands what needs to be blurred. Importantly, there is zero color correction here; this difference comes straight out of the standard Photos app.

Just so you understand, it was completely dark outside, downright black, and even the glare in the eyes could not be distinguished.

Is it possible to get studio quality photos?

It's worth thinking not only about how to enable Portrait mode on the iPhone 8 Plus, but also about the extra capabilities of a modern device. On this model, as well as on the iPhone X and later, you can use additional effects to achieve studio-like quality:

  • Studio Light will brighten facial features;
  • Contour Light will create the effect of directional lighting;
  • Stage Light will highlight the subject with "spotlight rays";
  • Stage Light Mono will turn everything black and white.

Experiment with your photos. Save the resulting version. And share your pictures on social networks.
