jrockway 7 days ago

The iPhone blurred background is completely synthetic. It uses multiple cameras to build a depth map of the scene, and then blurs whatever isn't at the depth of the subject of the photo.
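The idea can be sketched in a few lines of numpy. Everything here (function names, the linear blend, the depth tolerance) is invented for illustration; it's the general technique, not Apple's actual pipeline:

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur: average each pixel over a (2*radius+1)^2 window."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def synthetic_bokeh(img, depth, subject_depth, tolerance=0.1, radius=4):
    """Blend a blurred copy back in wherever the depth map says the
    pixel is far from the subject's depth plane (hypothetical scheme)."""
    blurred = box_blur(img, radius)
    # Blur weight: 0 at the subject depth, ramping to 1 past the tolerance.
    weight = np.clip(np.abs(depth - subject_depth) / tolerance - 1.0, 0.0, 1.0)
    return img * (1.0 - weight) + blurred * weight
```

Pixels whose depth matches the subject pass through untouched; everything else gets progressively replaced by the blurred copy. The hard part, of course, is the quality of the depth map itself.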

If you're asking "how do you do it": you can select "Portrait" when taking the photo, or open the photo in your gallery after the fact, pick "Edit", pick "Portrait", and choose a fake aperture ("f/1.4") and a focus point. The results are ... mid.

can16358p 6 days ago | parent | next [-]

It also fails miserably with long hair unless it's all tied up tightly.

Still, it's good to see how well it otherwise turns out for a small phone in your pocket.

Melatonic 6 days ago | parent | prev | next [-]

You can also get decent real bokeh by using the main lens and focusing it as close as it can go (place the subject at the correct distance), or by doing the same thing with the 5x telephoto.

retinaros 6 days ago | parent | prev [-]

The results aren't mid; they'd be equivalent to the beginner photographer's in the article.

jrockway 6 days ago | parent [-]

I mean, you can try it. I took a random photo from my library: myself and a friend taking a selfie at a museum. I added the blur, and there is no way to make both of us sharp while keeping the background blurred. Our depths differ by that much (inches), I suppose. With a DSLR you could at least correct this before taking the picture, by adjusting the composition of the shot. Doing it after the fact with a questionable depth map... not as good.

One could take the depth map and use it to mask off two areas to keep sharp, which could potentially give better-than-real-camera results. The iPhone does not do this, however; you'd have to write your own program to do it.
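The two-subject mask is a small tweak on the single-subject case: keep a pixel sharp if its depth is close to *any* of the listed subject planes. A minimal sketch, with the function name and thresholding scheme invented for illustration:

```python
import numpy as np

def multi_subject_weight(depth, subject_depths, tolerance=0.1):
    """Blur weight that is 0 near ANY of the listed subject depths,
    so two people at slightly different depths both stay sharp.
    Hypothetical helper; not part of any real camera API."""
    # Per-pixel distance to the nearest subject plane.
    dist = np.min([np.abs(depth - d) for d in subject_depths], axis=0)
    # 0 inside the tolerance band around each plane, ramping to 1 outside.
    return np.clip(dist / tolerance - 1.0, 0.0, 1.0)
```

Feed this weight into the same sharp/blurred blend as before and both subjects stay in focus while the background is still synthetically blurred, which a real lens at a wide aperture physically cannot do.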