bborud 2 days ago

Computing lens blur ought to be easier to achieve with more modern camera systems that add a LIDAR to capture a depth map. Does, for instance, Apple use their LIDAR system on the iPhone to do this?

klysm a day ago

Yeah, once you have depth, faking it is a lot easier. I find this most interesting from a correction perspective.

Analemma_ a day ago

They do, and it has made some noticeable improvements. Compared to when it first came out, "portrait mode" on recent iPhones is a lot less likely to blur individual hairs on a person, or to keep the background seen through a lock of hair in focus. But IIRC the iPhone LIDAR can only distinguish something like 16 depth layers, and at the end of the day the blurring is still computational and "fake"; I don't know whether it will, or even can, ever reach parity with what a large lens can do.
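For anyone curious what that layer-based approach looks like, here's a rough sketch of depth-map-driven synthetic blur: quantize depth into a handful of layers, blur each layer in proportion to its distance from the focal plane, and composite. This is just an illustration under those assumptions, not Apple's actual pipeline; the image/depth arrays and the focus_depth, num_layers, and max_blur parameters are made up for the example.

    # Minimal sketch of layer-based synthetic depth of field (not Apple's pipeline).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def synthetic_dof(image, depth, focus_depth, num_layers=16, max_blur=8.0):
        """image: HxWx3 float array; depth: HxW array on the same scale as focus_depth."""
        d_min, d_max = depth.min(), depth.max()
        span = d_max - d_min + 1e-9
        # Quantize the depth map into coarse layers (the "16 depth layers" idea).
        layer_idx = np.clip(((depth - d_min) / span * num_layers).astype(int),
                            0, num_layers - 1)
        out = np.zeros_like(image)
        weight = np.zeros(depth.shape)
        for k in range(num_layers):
            mask = (layer_idx == k).astype(float)
            if mask.sum() == 0:
                continue
            # Blur radius grows with distance from the focal plane.
            layer_depth = d_min + (k + 0.5) / num_layers * span
            sigma = max_blur * abs(layer_depth - focus_depth) / span
            # Blur the layer and its mask, then composite to avoid hard seams.
            blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
            blurred_mask = gaussian_filter(mask, sigma=sigma)
            out += blurred * blurred_mask[..., None]
            weight += blurred_mask
        return out / np.maximum(weight[..., None], 1e-6)

With only a few coarse layers, the blur jumps in discrete steps across depth instead of varying continuously the way a real lens does, which is part of why fine structures like stray hairs end up on the wrong side of a layer boundary.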