curiousObject 4 days ago

If that’s true, maybe it would let you put a 10,000-camera array (100x100) on a smartphone and do interesting things with computational imaging?

bhaney 4 days ago | parent | next

Some rough numbers:

The paper says that reconstructing an actual image from the raw data produced by the sensor takes ~58ms of computation, so doing it for 10,000 sensors would naively take around ten minutes, though I'm sure there's room for optimization and parallelization.
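
To spell that out (a rough sketch, assuming the ~58ms figure applies per sensor and nothing runs in parallel, which is obviously pessimistic for a multi-core phone):

    # naive serial reconstruction time for a 100x100 array
    per_sensor_ms = 58            # per-sensor reconstruction time from the paper
    sensors = 100 * 100
    total_s = per_sensor_ms * sensors / 1000
    print(f"{total_s:.0f} s, ~{total_s / 60:.1f} min")  # 580 s, ~9.7 min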

The sensors produce 720x720px images, so a 100x100 array of them would produce 72,000x72,000px images, or ~5 gigapixels. That's a lot of pixels for a smartphone to push around and process and store.
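
And the pixel count, assuming the 100x100 sub-images tile with no overlap (a real array presumably wouldn't):

    side_px = 720 * 100                      # 72,000 px per side
    gigapixels = side_px ** 2 / 1e9
    print(f"{side_px} x {side_px} px, ~{gigapixels:.1f} gigapixels")  # ~5.2 GP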

fragmede 4 days ago | parent

72,000 * 72,000 pixels * (say) 24 bits per color * 3 colors ≈ 43 GiB per image.

edit: mixed up bits and bytes
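
Spelled out, with the caveat that this assumes a deep 24-bit-per-channel encoding; ordinary 8-bit-per-channel RGB would be a third of that:

    pixels = 72_000 * 72_000
    bytes_per_pixel = 24 * 3 // 8            # 24 bits per color x 3 colors = 9 bytes/px
    size_gib = pixels * bytes_per_pixel / 2**30
    print(f"~{size_gib:.1f} GiB per image")  # ~43.4 GiB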

bhaney 4 days ago | parent

Careful with your bits vs bytes there

fragmede 4 days ago | parent

edited, thanks!

jajko 4 days ago | parent | prev

Sensor size is super important for resulting image quality; that's why pros still lug around huge full-frame cameras (even if mirrorless) instead of just shooting on phones. There are other reasons too, e.g. speed for sports, but let's keep it simple (and speed is also affected by the amount of data processed, which goes back to resolution).

Plus, higher-resolution sensors have this nasty habit of producing very large files, which slow down processing on a given device compared to smaller, crisper photos and take up much more space, even more so for video. That's probably why Apple stuck with a 12 MP main camera for so long, even though 200 MP sensors were already available.
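
To put rough numbers on the file-size point (uncompressed 8-bit RGB frames, purely illustrative since real files are compressed, but the ratio is what matters):

    for megapixels in (12, 200):
        mib = megapixels * 1e6 * 3 / 2**20   # 3 bytes per pixel for 8-bit RGB
        print(f"{megapixels:>3} MP ~ {mib:.0f} MiB per uncompressed frame")
    # 12 MP ~ 34 MiB, 200 MP ~ 572 MiB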