genewitch 11 days ago

I had two phones with 108MP sensors, and while you can zoom in on the resulting image, the details are suggestions rather than what I would consider pixels.

Whereas a $1500 15MP Nikon from 20 years ago is real crisp, and I can put a 300mm lens on it if I want to "zoom in".

Even my old Nikon 1 V1, with its 12MP cropped sensor, takes "better pictures" than the two 108MP phone cameras.

But there are uses for the pixel density, and I enjoyed having 108MP for certain shots, though otherwise I didn't use that mode in general.

throwanem 11 days ago | parent [-]

Yeah, that's the exact tradeoff. 108MP (or whatever the real photosite count is that they're shift-capturing or otherwise trick-shooting to get that number) on a sensor that small is genuinely revolutionary. But when you only give that sensor as much light to work with as a matchhead-sized lens can capture for it, there's no way to avoid relying very heavily on the ISP to yield an intelligible image. Again, the ISP does an incredible job for what little it's given to work with - but doing so requires that it be what we could fairly call "inventive," with the result that at anywhere near 100% zoom, "suggestions" are exactly what you're seeing. The detail is as much computational as "real."
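
For a rough sense of scale (the sensor dimensions below are my assumptions - roughly a 1/1.33"-type phone sensor versus an APS-C DSLR sensor - so treat this as a back-of-the-envelope sketch, not a spec sheet):

    # Back-of-the-envelope pixel-pitch comparison. Sensor dimensions are assumptions:
    # ~9.6 x 7.2 mm for a 1/1.33"-type phone sensor, ~23.6 x 15.8 mm for APS-C.
    import math

    def pixel_pitch_um(width_mm, height_mm, megapixels):
        """Approximate pixel pitch in microns, assuming square photosites."""
        area_mm2 = width_mm * height_mm
        return math.sqrt(area_mm2 / (megapixels * 1e6)) * 1000.0  # mm -> um

    phone = pixel_pitch_um(9.6, 7.2, 108)   # ~0.8 um photosites
    dslr = pixel_pitch_um(23.6, 15.8, 15)   # ~5.0 um photosites

    # Light gathered per photosite scales roughly with its area, so this ratio is
    # a crude proxy for how much less raw signal each phone pixel starts with.
    print(f"phone pitch ~{phone:.2f} um, DSLR pitch ~{dslr:.2f} um")
    print(f"roughly {(dslr / phone) ** 2:.0f}x more light per photosite on the DSLR")

Something on the order of a 40x difference in light per photosite, under those assumptions, is the gap the ISP has to paper over.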

People make much of whatever Samsung it was a couple years back, that got caught copy-pasting a sharper image of Luna into that one shot everyone takes and then gets disappointed with the result because, unlike the real thing, our brain doesn't make the moon seem bigger in pictures. But they all do this and they have for years. I tried taking pictures of some Polistes exclamans wasps with my phone a couple years back, in good bright lighting with a decent CRI (my kitchen, they were houseguests). Now if you image search that species name, you'll see these wasps are quite colorful, with complex markings in shades ranging from bright yellow through orange, "ferruginous" rust-red, and black.

In the light I had in the kitchen, I could see all these colors clearly with my eyes, through the glass of the heated terrarium that was serving as the wasps' temporary enclosure. (They'd shown a distinct propensity for the HVAC registers, and while I find their company congenial, having a dozen fertile females exploring the ductwork might have been a bit much even for me...) But as far as I could get the cameras on this iPhone 13 mini to report, from as close as their shitty minimum working distance allows, these wasps were all solid yellow from the flat of their heart-shaped faces to the tip of their pointy butts. No matter what I did, even pulling a shot into Photoshop to sample pixels and experimentally oversaturate, I couldn't squeeze more than a hint of red out of anything without resorting to hue adjustments, i.e. there is no red there to find.
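
That Photoshop eyedropper exercise could just as well be scripted; a minimal sketch of the same check, assuming Pillow, a placeholder file name, and hue thresholds that are only rough guesses:

    # Rough check for red/rust pixels vs. plain yellow in a photo of the wasps.
    # "wasp.jpg" is a placeholder; the thresholds are rough guesses, not standards.
    import colorsys
    from PIL import Image

    img = Image.open("wasp.jpg").convert("RGB")
    reds = yellows = counted = 0

    for r, g, b in img.getdata():
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if s < 0.2 or v < 0.2:       # skip near-gray and near-black pixels
            continue
        counted += 1
        deg = h * 360
        if deg < 25 or deg > 340:    # red through rust ("ferruginous")
            reds += 1
        elif 40 <= deg <= 70:        # the yellows the phone actually reports
            yellows += 1

    print(f"saturated pixels: {counted}, red-ish: {reds}, yellow-ish: {yellows}")

If the red-ish bucket comes back essentially empty, the red simply wasn't captured in the first place.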

So all I can conclude is the frigging thing made up a wasp - oh, not in the computer vision, generative AI sense we would mean that now, or even in the Samsung sense that only works for the one subject anyway, but in the sense that even in the most favorable of real-world conditions, it's working from such a total approximation of the actual scene that, unless that scene corresponds closely enough to what the ISP's pipeline was "trained on" by the engineers who design phones' imaging subsystems, the poor hapless thing really can't help but screw it up.

This is why people who complain about discrete cameras' lack of brains are wrongheaded. I see how they get there, but there are some aspects of physics that really can't be replaced by computation, including basically all the ones that matter, and the physical, optical singlemindedness of the discrete camera's sole design focus is what liberates it to excel in that realm. Just as with humans, all cramming a phone in there will do is give the poor thing anxiety.

genewitch 11 days ago | parent [-]

I generally judge a camera by how accurately it can capture a sunset, relative to what I actually see. On a Samsung Galaxy Note 20, I can mess with the white balance a bit to get it "pretty close", but it tends to clamp color values so the colors are more uniform than they are in real life. I've seen orange dreamsicle, strawberry sherbet, and lavender, at the same time, at different intensities, in the same section of sky. No phone camera seems to be able to capture that. http://projectftm.com/#noo2qor_GgyU1ofgr0B4jA was captured last month; it wasn't so "pastel", it was much more rich. The lightening at the "horizon" is also common with phone cameras, and has been since the iPhone 4 and Nexus series of phones. It looks awful and I don't get why people put up with it.

throwanem 10 days ago | parent [-]

I think we see, or more properly perceive although weakly, some higher-order color harmonics that cameras don't capture and displays don't (intentionally) reproduce, and I think the pinky-magenta-purplish region of the gamut might be the easiest place to notice the difference.

I think people mostly put up with it because on the one hand it doesn't matter all that often (sunset is a classic worst-case test for imaging systems!) and, on the other, well, "who are you going to believe? Fifty zillion person-centuries of image engineering and more billions of phones than there are living humans, or your own lyin' eyes?"

genewitch 10 days ago | parent | next [-]

I've wanted a de-Bayered sensor camera for a decade and a half, but I'm not willing to pay RED or Arri prices for a real monochrome cine camera. I had a Huawei Honor 8 that had a real-honest-to-goodness monochrome sensor on it. The phone used it for focusing, but one could take images straight from that sensor. It was around the time the Asus ZenFone was using IR lasers to do focusing, and other phones had other depth sensors.

I still have to focus manually (by tapping the screen where I want it to focus), but on newer phones the focus tries to "track" what you touched, which is... why would they change that? I tilt the phone down to interact with it; I know where in the frame I want it to focus, because before I tilted the phone down, I was looking at the frame! Rule of thirds: I can reframe the image to put the focus exactly in one of the areas it ought to be, zoom in or out, whatever. But no, apparently it has been decided I want the focus to wander around as it sees fit.

I just unplugged the Honor 8 to take a picture, and apparently the battery has gone kaput since the last time I used it. Sad day, indeed.

genewitch 9 days ago | parent | prev [-]

Got it charged, but I'm not willing to unplug it to test, so here's a quick shot out the door:

http://projectftm.com/#H-6GJlHgGFA8Yek86MrkVw "Neutral Density" unedited but cropped

throwanem 3 days ago | parent [-]

I'm too much the artist, I find. I tried shooting my D850 in B&W and it's as good as I would expect from a color sensor with a Bayer filter. But it feels like I'm just giving up all the degrees of freedom Lightroom can give me when converting a color raw, and there's a degree of "push" and "pull" processing flexibility that doesn't seem easily replicable by other means.
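
A minimal sketch of the channel-mix freedom I mean (Pillow/NumPy; the weights and file name are arbitrary placeholders, not Lightroom's actual numbers):

    # A B&W conversion from a color file is a weighted mix of the channels, and the
    # weights are yours to choose - acting like colored filters did on B&W film.
    # Shooting in-camera B&W bakes one fixed mix in and gives that freedom up.
    import numpy as np
    from PIL import Image

    rgb = np.asarray(Image.open("shot.jpg").convert("RGB"), dtype=np.float32) / 255.0

    def bw_mix(img, wr, wg, wb):
        """Weighted grayscale conversion; weights are normalized to hold exposure."""
        w = np.array([wr, wg, wb], dtype=np.float32)
        return np.clip(img @ (w / w.sum()), 0.0, 1.0)

    neutral = bw_mix(rgb, 0.30, 0.59, 0.11)       # roughly luminance-weighted
    red_filter = bw_mix(rgb, 0.80, 0.15, 0.05)    # darkens blue skies
    green_filter = bw_mix(rgb, 0.10, 0.80, 0.10)  # lifts foliage

    Image.fromarray((red_filter * 255).astype(np.uint8)).save("bw_red_filter.jpg")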