| ▲ | kingsleyopara 3 days ago |
Surprising there’s no matte-black iPhone 17 Pro: dark, low-reflectance finishes are standard in pro video kit because they minimise specular reflections and stray highlights. Keeping a shiny silver finish while skipping a subdued matte black feels like a strange choice and undercuts the “Pro” claim.
|
| ▲ | linkage 3 days ago | parent | next [-] |
It's not a strange choice at all when you realize that the majority of people use phone cases, and that it's harder to make a matte finish "pop" in promotional content.
|
| ▲ | runjake 3 days ago | parent | prev | next [-] |
Movie people don't normally care about the finish of the iPhone they're using, and the ones that do use a case. I've seen all sorts of non-black (let alone matte-black) iPhone rigs used for motion pictures, including white and natural titanium. 28 Years Later, for example, used a variety of iPhone configurations and colors. But yeah, I'm surprised there's no black/space gray option this year; some consumers won't buy any other color.
|
| ▲ | seanmcdirmid 3 days ago | parent | prev | next [-] |
I wonder if someone will come up with the idea of a vinyl wrap to protect your phone rather than a slip-on case. Then you could keep the phone thin and get that matte finish. Couple that with a matte screen protector and I think the result would be pretty nice.
|
| ▲ | kjkjadksj 3 days ago | parent | prev [-] |
These have never been actual pro devices; arguably not even prosumer. As a pro you probably don’t want scorched-earth AI processing done on your photos, but that is what iPhones have been doing of late. Most damning, there’s no way to turn it off.

| ▲ | astrange 3 days ago | parent | next [-] |
There is no such thing as a digital camera without processing. But third-party camera apps can get images as raw as they want, and the platform supports professional video standards. Try Halide with "Process Zero" if you want that, but I'm pretty sure the most popular third-party camera apps are Asian beauty apps that do far more and far worse-quality processing.

| ▲ | kjkjadksj 3 days ago | parent [-] |
Sure there is. Shoot in RAW format. Get a file representing a matrix of the sensor readout for each rgb pixel. Your post-processing software of choice then handles interpolation with the method of your choice.
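
For illustration, roughly what that workflow looks like off the phone, as a minimal sketch assuming the rawpy library (a LibRaw wrapper); the file name is a placeholder:

    # Open a RAW file and demosaic it yourself, with parameters you choose.
    # Assumes the rawpy package (pip install rawpy); "photo.dng" is made up.
    import rawpy

    with rawpy.imread("photo.dng") as raw:
        # 2-D array of raw sensor values, one colour sample per photosite
        mosaic = raw.raw_image_visible
        print(mosaic.shape, mosaic.dtype)

        # Demosaic / white-balance with explicit settings instead of a black box
        rgb = raw.postprocess(
            use_camera_wb=True,    # or supply your own multipliers via user_wb
            no_auto_bright=True,   # don't let the library re-expose the image
            output_bps=16,         # 16-bit output for further editing
        )
    print(rgb.shape)  # (H, W, 3) interpolated RGB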

| ▲ | astrange 3 days ago | parent | next [-] |
> the sensor readout for each rgb pixel

Camera pixels are only one color at a time, laid out in a mosaic: GGRR / BBGG rows on a quad-Bayer sensor (Fujifilm uses a weirder one called X-Trans), and some of them will be missing because they're damaged or are focus pixels. And then you still have to do white balance and tone mapping, because your eyes do that and the camera sensor doesn't.
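
To make the "one colour per pixel" point concrete, here's a toy bilinear demosaic for a plain RGGB Bayer layout (a sketch only; real raw converters use much smarter interpolation, and quad-Bayer or X-Trans layouts need different masks):

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(mosaic):
        """Toy bilinear demosaic of an RGGB Bayer mosaic (H and W even).
        Each photosite recorded only one colour; the other two channels
        are averaged from neighbouring sites of that colour."""
        H, W = mosaic.shape
        r = np.zeros((H, W)); g = np.zeros((H, W)); b = np.zeros((H, W))
        r[0::2, 0::2] = 1            # R sites
        g[0::2, 1::2] = 1            # G sites (two per 2x2 cell)
        g[1::2, 0::2] = 1
        b[1::2, 1::2] = 1            # B sites

        k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]]) / 4.0
        out = np.empty((H, W, 3))
        for i, mask in enumerate((r, g, b)):
            # Weighted average of the known samples of this colour near each pixel
            num = convolve(mosaic * mask, k, mode="mirror")
            den = convolve(mask, k, mode="mirror")
            out[..., i] = num / den
        return out

    # Tiny synthetic example
    mosaic = np.arange(16, dtype=float).reshape(4, 4)
    print(demosaic_bilinear(mosaic).shape)   # (4, 4, 3)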

| ▲ | kjkjadksj 3 days ago | parent [-] |
There is a big difference between interpolation (dealing with the Bayer or X-Trans array and delivering a three-channel image file in your choice of format and bit depth, using your choice of algorithms), shooting with a color card and a calibrated monitor if you care about that level of white-balance and tone-mapping accuracy, and what Apple is doing, which is black-box ML subtly yassifying your images and garbling small printed text. Especially when the commenter's use case is building out the family archive, not posting selfies on Instagram.
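
The color-card step is conceptually just per-channel gains. A minimal sketch, assuming you have linear RGB data in [0, 1] and you know where the neutral gray patch sits in the frame (the coordinates below are made up):

    import numpy as np

    def white_balance_from_gray_patch(rgb, patch_box):
        """Scale each channel so a known-neutral patch comes out gray.
        rgb: float array (H, W, 3) of linear values in [0, 1].
        patch_box: (y0, y1, x0, x1) marking the gray patch of the colour card."""
        y0, y1, x0, x1 = patch_box
        patch_mean = rgb[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
        gains = patch_mean.mean() / patch_mean   # neutral patch -> equal R=G=B
        return np.clip(rgb * gains, 0.0, 1.0)

    # Hypothetical usage: the gray patch was framed at rows 100-140, cols 200-240.
    # balanced = white_balance_from_gray_patch(linear_rgb, (100, 140, 200, 240))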

| ▲ | astrange 2 days ago | parent [-] |
> shooting for white balance or tone mapping with a color card and calibrated monitor if you care about that level of accuracy

You need to do this if you want to see the image at all, and it involves a lot of subjective choices. The "objective" auto white balance algorithm usually described is objectively quite bad; for instance, it's always described as a single transformation on the whole image, which doesn't make sense if there are multiple light sources.

The reason you'd want to render humans differently in the image is that a) if you don't get skin tones just right they'll look like corpses, and b) in real life you can choose to focus on a subject in a scene and they will appear brighter (because your eyes adapt to them), but an image doesn't have that flexibility, so it helps to guess what the foreground is and expose for that.

I forgot to say that recent iPhone cameras let you turn off the sharpening effects anyway; just move the Photographic Styles control down to Natural. It is true that the sharpening is kind of bad. That's because someone taught everyone that digital images are bandlimited, so they use frequency-based sharpening algorithms; but images aren't bandlimited, so those just give you ringing artifacts. For some reason nobody knows about warp-sharpen anymore.
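
The ringing is easy to see with a one-dimensional unsharp mask on a hard edge (a sketch of the generic technique, not of Apple's pipeline):

    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    # A hard step edge -- exactly the kind of content that is not bandlimited.
    signal = np.concatenate([np.zeros(20), np.ones(20)])

    # Unsharp mask: boost the difference between the signal and a blurred copy.
    blurred = gaussian_filter1d(signal, sigma=2.0)
    sharpened = signal + 1.5 * (signal - blurred)

    print(signal.min(), signal.max())        # 0.0 1.0
    print(sharpened.min(), sharpened.max())  # overshoots below 0 and above 1: halos/ringing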

| ▲ | snowwrestler 2 days ago | parent | prev [-] |
Which you can do on an iPhone, so I’m not sure what the complaint is.
|
| ▲ | 3 days ago | parent | prev [-] |
[deleted]
|