dagmx 7 days ago

The points really boil down to:

1. Difference in focal length/ position.

2. Difference in color processing

But…the article is fairly weak on both points?

1. It’s unclear why the author is comparing different focal lengths without clarifying what they used. If I use the 24mm equivalent on either my full frame or my iPhone, the perspective will be largely the same modulo some lens correction. Same if I use the 70mm or whatever the focal length is.

2. Color processing is highly subjective, but it's also something you can disable on both the phone and the other camera. Again, it's no different between the two.

It’s a poor article because it doesn’t focus on the actual material differences.

The phone will have a smaller sensor. It will have more noise and need to do more to combat it. It won’t have as shallow a depth of field.
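To put rough numbers on the depth-of-field point: a common back-of-envelope trick is to multiply the f-number by the sensor's crop factor to get a full-frame "equivalent aperture". The crop factor of 7 used below for a phone main sensor is an assumed, illustrative value, not a spec for any particular phone:

```python
# Equivalent-aperture sketch: depth of field across sensor sizes can be
# compared by scaling the f-number by the crop factor. Values below are
# illustrative assumptions, not specs for any particular camera.

def full_frame_equiv_aperture(f_number: float, crop_factor: float) -> float:
    """f-number that would give similar depth of field on full frame."""
    return f_number * crop_factor

phone = full_frame_equiv_aperture(1.8, 7.0)  # hypothetical phone main camera
ff = full_frame_equiv_aperture(1.8, 1.0)     # full-frame 50mm f/1.8

print(f"phone: ~f/{phone:.1f} full-frame equivalent")  # far deeper DoF
print(f"full frame: f/{ff:.1f}")
```

So even at the same nominal f/1.8, the phone behaves (depth-of-field-wise) like a stopped-way-down full-frame lens, which is why it can't match the subject separation.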

The phone will also of course have different ergonomics.

But the things the post focuses on reflect a poor understanding of the differences in what they're shooting and how their cameras work.

neya 6 days ago | parent | next [-]

I disagree, I thought the article highlighted the differences beautifully. I'm on a professionally color calibrated 27" monitor that came with one of those color calibration "certificates" at the time of purchase. The second I loaded the article, the differences were just stark. The skin tones alone were a dead giveaway.

It is no secret that Apple does a lot of post-processing on their mediocre photos to make them look good - more so than most other Androids - because it's all software. But, from the article, it is clear that the author is trying to point out that Apple could have done a better job of representing skin tones more accurately, at least. The fish-eye defense of Apple is totally understandable, but why are we defending the weak skin tones? Every year, they keep launching with grandiose claims like "this is the best smartphone camera out there".

And no, this is not a limitation of smartphone sensors. In fact, if you look at the latest Xperia series from Sony, they have the same software from their DSLRs translated to the smartphones, and it handles skin tones perfectly well.

I hope we can skip past the biases and personal preferences we have towards Apple and treat them neutrally like any other manufacturer. This "Apple can do no wrong" narrative and attacking anyone who points out their flaws is just tired and boring at this point.

ksec 6 days ago | parent | next [-]

>more so than most other Androids

In the old days, Apple used to somewhat pride themselves on taking more "realistic" photos, while Android had it the other way around and did a lot of post-processing and colouring, mostly aimed at social media like Instagram.

And then came the iPhone X. They started changing the colour of the sky and sharpening a lot of things, to the point that a lot of photos taken by my camera looked great but also looked fake.

close04 6 days ago | parent | next [-]

> And then came iPhone X

Did the iOS/Android situation actually swap, or was the X an outlier? I have photos from a recent event taken entirely with phones, and the result mirrors my experience for the past many years.

iPhone (11-15 including Pro Max) photos look "normal". Very, very similar to what my eyes saw in terms of colors. Photos taken with Android phones (Pixel 9 Pro XL, recent Oppo or Samsung A series, etc.) look terribly unnatural. The blue of the sky, the green of the plants, the red of the dress, they look "enhanced" and unnatural, nothing like what my eyes saw. I can tell apart almost any iPhone vs. Android picture just by looking at the colors on the same display.

The resolution or sharpness are harder to judge with one look and I wasn't trying to compare quality. But the colors are too obvious to miss.

petre 6 days ago | parent | prev | next [-]

> And then came iPhone X. They started changing the colour of Sky and sharpening a lot of things. To the point of a lot of Photos taken by my camera looks great but also looked fake.

The phone processing is largely shaped by social media culture. Camera makers have also started to incorporate in-camera editing features on vlogger-targeted models.

bayindirh 6 days ago | parent [-]

> Camera makers also started to incorporate in-camera editing features on vlogger targeted models.

DSLRs have had in-camera lighting correction, during shooting and in post-processing, since 2016 or so [0].

[0]: https://www.nikonimgsupport.com/eu/BV_article?articleNo=0000...

petre 6 days ago | parent [-]

This is about skin smoothing and other stuff. See under "filter" and "self shot":

https://www.panasonic.com/uk/consumer/cameras-camcorders/lum...

Star filter is especially funny since it's used in k-drama opening/closing sequences.

bayindirh 6 days ago | parent | next [-]

The G100 was introduced in 2020, and it's a mirrorless camera with significantly more processing power. D-Lighting and other similar post-processing features came at least half a decade earlier.

Modern cameras like Nikon Z6/III can also do similar processing on camera during shoots to reduce post-processing load after the shoots to accelerate the production pipeline.

ezconnect 6 days ago | parent | prev [-]

Faces come out different on different camera phones. I don't think this will stop. I think it will only get worse, and everything will be fake, like what Samsung did with Moon photos: when you take one, they just outright substitute a stock photo, since they assume the Moon looks the same everywhere humans will use their phone.

chemmail 3 days ago | parent | prev [-]

It's really the iPhone 11 where things got crazy, with Deep Fusion. The iPhone X actually has the most detailed screen. When I look at the same photos, somehow the X has better color and detail than my iPhone 14 (I have every iPhone up to the 14). If you look at the raw images, you will be blown away by how little has changed since even the iPhone 4!

But at the end of the day, it's more about the photographer than the equipment. Just ask chefs: 20 students, same kitchen, same ingredients, same recipe, 20 different-tasting dishes.

What really drove that home was when my art teacher took a shot of me with my camera, and holy shit, it was one of the best pics I've seen. He just has the eye, at the end of the day.

Karrot_Kream 6 days ago | parent | prev | next [-]

My hunch is that you'll find more fans of Apple's color profile than detractors. This particular shot may have done it badly (to your eyes, some people prefer the more saturated look) but as a whole I have my doubts.

Color profiles vary per body at the least and are variable based on what post processing you do. I can load up Adobe Vivid and it'll look completely different than Adobe Portrait.

Shoot a Canon, Sony, and Fuji in JPG on the same scene (so same focal length and DOF) and auto white balance. Each body will output a different image.

dkga 6 days ago | parent | prev | next [-]

I get the point… but I would counterargue, perhaps facetiously, that if one needs a professionally color calibrated screen to notice the difference, then it is really not something that would matter for mere mortals.

prox 6 days ago | parent | next [-]

Without hyperbole, I could give people a badly calibrated CRT from the 90s and it wouldn't matter to some. Some people just don't see anything wrong with pictures, and don't even know what to look for or what it's called.

The inverse is the professional photographers who work with pictures day in and day out; they see everything.

Quarrel 6 days ago | parent | next [-]

I am amazed at the eye professional photographers have. A shot of a building that is suddenly really interesting, versus my shot of that building. Colour. Angle. etc.

I just don't have the eye for it, despite having a decent amateur setup.

BUT, yes, lots of people might look at a random photo on their phone and not notice the skin tones, or the fisheye, etc. If you then give them a pile of 10 photos from a pro versus 10 from an amateur's phone, they'll notice. Particularly if they're blown up a bit on a print or a decent screen.

It might not matter if you are just flicking through 20 shots on your phone, but as the article implies, we perceive these things, even when it's subconscious.

throw0101c 6 days ago | parent | next [-]

> I just don't have the eye for it, despite having a decent amateur setup.

Checkout this book by the late† Bryan Peterson, where he shows photos taken by his students as well as his own of the same location, and explains the differences in techniques/settings:

* https://www.goodreads.com/en/book/show/54228164-bryan-peters...

His Understanding series of books are also good (Exposure is worth checking out if you know nothing about camera settings):

* https://www.goodreads.com/author/show/82078.Bryan_Peterson

† April 2025: https://www.crottyfh.com/obituaries/bryan-peterson

lb1lf 6 days ago | parent | next [-]

Oh, I am saddened to hear of Peterson's passing.

Way back when, I bought a couple of his books, probably on the recommendation of someone or other in an online photography forum - 'Understanding Exposure' and 'Learning to see creatively'. The latter in particular was wonderful for someone who had the technical aspects of photography more or less sorted, but was - ahem - deficient in the artistic department.

Anyway, I felt his style was incredible - down-to-earth, but not afraid to go into a bit of background if needed - so I sent off a brief letter of thanks through his publisher.

Lo and behold, I got a very nice letter back, thanking me for the kind words and encouraging me (I had mentioned that I shot both film and digital, seeing as at the time, a wonderful film camera like the F5 could be had for a fraction of what even an entry-level APS-C DSLR cost) to experiment A LOT using the DSLR, as the instant feedback provided would help my analog hit rate progress by leaps and bounds.

I was already thinking a bit along those lines, but became a lot more conscious about trying to improve my skills using the DSLR upon his encouragement - and my photos improved a lot over the following years as a result.

Thanks, Bryan.

Quarrel 6 days ago | parent | prev [-]

Thanks, that looks great.

amelius 6 days ago | parent | prev [-]

Perhaps the amateurs have an internal network in their brain that corrects for badly shot photos.

The professionals have learned to shut off that network.

1718627440 6 days ago | parent | prev [-]

I mean, we are used to completely different colour profiles in reality too. You don't perceive the colors relative to the room light; you perceive them relative to the other colors on the screen, so the screen doesn't really matter.

josephg 6 days ago | parent | prev | next [-]

Do those photos look similar to you? Those color differences are huge to me. And some of the stylistic choices the image processing has made make them look like photos of different people.

ubercore 6 days ago | parent | prev [-]

I don't think the point was to say you need the calibrated monitor to notice, rather that it's _even more stark_, and clearly points to the issues raised in the article.

And to be fair, the thrust of the article was "why don't you see printed and framed iPhone photos", and these things that might be a bit subtle on an uncalibrated screen are going to be a big deal if you print professionally.

dagmx 6 days ago | parent | prev | next [-]

You’re somehow both reading far too much into my comment (none of it is specific to Apple) and not reading it enough (because you missed the point about color profiles).

I’m not defending the default color choices. I’m saying they’re comparing apples to oranges, because they’re comparing an output designed to be opinionated with one designed to be processed after the fact. The iPhone is perfectly capable of outputting neutral images and raw files.

ubercow13 6 days ago | parent [-]

The non-iPhone pictures are probably also in-camera JPEGs, so they are also 'opinionated', not RAWs.

dagmx 6 days ago | parent | next [-]

You don’t need to shoot RAW to have neutral images. In-camera JPEGs on most cameras still default to being as neutral as possible unless you opt for a different picture style.

This is the opposite default to phones where the defaults are to be punchier, but where you can still select a neutral profile.

The argument is basically comparing defaults and claiming it’s endemic to the format.

Ancapistani 6 days ago | parent | prev [-]

I would go so far as to say: if you're using in-camera JPEGs, you would probably be better off with a cellphone.

shakow 6 days ago | parent | next [-]

That's a very contemptuous thing to say.

Even if one is using in-camera JPEG and does not want to spend an hour per picture in Darktable, they can still play with many more lenses, exposure, shutter time, optical zoom, aperture, etc.

I'd even go the other way around: if you just bought a camera, just use in-camera JPEGs for the first months and familiarize yourself with all the rest (positioning, framing, understanding your camera settings, etc.) before jumping into digital development.

barnabee 6 days ago | parent | next [-]

Totally agree!

Photography for me is about the physical and optical side of things. Choosing a lens for a situation, framing the shot, aperture, shutter, etc.

When I switched to digital I was seduced by post-processing, partly as a substitute for the look I could achieve with different films, but mostly I suspect because all those sliders and numbers and graphs are naturally attractive to a certain type of person :)

I eventually pretty much stopped taking photos.

Changing my workflow from post processing RAW photos (and barely ever looking at them again) to using in-camera JPEGs that I can immediately share, print, or whatever was enough to start me taking photos again regularly as a hobby.

More unexpectedly, in addition to the obvious time saving of removing the post processing step (aside from occasional cropping), the satisfaction benefit of the immediacy with which I can now print, share, display, etc. my favourite photos has been huge. It’s so much more rewarding getting photos right after you took them and actually doing something with them!

Now I’m not even sure I’d call all that digital image processing “photography”. Sure, it’s an art in its own right, and one some photographers enjoy, but the essence of photography lies somewhere else. I’d encourage everyone to try a camera with decent in camera JPEG production. You can always shoot Raw+JPEG if you’re scared to go full cold turkey.

Ancapistani 6 days ago | parent | prev | next [-]

> That's a very contemptuous thing to say.

I really don't think it is.

When I pick up a camera, my intent is one of two things: the experience of photography itself, or the best quality I can reasonably obtain. Neither of those goals are attained with a smartphone.

Every other time I take a photo, it's with a smartphone. It's easily good enough for the vast majority of use cases.

> Even if one is using in-camera JPEG and does not want to spend 1hr/picture in Darktable,

That's... absurd. Granted I lean toward a more "street photography" style, but it's exceptionally rare that I spend more than ~30s on a photo in Lightroom. Most of that time is spent cropping. White balance, exposure correction, etc. are all done in bulk.

> they can still play with many more objectives, exposure, shutter time, physical zoom, aperture, etc.

Sure - and why wouldn't you want to play with RAW as well? It's not like the profile the camera would have used isn't embedded in the RAW anyhow.

> I'd even go the other way around: if you just bought a camera, just use in-camera JPEGs for the first months and familiarize yourself with all the rest (positioning, framing, understanding your camera settings, etc.) before jumping into digital development.

I don't disagree with this at all. Of course there are edge cases; that's why I said "probably".

To put it another way: if you're shooting JPEGs regularly, you're almost certainly not doing it for the craft. There are very few reasons I can think of to choose a traditional camera if you're not going to take advantage of the improvements in ISO and dynamic range that it offers - and those are two things you give up[0] shooting JPEG.

0: You give up ISO in that you are discarding much of the information that you could use to push/pull process, which is very often preferable to very high ISO.

ETA: I just looked it up. In 2024, I kept 767 photos from my iPhone and 1,900 from my cameras. That includes multiple performances of my wife's dance studio, so the latter is heavily skewed by that. Excluding those, I kept 376. In other words, I appear to be taking my own advice here.
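The push/pull point in the footnote above can be sketched in a few lines: brightening by N stops in post is just multiplying the linear raw values by 2**N, which is why a high-bit-depth raw file leaves headroom to do it. The 14-bit ceiling below is an assumption (it matches cameras like the X-Pro3 mentioned later, but varies by body):

```python
import numpy as np

# Digital "push" sketch: raising exposure by N stops in post multiplies
# linear raw values by 2**N. A 14-bit raw file (assumed here) has the
# headroom for this; an 8-bit JPEG would clip much sooner.

def push(raw: np.ndarray, stops: float) -> np.ndarray:
    """Brighten linear raw data by `stops`, clipping at the 14-bit ceiling."""
    return np.clip(raw * (2.0 ** stops), 0, 2 ** 14 - 1)

underexposed = np.array([100.0, 800.0, 2000.0])  # shot 2 stops under
print(push(underexposed, 2))  # every value scales by 4, none clip yet
```

Shooting at base ISO and pushing like this is often preferable to cranking ISO in camera, since you keep the highlight data to make the choice later.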

ubercow13 6 days ago | parent | next [-]

>and those are two things you give up[0] shooting JPEG

No you don't? Good in-camera JPEGs will do push/pull processing, exposing for maximal dynamic range, for you. You don't lose the advantages of the better optics and sensor just because the JPEG is produced in camera.

Ancapistani 6 days ago | parent | next [-]

How would the camera know if you're exposing two stops below your intended EV because you plan to push it in post or if that _is_ your intended EV?

Furthermore, JPEG supports ~8 stops of dynamic range while my X-Pro3's raw files support ~14 stops. You lose almost half your total DR when you shoot JPEG (with that camera).

ubercow13 6 days ago | parent [-]

Because some cameras will choose the exposure and decide when to underexpose and push for you, e.g. Fuji's DR feature. You choose your intended EV for the image, and it chooses whether to underexpose and push based on the dynamic range of the scene.

>You lose almost half your total DR when you shoot JPEG

No, because the camera applies a tone curve that compresses that DR when producing the JPEG. You lose precision, not DR, and if you don't intend to process the image further it doesn't matter much.
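The tone-curve point can be illustrated with a toy log curve: a wide scene range folds into 8 bits without clipping, so what's sacrificed is step size (precision), not the range itself. This is a sketch, not any camera's actual curve:

```python
import numpy as np

# Toy tone curve: map up to 14 stops of linear scene luminance into an
# 8-bit JPEG-style value. Both deep shadows and extreme highlights land
# inside 0..255; only quantization precision is given up.

def tone_map(linear: np.ndarray, max_stops: int = 14) -> np.ndarray:
    stops = np.log2(np.clip(linear, 1.0, 2.0 ** max_stops))
    return np.round(stops / max_stops * 255).astype(np.uint8)

scene = np.array([1.0, 2.0 ** 7, 2.0 ** 14])  # shadow, midtone, highlight
print(tone_map(scene))  # the full 14-stop range survives in 8 bits
```

A linear 8-bit encoding, by contrast, really would clip everything above 8 stops, which is the distinction being argued here.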

justincormack 6 days ago | parent | prev [-]

That should be configurable - my camera has 3 dynamic range settings, and I almost always use the narrowest one.

shakow 6 days ago | parent | prev | next [-]

All that you said is perfectly valid for your usecase. But you can't just make your use case a generality.

Some people have a camera because they want to take better pictures than their smartphone but don't want to bother with post-processing, some have tried manual processing and found that the work/result balance was not doing it for them, some think that JPEGs look perfectly fine, some just don't have the time to do the processing... there are myriad reasons why people would like to land somewhere between "let iOS do it" and "I systematically choose my ISO according to this Darktable script I've developed over the last few years".

dotancohen 6 days ago | parent | prev [-]

> the best quality I can reasonably obtain.

Cellphones absolutely can produce high-quality results. Especially if you add the constraint "best quality I can reasonably obtain", as many consider carrying a dedicated camera all the time to not be reasonable. And this was the case even before the advent of the smartphone: how many people did you see carrying a camera in 1980, or 1990, or 2000? Almost zero.

The best camera is the camera you have on you.

6 days ago | parent | prev [-]
[deleted]
necovek 6 days ago | parent | prev [-]

That's a pretty generic statement, considering how variable "in-camera JPEGs" are depending on camera and generation.

But even so, most are tuned to natural colours, and there is no beating shallow depth of field for bokeh/subject separation.

sharpshadow 6 days ago | parent | prev | next [-]

If anyone is working on skin color representation, try emulating Agfa Precisa: it has some of the best skin colors in natural light.

nikhizzle 6 days ago | parent | next [-]

Thanks for this, it's very useful for a related AI project I'm working on.

dotancohen 6 days ago | parent | prev [-]

What about other colours? Isn't that film famously under saturated for sky, water, vegetation, etc?

sharpshadow 6 days ago | parent [-]

It very well might be, and it shouldn't be easy to find good training pictures in the sea of amateur shots online, with their different development techniques and light conditions; the last batch was also produced in 2005. Some previous generation of that film line might have similar skin tones. I would think that with modern techniques it isn't necessary to apply the emulation to the whole image, just to the parts desired?

hopelite 6 days ago | parent | prev | next [-]

You and most Apple neggers are not really any better, by ignoring what all this comes down to: choices and trade-offs. Apple’s primary objective is clearly taking photos that provide a positive impact on the user while being as easy as feasible, not accuracy of the image. They likely care more about the most number of users being satisfied, not accurate reproduction of an image.

I would not be surprised if they don’t actually want accuracy in imaging at all; they want a positive impact on the user, and most people don’t want reality. If that means causing “hotdog skin” under some conditions or with some skin tones, or maybe even if most users prefer “hotdog skin”, while having an overall positive photo outcome for most other users, they will likely always choose to produce “hotdog skin”. They are also serving a far greater and, frankly, an increasingly less light-skinned audience than most understand. Maybe it’s just an effect of “whites” having given away their control over things as ever more “non-whites” become increasingly important and an ever-increasing number and percentage of Apple’s users. Do Asians and Africans get “hotdog skin”? I don’t know the answer to that.

It is the narrow minded perspective of DSLR purist types that this stuff bothers, largely because they cannot look beyond the rim of their plate. Some platforms are for accuracy, others for impact and user experience.

People should maybe consider not saying things like “this Apple is an absolutely horrible, awful, no good orange!”

hopelite 6 days ago | parent | prev [-]

You and most Apple-obsessed curmudgeons are not really any better, by ignoring what all this comes down to: choices and trade-offs. Apple’s primary objective is clearly taking photos that provide a positive impact on the user while being as easy as possible, not accuracy. They want the most number of users to be satisfied, not accurate reproduction of an image. I would not be surprised if they don’t actually want accuracy in imaging; they want a positive impact on the user, and if that means causing “hotdog skin” under some conditions or with some skin tones, while having an overall positive photo outcome for most other users, they will always choose to make you have “hotdog skin”. They are serving a far greater audience than most understand.

It is the narrow minded perspective of DSLR purist types that this stuff bothers, largely because they cannot look beyond the rim of their plate.

You may want to stop saying “this Apple is a horrible, awful, no good orange.”

sheiyei 6 days ago | parent [-]

You could say the goal is an OK photo every time, even if it means you only get really good photos by accident.

majormajor 6 days ago | parent | prev | next [-]

The biggest real differences between iPhone and whatever ye-olde-good-standalone-digital-camera are sharpening/edge enhancements and flattening of lighting.

If you take a lot of landscapes with detailed textures in high-contrast lighting you'll see the differences pretty quickly.

The iPhone photos will look better at first glance because they have a lot of tricks to deal with lighting that would otherwise give a photographer difficulty. For instance, that shot of the child could easily have a completely blown-out background in slightly different circumstances for a typical use of a digital camera's auto-exposure mode. But it results in a certain look that this article really doesn't show well, in terms of the more fake-looking aspects of it. The gravel in the shot of the child hints at it, and you can start to see it more if you view the image full-size vs the scaled-down presentation. The asphalt under the car, too - there's something very harsh and fake about the iPhone texture rendering approach that gets worse the larger you display the image. This started around the iPhone 11, IIRC, with its ML processing.

Both things can be avoided with Halide's raw mode (more "raw" than Apple's) if you want side by side comparisons on your own device. Though IIRC it doesn't support full-res on the newer phones.

The trick, though, is that if you want images that look better in tough conditions, there's a learning curve for using a standalone camera or to shooting in RAW with Halide. In terms of lighting it's not even "more realistic" right out of the gate, necessarily, because your eye has more dynamic range and your brain has more tricks than most any straight-out-of-camera non-ML-enhanced image.

But if you want images you can print out at 8x10+ you'll benefit from the investment.

(Samsung cameras are even wilder in their over-enhancement of photos.)

sundvor 6 days ago | parent [-]

Yeah, I like to take photos of my cast iron cooking with my S25U, on a black induction glass surface, and I find myself swapping to Pro mode all the time, as the colour temperature is often way too warm and/or oversaturated.

It's a great camera in automatic mode most of the time, but not for that scenario.

sazylusan 7 days ago | parent | prev | next [-]

Agreed. In particular, the distortion of the players on the ends, the smaller shoulders and chest, as well as the lean, can all be attributed to the wider lens used on the iPhone (and, as such, to the photo being taken closer to the players). I'd guess the author was using the "1x" lens on the iPhone; a lot of these issues go away if they use the "3x" or "5x" lens. I'd even consider that most of the jawline change of the player is simply the angle of their chin/face, as well as expression.

Joeri 6 days ago | parent [-]

The 2x mode of the wide lens is basically the standard “nifty fifty” of a big camera and what the author should have compared to. The 1x is 24mm equivalent which is a focal length I don’t particularly care for, but I get why they picked it (easy to frame a group of people indoors).

For portraits the ideal length is 85mm equivalent which would be 3.5x, rumored to be on the next iphone pro. At this length there is minimal facial feature distortion without getting the flattening effect you get at longer focal lengths.
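The "x" labels above are just ratios of 35mm-equivalent focal lengths against the 24mm-equivalent main camera, which is where the 3.5x figure for 85mm comes from. A quick check, assuming that 24mm baseline:

```python
# Phone zoom labels as focal-length ratios against the 1x main camera
# (24mm full-frame equivalent assumed, per the comment above).

MAIN_EQUIV_MM = 24.0

def zoom_label(equiv_mm: float) -> float:
    """Zoom multiplier a phone would display for a given equivalent focal length."""
    return equiv_mm / MAIN_EQUIV_MM

print(f"48mm -> {zoom_label(48):.1f}x")   # the 2x 'nifty fifty' crop
print(f"85mm -> {zoom_label(85):.2f}x")   # ~3.5x portrait length
```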

rob74 6 days ago | parent | prev | next [-]

Yup, that was the thing that jumped out at me too: in the photos with the golf players, the trees in the background appear much smaller in the iPhone photo than in the "real camera" photo, which means the "real camera" photo was taken from further away and zoomed in, so it obviously will have less distortion. Same for the building and car pictures, but the article doesn't mention that at all (except for writing that "the fish eye iPhone lens creates distortion" - of course it does, that's why the iPhone has other lenses as well)!

01HNNWZ0MV43FF 6 days ago | parent [-]

Yeah it's disappointing to see photographers getting this wrong. Most of them know better.

It's the _distance_ that causes distortion, not the _lens_. You can prove this by doodling light rays on a sheet of paper. There is no lens that will get you a good photo at 1 meter from a person. They stand back 2 or 3 meters or more and then say "ho ho, fish eye lens". I'm so sick of it.

Someone agrees: https://petapixel.com/2021/08/02/lenses-dont-cause-perspecti...
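The distance argument is easy to check with similar triangles: the apparent size ratio between a near feature (say, the nose) and a far one (the ears) depends only on the camera distance, whatever lens you use. The 12cm nose-to-ear depth below is an assumed, illustrative figure:

```python
# Perspective "distortion" depends on distance, not lens: the relative
# apparent size of two features separated in depth is just the ratio of
# their distances from the camera.

def near_far_ratio(distance_m: float, depth_m: float = 0.12) -> float:
    """How much bigger a near feature looks than one depth_m behind it."""
    return (distance_m + depth_m) / distance_m

for d in (0.5, 1.0, 3.0):
    print(f"{d} m away: nose appears {near_far_ratio(d) - 1:.0%} larger than ears")
```

At half a meter the near feature is exaggerated by about a quarter; at three meters the effect almost vanishes, regardless of focal length. The lens choice only determines how much of the frame the subject fills from that distance.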

bayindirh 6 days ago | parent | prev | next [-]

I'll kindly disagree with you. Like the other commenter, I'm on a 27" HP business monitor that comes with a color calibration certificate, and the differences are very visible. Moreover, I've been taking photos as a hobby for some time.

The angle and the different focal lengths don't matter in the rendering of the images. The issue is that cameras on phones are not for taking a photo of what you see, but a way to share your life, and sharing your life in a more glamorous way gets you liked by people. We want to be liked as human beings; it's in our nature.

So, phone companies, driven both by smaller sensors (which are way noisier compared to a full frame sensor) and by market pressure to reduce the processing end users need to do (because it inconveniences them), started to add more and more complicated post-processing to their cameras.

The result is this very article: people with their natural complexions smoothed, skin tones boosted on the red parts, sharpened but flatter photos, without much perspective correction, sometimes looking very artificial.

Make no mistake, "professional" cameras also post-process, but you can both see this processing and turn it off if you want, and a professional camera corrects what its lens fails at. Smartphones, including the iPhone, make "happy, social-media-ready" photos by default.

As, again, the other commenter said, it's not a limitation of the sensor (aside from the noise). Sony supplies most of the higher-end sensors in the market, and their cameras, and other cameras sporting sensors they produce, have won "best color" awards over and over again. Xperia smartphones come with professional camera pipelines behind that small sensor, so they can take photos like what you see.

I personally prefer the iPhone as my smartphone of choice, but the moment I want to take a photo I'll spend time composing, I ditch the default camera app and use Halide, because that app can bypass Apple's post-processing, and can even apply none at all if you want.

lonelyasacloud 6 days ago | parent | next [-]

> The issue is, cameras on phones are not for taking a photo of what you see, but a way to share your life, and sharing your life in a more glamorous way is to get liked around people.

This is nothing new.

When film was mass market, almost no one developed their own photos (particularly colo(u)r). Instead, almost all printing went through bulk labs, which optimised for what people wanted to show their family and friends.

What is different now is that if someone cares about post-processing to present their particular version of reality, they can do it easily, without the cost and inconvenience of having to set up and run a darkroom.

bayindirh 6 days ago | parent [-]

Personally coming from the film era, I don't think it's as clear cut as this.

Much of the post-processing an informed person does on a digital photo is an emulation of a process rooted in the darkroom, yes.

On the other hand, some of the things cameras do automatically, e.g. skin color homogenization, selective object sharpening, body "aesthetic" enhancements, hallucinating text the lens can't resolve, etc., are not darkroom-born methods, and they alter reality to the point of manipulation.

In the film days, what we had as run-of-the-mill photographers was the selection of the film, and asking the lab "can you increase the saturation a bit, if possible". Even if you had your own darkroom at home, you wouldn't be able to selectively modify body proportions while keeping the surrounding details untouched, the way advanced image-modification algorithms can.

twoWhlsGud 6 days ago | parent | next [-]

Which is one reason why I often still shoot with an actual camera and sometimes even with film. I have a lifetime of experience with common film emulsions and a couple of decades of shooting with digital sensors with limited post processing.

When does that matter? It matters when I take pictures to remember what a moment was like. In particular, what the light was doing with the people or landscape at that point in time.

It's not so much that the familiar photographic workflows are more accurate, but they are more deterministic and I understand what they mean and how they filter those moments.

I still use my phone (easy has a quality of its own) but I find that it gives me a choice of either an opinionated workflow that overwhelms the actual moment (trying to make all moments the same moment) or a complex workflow that leaves me having to make the choices (and thus work) I do with a traditional camera but with much poorer starting material.

lonelyasacloud 6 days ago | parent | prev [-]

If a professional had access to darkroom facilities pretty much everything could be done in there right down to removing people and background objects (see for instance https://rarehistoricalphotos.com/stalin-photo-manipulation-1...).

It's just far easier for anyone to do now.

bayindirh 6 days ago | parent [-]

I know. You can remove people, use selective exposure with dodge and burn, etc.

But you can't change a person's proportions and skin tones so precisely unless you're printing small and have time, equipment and talent to paint the positive (negative would be nigh impossible) in a believable manner to do so.

On the other hand, you can enable a slimming filter on your phone or camera, one click, and you're done.

Oh, also, some of the Soviet photo manipulation was done with stolen and translated French software, in 1987 [0].

[0]: https://tech.slashdot.org/story/10/11/04/1821236/soviet-imag...

tristor 6 days ago | parent | prev [-]

And this is why I have an Olympus Pen-F in one pocket and my iPhone in the other pocket. I love my iPhone, and I use it for taking snapshots day-to-day like receipts for my expense report, but any time I care about capturing something I see I have an actual camera in my pocket. Micro 4/3rds for size/weight, unfortunately, but while I have a FF camera I am not lugging it around with me everywhere, a Pen-F literally fits in my pocket with lens attached.

mattwilsonn888 6 days ago | parent | prev | next [-]

You're completely off base on the focal length argument.

A traditional camera has the choice and can use the most appropriate focal length; an iPhone is locked into a fish-eye clearly put there to overcome its inherent limitations.

So it doesn't really matter "if it's fair" or not, because it's not about a fair comparison; it's a demonstration that a traditional camera is just better. Why should the traditional camera use an inappropriate focal length just because the iPhone is forced to?

Twisell 6 days ago | parent | next [-]

Every piece of hardware has its limitations; my DSLR doesn't fit in my pocket, for instance. But that wouldn't be a fair point when comparing photo quality against a smartphone.

Comparing quality with non-equivalent focal lengths is about as pertinent as mounting a fisheye on the DSLR (because you can!) and then claiming that the smartphone has less distortion.

josephg 6 days ago | parent | next [-]

> Comparing quality with non equivalent focal lengths is as pertinent as to mount a fisheye on the DSLR (because you can!) and then claim that the smartphone have less distortion.

I was about to disagree with you - but I think you're right. The photographer clearly took a couple steps back when they took the DSLR photo. You can tell by looking at the trees in the background - they appear much bigger in the DSLR photo because they're using a longer focal length.

I think a DSLR would struggle with the same perspective distortion if you put an ultrawide lens on it. It would have been a much more fair comparison if they took both photos from the same spot and zoomed in with the iphone.

arghwhat 6 days ago | parent [-]

I'd agree if the phone had an appropriate focal length, but it doesn't. You can either go way too wide, or way too narrow (with a worse image sensor at that). Comparing the best the phone can sensibly do while handicapping the camera by intentionally doing the wrong thing for the situation makes no sense.

The only workaround for the phone would be to still step back and take the image with the 24mm equivalent, then crop the image a whole lot to get an appropriate and equivalent view.

> I think a DSLR would struggle with the same perspective distortion if you put an ultrawide lens on it.

Note that "proper" lenses have more room for corrective elements in their lens stacks, so decent quality setups should experience less distortion than the tiny smartphone pancakes.

An ultrawide will never be good though, it's a compromise for making things fit or making a specific aesthetic.

dagmx 6 days ago | parent [-]

How do you know if the phone doesn’t have an appropriate focal length if the image isn’t marked?

Secondly, none of the points in the article are about optical distortion across the lens they’re all about perspective distortion. Corrective elements aren’t going to change that. None of the examples highlight barrel/pincushion distortion or the like as an offender.

arghwhat 5 days ago | parent [-]

> How do you know if the phone doesn’t have an appropriate focal length if the image isn’t marked?

I listed focal lengths for an iPhone 16 Pro Max, and good focal lengths for flattering portrait photography are common photography knowledge, which I also provided for reference, set with a little wiggle-room by the optics of the human eye.

There aren't more variables than subject framing and focal length, and a portrait of a person is a well-known size. Comparing the remaining two numbers is simple math.

(It is well-known what distortion effect using the "wrong" focal length will give, which can sometimes be used intentionally but is not what you want in the average portrait. Shorter focal lengths give a silly, elongated facial appearance which exaggerates frontal features like nose and mouth, longer focal lengths give a flatter appearance which exaggerates rear features like neck width.)

> Secondly, none of the points in the article

The article clearly shows distorted images, and that the article fails to mention it does not make it less of an issue.

DiogenesKynikos 6 days ago | parent | prev [-]

The article is comparing photo quality between two different cameras. The lens affects image quality, so it's completely fair to discuss.

If it were possible to switch out the lens on the iPhone, and the photographer had just chosen the wrong lens for the job, that would be a fair criticism of the article. But that's not what happened. The iPhone is just very limited when it comes to the lens, compared to a DSLR.

josephg 6 days ago | parent | next [-]

> If it were possible to switch out the lens on the iPhone...

It is possible to "switch out the lens" on an iphone, because iPhones ship with multiple camera lenses. (Well, multiple entire cameras). The iphone 16 they're using here has 3 cameras. And yet, I'm pretty sure the photo of the boys was taken with the ultrawide for some reason. A lot of the distortion problems would go away if they took a few steps back and used one of the longer lenses - just like they did with the DSLR.

labcomputer 6 days ago | parent | prev [-]

You can always crop a wide shot.

Most of the criticism comes down to not standing in the same spot for both photos (I’m unconvinced that the difference in jawlines, for example, is not because the subjects moved while the photographer did).

You can take a bad picture with any camera.

dagmx 6 days ago | parent | prev | next [-]

I’m sorry, if you’re going to argue it’s completely off base at least make a statement that isn’t easily dismissed by looking at the back of a phone.

My iPhone Pro has 3 lenses at 15, 24, and 77mm equivalents. This is far fewer than many Android phones.

Even the cheapest iPhone 16E has a super sampling sensor which allows a cropped 50mm equivalent. (And yes that’s a digital crop but that’s why I mention a super sampling sensor)

So yes, unless they were shooting on a budget phone or a much older iPhone, they have a choice of focal lengths that would better match whatever camera they’re comparing to.

vladvasiliu 6 days ago | parent | prev | next [-]

I'm not sure I get your point.

Most appropriate length for what? Some iPhones have multiple focal lengths, just like some "real cameras" have fixed lenses with a fixed focal length (Fuji X100 and the medium-format one whose model I can't remember, Leica something-or-other, Sony R1).

Plus, for what is a traditional camera "just better"? It's highly usage dependent.

I have both a bludgeon, which can be used as an interchangeable lens camera, and an iPhone. The first doesn't fit in my pocket, so sometimes the latter is the one I grab, since it's "better" for that specific use case.

arghwhat 6 days ago | parent [-]

> Most appropriate length for what? Some iPhones have multiple focal lengths, ...

Most appropriate length for portrait photography is well established to be somewhere between 50mm and 100mm (35mm equivalent). The lower end is often considered more "natural" for such photo type, while the longer focal lengths are considered more flattering.

An iPhone 16 Pro Max has three focal lengths, 12, 24 and 120 (35mm equivalent). The first two are much too short unless significantly cropped, and the last one is excessive and requires stepping way back and has the worst image sensor and likely worst compromise of a lens - a lot of lens chonk is elements to manage chromatic aberration and distortion, which smartphone lenses have no room for.

> ... just like some "real camers" have fixed lenses with a fixed focal length (Fuxji x100 and the medium-format one whose model I can't remember, Leica something-or-other, Sony R1).

People using fixed lenses do so because they prioritize a particular type of image or style, and decided to get an even better (and lighter) lens for that instead of carrying around a compromise they don't need.

> Plus, for what is a traditional camera "just better"?

When it comes to getting the best picture, a chonky camera always wins - although they have had some catch-up to do on the software side, physics and our current technical limitations do not care about pocketability.

But a less perfect picture is better than no picture because you left the "real camera" at home. The best camera is the one you have on you.

(Also note that this is not binary between a smartphone and an Olympic DSLR setup. Good compact cameras with collapsing lenses and mirrorless bodies with smaller lenses are a middle ground.)

Terretta 6 days ago | parent [-]

Really appreciated this comment, written as if by a photographer who also happens to use a mobile phone camera for EDC or street. Adding some similar color...

> An iPhone 16 Pro Max has three focal lengths, 12, 24 and 120 (35mm equivalent). The first two are much too short unless significantly cropped, and the last one is excessive and requires stepping way back and has the worst image sensor and likely worst compromise of a lens

This point contains one part of the solution: Zoom with your feet.

Back up your shooting position to where you'd shoot an 85mm or 105mm, take the shot with the 2x lens, then crop. (Unless there's tons of light, then the 5x and hold very still. Even then, shoot both 2x and 5x and compare. Next year's phone should update the 5x sensor as well.)
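To make the cost of that crop concrete, here's a back-of-the-envelope sketch; the 12MP figure and the 48mm/85mm equivalents are assumed illustrative numbers, not the specs of any particular phone:

```python
# Back-of-the-envelope for "zoom with your feet, then crop": how many pixels
# survive cropping a 2x (~48mm-equivalent) frame down to 85mm-equivalent framing?
def megapixels_after_crop(native_mp: float, shot_equiv_mm: float, target_equiv_mm: float) -> float:
    # Cropping to a longer equivalent focal length keeps a linear fraction
    # shot/target of the frame, i.e. the square of that fraction in pixels.
    return native_mp * (shot_equiv_mm / target_equiv_mm) ** 2

print(round(megapixels_after_crop(12, 48, 85), 1))  # ~3.8 MP left after the crop
```

So the crop costs roughly two thirds of the pixels, which is why shooting both the 2x and the 5x and comparing, as suggested above, is worth the extra frame.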

For the color problems the article highlights, shoot RAW and adjust in a raw development app. Otherwise, shoot using the new grid-based styles to make in-phone development adjustable later. Or use a different app – see below.

For the bokeh, consider shooting in portrait, with aperture dialed to a full-frame DSLR level of (granted fake) bokeh. This remains adjustable after the shot so it's safer to leave active than one might think.

Consider avoiding the iPhone's built-in camera app, consider shooting with an app that can skip the processing pipeline, like Lux's Halide, with “Process Zero” mode:

https://halide.cam/

https://www.lux.camera/introducing-process-zero-for-iphone/

The bottom line, of course:

> The best camera is the one you have on you.

Assign the iPhone 16's shutter button to Halide or ProCamera or one of the newer contenders to shoot everything.

Then to best enjoy your results, never shoot with a full frame using big glass and compare.

kqr 6 days ago | parent | prev [-]

Yes and no. Modern phone cameras are strong enough that you can crop out the centre and get a passable image as if taken with a longer focal length.

lambdasquirrel 7 days ago | parent | prev | next [-]

This does not address the detrimental parts of computational photography.

dangus 7 days ago | parent [-]

Which I’m personally failing to witness consistently by the “evidence” in this article.

Most of the photo examples here were somewhere between “I can’t tell a significant difference” and “flip a coin and you might find people who prefer the iPhone result more.”

Even less of a difference when they’re printed out and put in a 5x7” frame.

Keep in mind the cost of a smartphone camera is $0. You already own one. You were going to buy a smartphone anyway for other things. So if we are going to sit and argue about quality we still have to figure out what dollar value these differences are worth to people.

And the “evidence” is supposedly that people aren’t getting their phone photos printed out. But let’s not forget the fact that you literally couldn’t see your film photos without printing them when we were using film cameras.

Derbasti 6 days ago | parent | next [-]

> Keep in mind the cost of a smartphone camera is $0.

Many people buy a more expensive smartphone specifically for the better camera module. These are expensive devices! It's good marketing that you perceive that as "free", but in reality, I spend way less money on my fancy camera (new models every five years), than my iPhone-loving friends on their annual upgrades.

_tik_ 6 days ago | parent | prev | next [-]

I can see a noticeable loss of detail in the iPhone sample photos. Personally, I prefer cameras that prioritize capturing more detail over simply producing visually pleasing images. Detailed photos offer much more flexibility for post-processing.

chongli 6 days ago | parent | prev [-]

The problem with computational photography is that it uses software to make photos "look good" for everyday users. That may be an advantage for those users but it is basically a non-starter for a photographer because it makes it a crapshoot to take photos which predictably and faithfully render the scene.

askbjoernhansen 6 days ago | parent | next [-]

Lots of apps give you other options for how to process the image data.

I've had a bunch of "high-end" digital SLRs, and they (and the software processing the raw files) do plenty of computational processing as well.

I completely agree that all else being equal it's possible to get photos with better technical quality from a big sensor, big lens, big raw file; but this article is more an example of "if you take sloppy photos with your phone camera you get sloppy photos".

dkga 6 days ago | parent | prev | next [-]

This made me ask, is there a (perhaps Swift) API to get the raw pixels coming in from the camera, if there is such a thing? I mean, before any processing, etc.

Narew 6 days ago | parent [-]

There is. If you use the Lightroom app, for example, you can get access to the raw pixels. But I'm not sure there's a way to get all the images the iPhone's camera app takes: the phone doesn't take one shot to create the final image, it takes hundreds of shots and combines them.

ChrisGreenHeur 6 days ago | parent | prev [-]

Your brain also uses software to make what you see look good

ksec 6 days ago | parent | prev | next [-]

>Color processing is both highly subjective but also completely something you can disable on the phone

How do I disable Colour processing?

subscribed 6 days ago | parent | next [-]

I think you can get RAW on iPhone? I don't own one so I can't confirm.

On my Pixel, RAW is also available, even more so with non-standard camera software.

mh- 6 days ago | parent [-]

Yes, I shoot in RAW by default most of the time. It can be quickly turned on and off from the stock camera app without even leaving the main screen.

You may have to enable it once in Settings -> Camera -> Formats; I've been using it so long I don't remember what the defaults are. But once you've done that, it's in the top right of the camera app - just tap where it says RAW.

Terretta 6 days ago | parent | prev [-]

https://www.lux.camera/introducing-process-zero-for-iphone/

ksec 6 days ago | parent [-]

Thank You. Didn't know this exist.

SoftTalker 7 days ago | parent | prev | next [-]

Can you really have a 70mm focal length on a phone that is less than 10mm thick? I thought it was simulated by cropping the image from the actual very short focal length.

Espressosaurus 6 days ago | parent | next [-]

Usually it's "FOV equivalent", e.g. scaled to a full frame sensor size. Tiny sensor size means you maybe have a 10mm focal length, but the size of the sensor relative to 10mm makes it the FOV of a 70mm lens on a full frame camera.

You see similar when people are comparing APS-C, micro 4/3, or medium format lenses.
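The "FOV equivalent" scaling described above can be sketched in a few lines; the APS-C dimensions below are typical approximations used for illustration, not any specific camera's spec sheet:

```python
import math

FULL_FRAME_DIAG_MM = math.hypot(36, 24)  # ~43.27mm, the 35mm-film reference diagonal

def crop_factor(sensor_w_mm: float, sensor_h_mm: float) -> float:
    # Ratio of the full-frame diagonal to this sensor's diagonal.
    return FULL_FRAME_DIAG_MM / math.hypot(sensor_w_mm, sensor_h_mm)

def ff_equivalent(actual_focal_mm: float, sensor_w_mm: float, sensor_h_mm: float) -> float:
    # A lens's "equivalent" focal length is its actual focal length
    # scaled by the crop factor, giving the same field of view on full frame.
    return actual_focal_mm * crop_factor(sensor_w_mm, sensor_h_mm)

# Typical APS-C (~23.6 x 15.7mm): a 50mm lens frames like ~76mm on full frame
print(round(ff_equivalent(50, 23.6, 15.7), 1))
```

The same function explains the phone numbers: a tiny sensor yields a large crop factor, so a physically short lens still carries a long "equivalent" focal length.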

strogonoff 6 days ago | parent [-]

Physics really works out such that the smaller you make the camera sensor, the smaller you can make the lens. Full-frame lenses tend to be markedly bigger than, say, APS-C lenses of equivalent quality.

However, due to physics there is also no working around the quality issues of a small sensor. Photosites get less light and produce more noise, and automated noise suppression costs detail and sharpness.

I wonder whether tiny lenses with the sharpness and clarity of their larger equivalents would be much more expensive or impossible to produce (sure, less material, but much finer precision required), but it probably doesn't matter, because the tiny sensor already loses enough sharpness that better lenses wouldn't contribute much.

croemer 6 days ago | parent | next [-]

> Physics really works out such that the smaller you make camera sensor, the smaller you can make the lens.

At some point the wave-like nature of light starts to bite. You can't really go much smaller than a micron per pixel, so a millimeter-sized chip gets you 1 megapixel, and 50MP means ~7mm. (Back-of-the-envelope caveats apply.)
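That envelope, written out (the 1-micron pitch is the comment's assumption, a rough diffraction limit rather than a hard physical constant):

```python
import math

def sensor_side_mm(megapixels: float, pitch_um: float = 1.0) -> float:
    # Treat the sensor as square: side length = pixels per side * pixel pitch.
    pixels_per_side = math.sqrt(megapixels * 1e6)
    return pixels_per_side * pitch_um / 1000.0

print(round(sensor_side_mm(1), 2))   # 1 MP  -> ~1 mm chip
print(round(sensor_side_mm(50), 2))  # 50 MP -> ~7.07 mm chip
```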

meatmanek 6 days ago | parent | prev [-]

> Full-frame lenses tend to be markedly bigger for equivalent quality compared to, say, APS-C lenses.

Only if you define quality as field of view.

For light-gathering ability and background separation/bokeh, you need a lower f/number on APS-C than on full-frame to be equivalent: A 35mm f/1.2 lens on a 24MP APS-C sensor will take pictures that look nearly identical to a 52.5mm f/1.8 lens on a 24MP full-frame sensor. (Assuming crop factor of 1.5.) Both will have an aperture size of 29.17mm (= 35mm/1.2 = 52.5mm/1.8) and will capture a 37.9° x 25.8° FoV.
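The equivalence claim above checks out numerically; this sketch uses an exact 1.5x crop (i.e. a 24mm-wide APS-C sensor), as the comment does:

```python
import math

def aperture_diameter_mm(focal_mm: float, f_number: float) -> float:
    # The f-number is focal length divided by entrance-pupil diameter.
    return focal_mm / f_number

def hfov_deg(focal_mm: float, sensor_w_mm: float) -> float:
    # Horizontal field of view from focal length and sensor width.
    return math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_mm)))

print(round(aperture_diameter_mm(35, 1.2), 2))    # 29.17mm on APS-C
print(round(aperture_diameter_mm(52.5, 1.8), 2))  # 29.17mm on full frame
print(round(hfov_deg(35, 24), 1))                 # ~37.8 degrees (24mm-wide APS-C)
print(round(hfov_deg(52.5, 36), 1))               # ~37.8 degrees (36mm-wide FF)
```

Same aperture diameter, same field of view: the two setups gather the same total light and render the same background blur, which is exactly the point being made.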

Almost all important properties of lenses are determined by field of view and the aperture diameter: Amount of light gathered, background blur, diffraction, and weight.

The illumination-per-area on the full-frame sensor will be 2.25x lower, but the area of the sensor is 2.25x larger so it cancels out such that both sensors will receive the same number of photons.

Background blur is determined by aperture diameter, field of view, and the distances to the subject and background. Since the two lenses have the same aperture size and field of view, you'd get the same amount of background blur for a given scene.

For many lenses (particularly telephoto lenses), the size and weight are primarily determined by the size of the front element, which needs to be at least as big as the aperture. For wide-angle lenses, you start needing a front element that's significantly wider than your aperture for geometry reasons -- the subject has to be able to see the aperture through the front element, so that relationship breaks down.

(Also with lenses where focal length << flange distance, you start to need extra optics to project the image back far enough. This can mean that a wide-angle lens can be more complicated to build for APS-C than for full-frame on the same mount. Take for example the Rokinon 16mm f/2 at 710g / 87mm long versus the Nikon AF-D 24mm f/2.8 at 268g and 46mm long. This isn't relevant to phone cameras, since those don't need to fit a moving mirror between the sensor and the lens like SLRs do. Phone camera makers can put the lens exactly as far from the sensor as makes sense for their design.)

Slow telephoto lenses for DSLRs are pretty much the only place where crop sensors have an advantage. DSLR autofocus sensors generally need f/5.6 or better. Thus, for a given field of view, you need a bigger aperture + front element for the full-frame lens than the "equivalent" crop-sensor lens -- e.g. a 300 f/5.6 with its 53.6mm front element is going to be heavier than a 200 f/5.6 with its 35.7mm front element. However, as mentioned above, the 300 f/5.6 on a full-frame camera will gather 2.25x as much light as the 200/5.6 on the APS-C sensor. Mirrorless cameras can typically autofocus with smaller relative apertures. This is why you see Sony selling an f/8 zoom and Canon selling f/11 primes for their mirrorless mounts -- this sort of lens just wasn't possible on DSLRs. On mirrorless, you could have a 300 f/8.4 full-frame lens that would be truly equivalent to the 200mm f/5.6 APS-C lens.

vladvasiliu 6 days ago | parent [-]

You're absolutely right, but what's depressing is how few people understand this. I'd say the problem with your point is that it's too practical and involves people going out and trying to produce some artistic expression.

Most people enjoy chasing measurable specs and don't stop to understand what they're actually doing. So they'll go compare a 4/3 sensor's output at iso x to a full frame sensor at the same iso. They won't stop to think about what they're trying to achieve. If they want the same depth of field, they won't be able to use the same aperture. So, out in the field, something has to give. Either lengthen the exposure or raise the ISO. If we're talking high ISOs, you probably can't shoot much slower, so higher ISO it is. Differences are then much less shocking.

The other extreme is people chasing paper-thin focus, which, I guess, isn't as easy to obtain on smaller sensors. Yet, for some reason, they won't go to a larger format, either...

Espressosaurus 6 days ago | parent | next [-]

It's a market problem. If I could get the lenses I wanted in APS-C format, I'd have an APS-C camera as my main camera. Instead the market has chosen for full frame to be the main place investment is done in, so I get a full frame camera since the APS-C cameras and lenses are second-class citizens (not true for Fuji, true for Canon, Nikon, and Sony).

Medium format explodes the cost and again, the lenses I want aren't even available.

So you go for what you can get, given the marketplace and also given the lens system you have bought into.

I doubt anyone is going wildlife shooting with a large format camera, for example.

tristor 6 days ago | parent [-]

> I doubt anyone is going wildlife shooting with a large format camera, for example.

Not with true large format, but with the new Fuji medium format cameras it's starting to become reasonably possible to do faster work like wildlife at larger format sizes. The main issue remains, which is sensor readout speed, but the technology has gotten so much better that you can get results with things like birds-in-flight that are comparable to a FF DSLR camera from 10 years ago, with MF now, as far as speed, but at 3x-5x the effective resolution.

Cost is still prohibitive though, I recently upgraded and really considered the new Fuji 100MP MF line, but ended up with a Nikon Z8 in the end for wildlife. On my next iteration, I'll probably bite the bullet and go MF. If I could double the resolution and get similar speed, it'd be worth it, IMO. Especially at the sizes I typically print.

dagmx 5 days ago | parent [-]

I’d also just add that Fuji has some of the worst autofocus on the market right now. Going between my Fuji and Sony bodies, I realized how much I took my Sony AF for granted.

If Sony would make a MF body, I’d be all in.

strogonoff 4 days ago | parent [-]

I immediately felt the heft after switching to FF, and that while specifically choosing lighter and smaller used primes. Do light & small primes exist for MF at all? Can you realistically casually carry an MF setup with a few lenses, or is that basically a car-only ordeal (and good luck flying commercial with it)?

strogonoff 6 days ago | parent | prev [-]

> The other extreme is people chasing paper-thin focus, which, I guess, isn't as easy to obtain on smaller sensors.

Really? I used a nice longer (maybe 80mm equivalent) lens on an APS-C system a while ago, and it gave very shallow DoF while being much lighter and more practical (cheaper, etc.) compared to what I can find for a full frame. Not going to look up the physics but I was under the impression that shallow DoF is easier on smaller sensors (I don’t mean phone small, just crop small).

vladvasiliu 5 days ago | parent [-]

Sure. But now go compare a FF 80mm with the same aperture as your 50 or similar lens used on the APS-C. The depth of field will be shallower on the FF.

> I was under the impression that shallow DoF is easier on smaller sensors (I don’t mean phone small, just crop small).

It's the reverse. These things are continuous. There's no reason for it to be easier one way, then all of a sudden stop and become increasingly difficult. Otherwise, there would be no need for shenanigans with "portrait mode".

My iPhone 14 Pro's main camera is an equivalent 24mm f/1.78. It has way more in focus than my m4/3 12/2 (also a 24mm equivalent).
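A rough sketch of why that is, using the thin-lens near/far depth-of-field approximation; the circle-of-confusion values, the ~6.9mm actual focal length, and the ~3.5x phone crop factor are assumed illustrative numbers:

```python
# Approximate total depth of field for a subject well inside the hyperfocal
# distance: DoF ~= 2 * N * c * s^2 / f^2 (N = f-number, c = circle of
# confusion, s = subject distance, f = actual focal length).
def approx_dof_mm(focal_mm: float, f_number: float, coc_mm: float, subject_mm: float) -> float:
    return 2 * f_number * coc_mm * subject_mm**2 / focal_mm**2

# Assumed: phone main camera ~6.9mm f/1.78, CoC scaled down by ~3.5x crop;
# m4/3 12mm f/2 with the conventional ~0.015mm CoC; subject at 2m.
phone = approx_dof_mm(6.9, 1.78, 0.03 / 3.5, 2000)
m43 = approx_dof_mm(12, 2.0, 0.015, 2000)
print(round(phone), round(m43))  # the phone keeps noticeably more in focus
```

Despite the nominally brighter f/1.78, the phone's much shorter actual focal length dominates the f² term, so it renders a deeper depth of field, hence the "portrait mode" shenanigans.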

strogonoff 5 days ago | parent [-]

> But now go compare a FF 80mm with the same aperture as your 50 or similar lens used on the APS-C. The depth of field will be shallower on the FF.

I did not see it becoming shallower after moving to FF, but then I was using different lenses on FF (cheaper older manuals). That probably sums up what I mean.

Sure, when you account for crop and adjust for the same framing, then DoF will in fact be deeper on the crop in practice. However, even more practically, there are small and affordable f/1.x lenses made for crops, whereas the brightest 100mm I used on FF is f/2.8 or so, because going lower they get really big and/or really expensive. So, for equivalent cost/size/weight, you may have an easier time getting shallow DoF on a crop if you buy lenses specifically for crops (which admittedly limits your choice of glass).

dagmx 6 days ago | parent | prev | next [-]

I specifically said “equivalent focal length”. Equivalent focal lengths are relative to a 35mm sensor unless otherwise specified, and the actual focal length reduces with sensor size providing the same fov.

By having a tiny sensor, the current iPhone pro has a range of 15-120mm.

6 days ago | parent | prev | next [-]
[deleted]
jeswin 6 days ago | parent | prev [-]

Yes, periscope lenses are fairly common on phones. 10x "optical zoom".

nateroling 6 days ago | parent | prev | next [-]

Looking at the trees in the background of the first photo, it’s clear he’s using a longer focal length on the non-iPhone.

He has some good points, maybe, but in general it’s a pretty naive comparison.

isodev 6 days ago | parent | prev | next [-]

I think the physical parameters of the lenses are negligible compared to the distortions caused by computational photography and the colour changes iPhones tend to add to make photos more instagramable by default.

Narew 6 days ago | parent [-]

Some of the distortion shown in the article is called "Volume Anamorphosis". It's a distortion that strongly deforms faces and people, and it is really visible at short focal lengths.

Disclaimer: I work for a photo processing software.

geldedus 6 days ago | parent | prev | next [-]

they don't even know what "bokeh" means

dehrmann 6 days ago | parent | prev | next [-]

Does anyone have experience with aftermarket add-on lenses? Theoretically, they can help with the focal length.

wisty 6 days ago | parent | prev [-]

Um, I'm pretty sure a 24mm shot on a full frame camera will look the same as an iPhone shot, but only if you crop the full frame shot (ignoring pixel counts).

Yes, you could get the same photo of the guy in the centre on the iPhone, but only by zooming in and cropping out everything else. I guess if you REALLY wanted you could run back, and zoom in. Better get a tripod to hold it steady since you're zooming in then.

So anyone but an expert will shoot with a much shorter lens when using the iPhone.

This is how crop factors work unless I'm really mistaken.

dagmx 6 days ago | parent [-]

Please note that I specifically said “equivalent focal length”.

A 24mm equivalent will have almost the exact same perspective on any sized sensor, because that’s what equivalent means. It’s a relationship of sensor size to actual focal length.

A 16mm on a 1.5x APS-C is a 24mm equivalent on a 35mm. The iPhone's base lens is something like 1.5mm, but when related to its sensor, it's roughly a 24mm equivalent.

There’s no cropping that needs to happen.