userbinator a day ago

> Everything is an interpretation of the data that the camera has to do

What about this? https://news.ycombinator.com/item?id=35107601

mrandish a day ago | parent | next [-]

News agencies like AP have already published technical standards and guidelines defining 'acceptable' types and degrees of image processing in professional photojournalism.

You can look it up because it's published on the web, but IIRC it's generally what you'd expect. Whole-image processing is fine: operations where every pixel gets the same algorithm, such as the brightness, contrast, color, tint, gamma, levels, cropping, and scaling filters that have been standard for decades. The usual debayering and color-space conversions are also fine. Selectively removing, adding, or changing only some pixels or objects is generally not acceptable for journalistic purposes. Obviously, the per-object AI enhancement that many mobile phones and social media apps apply by default doesn't meet such standards.
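A toy numpy illustration of that distinction (the operations and thresholds are my own, not AP's actual criteria): a whole-image adjustment pushes every pixel through the same curve, so it preserves which pixel is brighter than which, while a selective edit does not.

```python
import numpy as np

# Whole-image processing: one function applied identically to every pixel.
# Selective editing: only some pixels are touched. A sketch of the line the
# AP-style guidelines draw (illustrative only, not their actual wording).
rng = np.random.default_rng(0)
img = rng.uniform(0.1, 0.9, size=(8, 8))

gamma = img ** 0.8                 # global gamma: same rule everywhere
edited = img.copy()
edited[2:5, 2:5] = 1.0             # selective: a patch is simply replaced

# A global point operation is strictly monotone, so the brightness
# ordering of the pixels is unchanged by it.
same_order = np.array_equal(np.argsort(img, axis=None),
                            np.argsort(gamma, axis=None))
print(same_order)                  # True for the global edit
```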

mgraczyk a day ago | parent | prev | next [-]

I think Samsung was doing what was alleged, but as somebody who was working on state-of-the-art camera processing algorithms at a competitor while this was happening, this experiment does not prove it. Gaussian blurring does not remove the information: you can deconvolve it, and it's possible that Samsung's pre-ML super-resolution was essentially the same as inverting a Gaussian convolution.
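To make the deconvolution point concrete, here is a minimal 1-D numpy toy of my own (not Samsung's pipeline): blur with a Gaussian, then invert the blur exactly in the Fourier domain.

```python
import numpy as np

# A Gaussian blur is a convolution, so in the Fourier domain it is a
# pointwise multiply. As long as the kernel's spectrum never reaches zero
# (true for a modest sigma) and noise is negligible, dividing it back out
# recovers the original almost perfectly: the blur hid the detail
# without destroying it.
n = 256
rng = np.random.default_rng(0)
signal = rng.standard_normal(n)          # stand-in for fine image detail

x = np.arange(n)
d = np.minimum(x, n - x)                 # circular distance, for FFT wrap-around
kernel = np.exp(-0.5 * (d / 2.0) ** 2)   # sigma = 2 Gaussian
kernel /= kernel.sum()

K = np.fft.fft(kernel)
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * K))

# Naive inverse filter: divide the spectrum back out.
recovered = np.real(np.fft.ifft(np.fft.fft(blurred) / K))
print(np.max(np.abs(recovered - signal)))   # tiny reconstruction error
```

With real sensor noise you would need a regularized (e.g. Wiener) inverse instead of this bare division, but the principle stands.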

userbinator a day ago | parent [-]

If you read the original source article, you'll find this important line:

> I downsized it to 170x170 pixels

mgraczyk a day ago | parent [-]

And? What algorithm was used for downsampling? What was the high-frequency content of the downsampled image after doing a pseudo-inverse with upsampling? How closely does it match the Samsung output?

My point is that there IS an experiment that would show Samsung is doing some nonstandard processing, likely involving replacement. The evidence provided is insufficient to show that.
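For contrast with the blur case, a 1-D numpy sketch of why the 170x170 downsample matters (the frequencies and factors are toy values of mine): content above the new Nyquist limit is simply gone, and no upsampler can recover it rather than guess at it.

```python
import numpy as np

# Downsampling, unlike a Gaussian blur, throws samples away: any frequency
# above the reduced Nyquist limit is irrecoverably lost (or aliased).
# Below that limit, a downsample/upsample round trip works fine.
n = 1024
t = np.arange(n)
low = np.sin(2 * np.pi * 5 * t / n)     # low frequency: survives 4x decimation
high = np.sin(2 * np.pi * 400 * t / n)  # above the post-decimation Nyquist
signal = low + high

def down4_up(x):
    """Naive 4x box downsample, then linear-interpolation upsample."""
    small = x.reshape(-1, 4).mean(axis=1)
    centers = np.arange(0, len(x), 4) + 1.5  # where each average 'lives'
    return np.interp(np.arange(len(x)), centers, small)

low_err = np.max(np.abs(down4_up(low) - low))         # small: low freq survives
full_err = np.max(np.abs(down4_up(signal) - signal))  # large: high freq is gone
print(low_err, full_err)
```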

Dylan16807 21 hours ago | parent | next [-]

You can upscale a 170x170 image yourself if you're not familiar with what that looks like. The only high-frequency details you have after upscaling are artifacts. This thing pulled real details out of nowhere.

mgraczyk 21 hours ago | parent [-]

That is not true

For example see

https://en.wikipedia.org/wiki/Edge_enhancement

Dylan16807 21 hours ago | parent [-]

That example isn't doing any scaling.

You can try to guess the location of edges to enhance them after upscaling, but it's guessing, and when the source has the detail level of a 170x170 moon photo, a big proportion of the guesses will inevitably be wrong.

And in this case it would take a pretty amazing unblur to even get to the point it can start looking for those edges.

mgraczyk 21 hours ago | parent [-]

You're mistaken, and the original experiment does not distinguish between classic edge-aware upscaling/super-resolution and the more problematic replacement.

Dylan16807 21 hours ago | parent [-]

I'm mistaken about which part? Let's start here:

You did not link an example of upscaling, the before and after are the same size.

Unsharp filters enhance false edges on almost all images.

If you claim either one of those is wrong, you're being ridiculous.
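A numpy sketch of the second claim (a textbook unsharp mask on a clean step edge; the kernel and amount are arbitrary choices of mine):

```python
import numpy as np

# Unsharp masking: add back the difference between the image and a blurred
# copy. On a perfectly clean 0-to-1 step it manufactures overshoot "halos",
# i.e. values darker and brighter than anything in the source. The edge
# looks crisper, but those extremes are invented, not recovered.
n = 100
step = np.where(np.arange(n) < 50, 0.0, 1.0)

padded = np.pad(step, 2, mode="edge")
blurred = np.convolve(padded, np.ones(5) / 5, mode="valid")  # 5-tap box blur

amount = 1.0
sharpened = step + amount * (step - blurred)

print(sharpened.min(), sharpened.max())   # overshoots below 0 and above 1
```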

mgraczyk 17 hours ago | parent [-]

I think if you paste our conversation into ChatGPT, it can explain the relevant upsampling algorithms. There are algorithms that will artificially enhance edges in a way that can look like "AI": for example, everything done on Pixel phones prior to ~2023.

And to be clear, everyone including Apple has been doing this since at least 2017

The problem with what Samsung was doing is that it was moon-specific detection and replacement

userbinator 20 hours ago | parent | prev [-]

You have clearly made no attempt to read the original article, which has a lot more evidence (or are actively avoiding it), and somehow seem to be defending Samsung vociferously but emptily, so you're not worth arguing with. I'll just leave this here:

> I zoomed in on the monitor showing that image and, guess what, again you see slapped on detail, even in the parts I explicitly clipped (made completely 100% white):

mgraczyk 17 hours ago | parent [-]

> somehow seem to be defending Samsung voraciously but emptily

The first words I said were that Samsung probably did this

And you're right that I didn't read the dozens of edits added after the original post. I was basing my arguments on everything before the "conclusion" section, which it seems even the author understands was not actually conclusive.

I agree that the later experiments, particularly the "two moons" experiment, were decisive.

Also, to be clear, I know that Samsung was doing this because, as I said, I worked at a competitor. At the time I ran my own tests on Samsung devices, since I was also working on moon-related image quality.

the_af 7 hours ago | parent | prev [-]

Wow.

From one of the comments there:

> When people take a picture of the moon, they want a cool looking picture of the moon, and every time I have taken a picture of the moon, on what is a couple of year old phone which had the best camera set up at the time, it looks awful, because the dynamic range and zoom level required is just not at all what smart phones are good at.

> Hence they solved the problem and gave you your picture of the moon. Which is what you wanted, not a scientifically accurate representation of the light being hit by the camera sensor. We had that, it is called 2010.

Where does one draw the line, though? This is a kind of lying, regardless of the whole discussion about filters and photos always being an interpretation of raw sensor data and whatnot.

Again, where does one draw the line? The person taking a snapshot of the moon expects a correlation between the data captured by the sensor and whatever they end up showing their friends. What if the camera merely recognized "OK, this user is trying to photograph the moon" and replaced ALL of the sensor data with a library image of the moon stored in its memory? Would that be authentic or fake? It's certainly A photo of the moon, just not a photo taken with this camera. But the user believes it was taken with their camera.

I think this is lying.