mgraczyk a day ago

I think Samsung was doing what was alleged, but as somebody who was working on state-of-the-art camera-processing algorithms at a competitor while this was happening, this experiment does not prove the allegation. Gaussian blurring does not remove the information: you can deconvolve it, and it's possible that Samsung's pre-ML super-resolution was essentially the same as inverting a Gaussian convolution.
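The invertibility claim can be illustrated with a toy 1-D Wiener-style deconvolution. This is a sketch only: it assumes a known Gaussian kernel and a synthetic signal, and says nothing about Samsung's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "image": a smooth bright feature (illustrative only).
n = 256
x = np.arange(n)
signal = np.exp(-0.5 * ((x - n / 2) / 3.0) ** 2)

# Gaussian blur applied via FFT (circular convolution, for simplicity).
sigma = 2.0
kernel = np.exp(-0.5 * ((x - n // 2) / sigma) ** 2)
kernel /= kernel.sum()
K = np.fft.fft(np.fft.ifftshift(kernel))
blurred = np.fft.ifft(np.fft.fft(signal) * K).real
blurred += 0.001 * rng.standard_normal(n)  # small sensor-like noise

# Wiener-style deconvolution: a regularized inverse of the Gaussian transfer function.
eps = 1e-4  # regularization; a bare inverse (eps = 0) would blow up the noise
restored = np.fft.ifft(np.fft.fft(blurred) * np.conj(K) / (np.abs(K) ** 2 + eps)).real

# The blur visibly flattens the feature; deconvolution recovers most of it.
print("blur error:    ", np.abs(blurred - signal).max())
print("restored error:", np.abs(restored - signal).max())
```

The regularizer `eps` is the catch: a pure inverse exists mathematically, but any noise forces you to give up the highest frequencies, which is where the argument about what's recoverable actually lives.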

userbinator a day ago | parent [-]

If you read the original source article, you'll find this important line:

> I downsized it to 170x170 pixels

mgraczyk a day ago | parent [-]

And? What algorithm was used for downsampling? What was the high-frequency content of the downsampled image after doing a pseudo-inverse with upsampling? How closely does it match the Samsung output?

My point is that there IS an experiment that would show Samsung is doing some nonstandard processing, likely involving replacement. The evidence provided is insufficient to show that.

Dylan16807 21 hours ago | parent | next [-]

You can upscale a 170x170 image yourself, if you're not familiar with what that looks like. The only high frequency details you have after upscaling are artifacts. This thing pulled real details out of nowhere.
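The information-loss argument can be made concrete: two images with different fine texture can produce the exact same 170x170 downsample, so no upscaler can tell them apart. A minimal sketch, assuming 4x4 block-average downsampling (one common choice; the article doesn't say which algorithm was used):

```python
import numpy as np

rng = np.random.default_rng(1)

def downsample_4x(img):
    # 680x680 -> 170x170 by averaging each 4x4 block.
    return img.reshape(170, 4, 170, 4).mean(axis=(1, 3))

def upsample_4x(small):
    # 170x170 -> 680x680 by nearest-neighbor repetition; adds no information.
    return np.repeat(np.repeat(small, 4, axis=0), 4, axis=1)

# Two toy "moons" with different fine texture but identical downsamples:
moon_a = rng.standard_normal((680, 680))
detail = rng.standard_normal((680, 680))
detail -= upsample_4x(downsample_4x(detail))   # zero mean inside every 4x4 block
moon_b = moon_a + detail

assert np.allclose(downsample_4x(moon_a), downsample_4x(moon_b))
assert not np.allclose(moon_a, moon_b)
# Any upscaler -- bicubic, edge-aware, or deconvolution-based -- sees the same
# 170x170 input for both, so the differing surface detail is unrecoverable
# without an external prior.
```

The only way to pick the "right" texture from the shared 170x170 input is a prior about what moons look like, which is exactly the replacement question being argued.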

mgraczyk 21 hours ago | parent [-]

That is not true.

For example, see:

https://en.wikipedia.org/wiki/Edge_enhancement

Dylan16807 21 hours ago | parent [-]

That example isn't doing any scaling.

You can try to guess the location of edges to enhance them after upscaling, but it's guessing, and when the source has the detail level of a 170x170 moon photo, a big proportion of that guessing will inevitably be wrong.

And in this case it would take a pretty amazing unblur to even get to the point it can start looking for those edges.

mgraczyk 21 hours ago | parent [-]

You're mistaken, and the original experiment does not distinguish between classic edge-aware upscaling/super-resolution and more problematic replacement.

Dylan16807 21 hours ago | parent [-]

I'm mistaken about which part? Let's start here:

You did not link an example of upscaling; the before and after are the same size.

Unsharp filters enhance false edges on almost all images.

If you claim either of those is wrong, you're being ridiculous.
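The second point is easy to demonstrate numerically. A minimal 1-D unsharp-mask sketch (box blur standing in for the usual Gaussian) overshoots at the corners of a smooth ramp, manufacturing edge contrast the source never had:

```python
import numpy as np

# A 1-D signal containing only a smooth ramp -- no sharp edge anywhere.
n = 200
sig = np.clip((np.arange(n) - 80) / 40.0, 0.0, 1.0)

# Box blur as the smoothing step (stand-in for the Gaussian in a real unsharp mask).
k = np.ones(9) / 9.0
blurred = np.convolve(sig, k, mode="same")

# Unsharp mask: original plus an amplified difference from the blur.
amount = 1.5
sharpened = sig + amount * (sig - blurred)

# Overshoot ("halos") appears at the ramp's corners: values leave the original
# [0, 1] range, i.e. the filter creates contrast that was never in the source.
print(sharpened.min(), sharpened.max())
```

The overshoot is harmless on photos that already have real edges, but on a detail-starved source it reads as invented structure.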

mgraczyk 17 hours ago | parent [-]

I think if you paste our conversation into ChatGPT, it can explain the relevant upsampling algorithms. There are algorithms that will artificially enhance edges in a way that can look like "AI", for example everything done on Pixel phones prior to ~2023.

And to be clear, everyone, including Apple, has been doing this since at least 2017.

The problem with what Samsung was doing is that it was moon-specific detection and replacement.

userbinator 20 hours ago | parent | prev [-]

You have clearly made no attempt to read the original article, which has a lot more evidence (or you are actively avoiding it), and you somehow seem to be defending Samsung vociferously but emptily, so you're not worth arguing with. I'll just leave this here:

> I zoomed in on the monitor showing that image and, guess what, again you see slapped on detail, even in the parts I explicitly clipped (made completely 100% white):

mgraczyk 17 hours ago | parent [-]

> somehow seem to be defending Samsung vociferously but emptily

The first words I said were that Samsung probably did this.

And you're right that I didn't read the dozens of edits that were added after the original post. I was basing my arguments on everything before the "conclusion" section, which it seems the author now understands was not actually conclusive.

I agree that the later experiments, particularly the "two moons" experiment, were decisive.

Also, to be clear, I know that Samsung was doing this because, as I said, I worked at a competitor. At the time I ran my own tests on Samsung devices, because I was also working on moon-related image quality.