dmurray | 2 days ago
I think you can make an argument about why this isn't possible at 50:1 odds. A plausible "decompressor" is at least, say, 30 to 100 bytes, so the random file needs to have at least 30 bytes less entropy than expected, and for any one fixed decompressor that happens with probability at most about 256^-30 ≈ 2^-240, vastly below 1/50. Sum over the whole domain of reasonable decompressors, and you still don't get there. This argument could do with more rigor, but I think it's correct. Give me 100 million to 1 odds, though, and I'll take my chances trying to brute-force a compressor.
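A rough sketch of that counting bound, under the same assumptions as above (one fixed decompressor, a ~30-byte minimum size; Python just to make the arithmetic concrete):

    from fractions import Fraction

    def compressible_fraction_bound(k: int) -> Fraction:
        """Upper bound on the fraction of uniformly random N-byte files
        that a single FIXED decompressor can reconstruct from an input
        at least k bytes shorter.

        There are fewer than 256**(N - k + 1) / 255 byte strings of
        length <= N - k, so the decompressor has at most that many
        distinct outputs; dividing by the 256**N equally likely files
        gives a bound that is independent of N.
        """
        return Fraction(256) ** (1 - k) / 255

    # If a plausible decompressor needs ~30 bytes, the payload must be
    # at least 30 bytes shorter than the original file:
    print(float(compressible_fraction_bound(30)))  # ~5.7e-73, far below 1/50

The hand-waving enters when you union-bound over every decompressor the challenger might write after seeing the file; restricting to "reasonable" ones is what keeps that sum small.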
lambdaone | 2 days ago | parent
This is actually an extremely interesting question. 'Weak' files that are more easily compressible than others certainly exist, but with low probability. For example, the all-zeros file is a member of the set of all random 3-megabyte files, and it would certainly be possible to compress it if, by great good fortune, you were lucky enough to receive it - though that is unlikely to happen even once in the lifetime of the universe, given current physical theories. Is it possible to quantify this idea of a 'weak' file more accurately?
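One standard way to quantify it (not from the thread, just the textbook counting argument): call a file "k bytes weak" if some fixed decompressor can rebuild it from an input k bytes shorter; at most a ~256^-k fraction of all files can be that weak, which is the same bound sketched above. The two extremes are easy to see empirically, e.g. with zlib from the Python standard library:

    import os
    import zlib

    # Quick empirical illustration: a maximally "weak" structured file
    # versus a uniformly random file of the same size (1 MiB for speed).
    size = 1 << 20
    for name, data in [("all-zeros", bytes(size)), ("random", os.urandom(size))]:
        compressed = zlib.compress(data, 9)
        print(f"{name}: {size:,} -> {len(compressed):,} bytes")

The all-zeros file collapses to around a kilobyte, while the random file typically comes out slightly larger than it went in.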