measurablefunc | 2 days ago
Decompression is equivalent to executing code on a specialized virtual machine, so it should be possible to automate the search for "small" programs that generate "large" outputs. It could even make an interesting AI benchmark.
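A quick illustration of that asymmetry, as a minimal Python sketch using the standard zlib module: a DEFLATE stream is effectively a tiny "program" for the decompressor, and a kilobyte-scale input can emit megabytes of output.

```python
import zlib

# A "small program" for the DEFLATE virtual machine: compress 10 MB
# of zeros into a payload on the order of 10 KB. The decompressor
# "executes" this payload back into the full 10 MB.
payload = zlib.compress(b"\x00" * 10_000_000, level=9)
output = zlib.decompress(payload)

print(f"{len(payload)} bytes in, {len(output)} bytes out")
```

The ~1000:1 ratio here is close to DEFLATE's theoretical maximum; finding compact "programs" for less degenerate outputs is the interesting search problem.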
shakna | 2 days ago
Many of them already do this. [0] It is a much easier problem to solve than you would expect. No need to drag in a data centre when heuristics can get you close enough.

[0] https://sources.debian.org/patches/unzip/6.0-29/23-cve-2019-...
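In the spirit of that patch (the actual unzip fix uses its own ratio heuristic; this is an illustrative sketch, and the limit values are arbitrary), the defence is simple: cap the output while inflating and bail out the moment the cap is exceeded.

```python
import zlib

def bounded_decompress(data: bytes, max_output: int) -> bytes:
    """Decompress, but refuse to produce more than max_output bytes.

    A simple bomb guard: ask zlib for at most max_output + 1 bytes of
    output; if we got past the cap, or compressed input is left over
    because the cap stopped us, treat the stream as hostile.
    """
    d = zlib.decompressobj()
    out = d.decompress(data, max_output + 1)
    if len(out) > max_output or d.unconsumed_tail:
        raise ValueError("output exceeds limit; refusing to inflate")
    return out
```

Usage: `bounded_decompress(stream, 4096)` accepts any honest small payload and rejects a megabyte-scale bomb without ever materializing it.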
bikeshaving | 2 days ago
My guess is that this is a subset of the halting problem (does this program accept input whose decompression never halts?), and is therefore beautifully undecidable. You are free to leave zip/tgz/whatever fork bombs as little mines in your filesystems for living-off-the-land advanced persistent threats.
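The classic "mine" is the nested archive: a zip inside a zip inside a zip, so each unsuspecting extraction layer spawns another. A toy sketch of both sides, using only the standard zipfile module (the function names and the depth limit of 10 are illustrative choices, not any real tool's defaults):

```python
import io
import zipfile

def make_nested_zip(depth: int) -> bytes:
    """Build a zip nested inside itself `depth` times: a toy version
    of the archive 'mines' described above."""
    data = b"boom"
    for i in range(depth):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
            z.writestr(f"layer{i}.zip" if i else "payload.txt", data)
        data = buf.getvalue()
    return data

def nesting_depth(data: bytes, limit: int = 10) -> int:
    """Follow nested zips, refusing to go past `limit` levels.

    A depth cap side-steps the undecidability: we never try to prove
    the archive is safe, we just stop digging after `limit` layers.
    """
    depth = 0
    while zipfile.is_zipfile(io.BytesIO(data)):
        if depth >= limit:
            raise ValueError("nesting limit exceeded: possible archive bomb")
        with zipfile.ZipFile(io.BytesIO(data)) as z:
            data = z.read(z.namelist()[0])
        depth += 1
    return depth
```

The defender cannot decide in general whether an archive terminates in something harmless, so practical tools fall back on blunt resource limits like this one.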