AlotOfReading 32 minutes ago:
You haven't explained what the benefit is. There aren't "spaces we haven't formalized", by the pigeonhole principle: there are M bits, and you can generate every one of those 2^M values with any maximum-cycle permutation. What work is being offloaded from computers to people? It's exactly the same thing, with more determinism and no logarithmic overhead.
pfdietz 14 minutes ago (parent):
> There aren't any "spaces we haven't formalized"

Suppose that a space of N points is partitioned into M relevant subsets, all of the same size for now. Then uniform random sampling hits every one of those subsets after O(M log M) samples in expectation, even if we don't know what the subsets are. This sort of partitioning has long been discussed in the testing literature, with the idea that you should identify the partitions manually.

> what work is being offloaded

The need to write a program that explicitly enumerates the space.
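The O(M log M) figure is the coupon collector bound: with M equal-size subsets, the expected number of uniform draws needed to hit all of them is M·H_M ≈ M ln M. A minimal simulation sketch (function name and trial counts are illustrative, not from the thread):

```python
import math
import random

def draws_to_cover(m, rng):
    """Count uniform random draws until all m subsets have been hit once."""
    seen = set()
    draws = 0
    while len(seen) < m:
        seen.add(rng.randrange(m))  # each draw lands in one of m subsets
        draws += 1
    return draws

# Average over independent trials; expectation is m * H_m ~ m * ln(m).
m = 100
trials = [draws_to_cover(m, random.Random(i)) for i in range(200)]
avg = sum(trials) / len(trials)
harmonic = sum(1 / k for k in range(1, m + 1))
print(avg, m * harmonic)  # empirical mean vs. m * H_m (about 519 for m=100)
```

The log factor is the price of not knowing the partition: explicit enumeration covers M subsets in exactly M steps, while blind sampling needs about M ln M, which is the "logarithmic overhead" the parent comment refers to.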