kibwen 4 hours ago

> It’s the only kind of program that can be actually reasoned about.

Theoretically infinite memory isn't really the problem with reasoning about Turing-complete programs. In practice, the inability to guarantee that any program will halt still applies to any system with enough memory to do anything more than serve as an interesting toy.

I mean, I think this should be self-evident: our computers already do have finite memory. Giving a program slightly less memory to work with doesn't really change anything; you're still probably giving that statically-allocated program more memory than entire machines had in the 80s, and it's not like the limitations of computers in the 80s made us any better at reasoning about programs in general.

d3ckard 4 hours ago | parent [-]

Yes, but allocations generate an ever-increasing combinatorial space of possible failure modes.

Static allocation requires you to explicitly handle overflows, but by centralizing them, you probably don't need as many handlers.

Technically, all of this can happen in a language with dynamic allocation as well. It’s just that you can’t enforce the behavior.

kibwen 3 hours ago | parent [-]

Sure, but let's be clear: it's a tradeoff. If every program reserved as much memory at startup as needed to service 100% of its theoretically-anticipated usage, the number of programs we could run in parallel would be drastically reduced. That is to say, static allocation by its very nature makes OOM conditions dramatically more likely, because programs greedily sit on unused memory that could be doled out to other processes.