agilob · 2 days ago
An aircraft company discovered that it was cheaper to fly its planes with less fuel on board. The planes would be lighter and use less fuel, and money was saved. On rare occasions, however, the amount of fuel was insufficient, and the plane would crash. This problem was solved by the engineers of the company by the development of a special OOF (out-of-fuel) mechanism. In emergency cases a passenger was selected and thrown out of the plane. (When necessary, the procedure was repeated.) A large body of theory was developed and many publications were devoted to the problem of properly selecting the victim to be ejected. Should the victim be chosen at random? Or should one choose the heaviest person? Or the oldest? Should passengers pay in order not to be ejected, so that the victim would be the poorest on board? And if, for example, the heaviest person was chosen, should there be a special exception in case that was the pilot? Should first class passengers be exempted? Now that the OOF mechanism existed, it would be activated every now and then, and eject passengers even when there was no fuel shortage. The engineers are still studying precisely how this malfunction is caused.
self_awareness · 2 days ago
I'm wondering which overcommit strategy this example refers to. If my bitcoin price checker built on Electron starts allocating all the memory on the machine, then (assuming no overcommitting takes place) some arbitrary process (e.g. systemd) can get a malloc error. But it's not systemd's fault the memory got eaten, so why is it being punished for low-memory conditions? It's like choosing a random person to be ejected from the plane.
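For the concrete knob: on Linux this is the vm.overcommit_memory sysctl (0 = heuristic, the default; 1 = always overcommit; 2 = never overcommit). A minimal C sketch of the difference from the allocator's side; the 1 GiB chunk size and the greedy loop are just for illustration, not anyone's real checker:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Allocate 1 GiB at a time and touch it, to show where
       the failure actually surfaces. */
    int main(void) {
        const size_t chunk = (size_t)1 << 30;  /* 1 GiB per attempt */
        for (int i = 0; ; i++) {
            void *p = malloc(chunk);
            if (p == NULL) {
                /* With vm.overcommit_memory=2 this branch is hit
                   reliably: the greedy process itself gets the
                   allocation failure and can handle it. */
                fprintf(stderr, "allocation %d failed\n", i);
                return 1;
            }
            /* Touching the pages is what actually commits them.
               Under heuristic overcommit (mode 0) the malloc above
               may succeed anyway, and this write is where the kernel
               runs out and the OOM killer picks a victim -- which is
               not necessarily this process. */
            memset(p, 1, chunk);
            printf("committed %d GiB\n", i + 1);
        }
    }

Under mode 2 the error lands on whoever asks for memory next, which may also be an innocent process; overcommit just moves the decision from malloc time to page-touch time, where the kernel has to pick someone.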
saghm · 2 days ago
I get that this is humorous, but it seems to illustrate exactly why this strategy is useful in the first place: memory is not human life, does not feel pain, and can even be resurrected from swap (which takes some extra time, but is far less of an issue than the corresponding problem for humans). If the strongest objection to the system is that it can't be ethically generalized to managing people instead of memory, I'm happy with it. I don't care that it's inelegant if it works well enough for me in practice when the "victims" are arbitrary values in memory.