saulpw | a day ago
I think the turning point was literally when we started booting into Windows. Before that, you had DOS, which is basically a program launcher and collection of system utilities. You could even run Windows from DOS by typing "win" (which ran "win.com"). You knew everything that was running on your computer, from the drivers you installed in CONFIG.SYS to the TSRs loaded in AUTOEXEC.BAT. But then people started putting "win" at the end of AUTOEXEC.BAT, which is a personal choice, okay. And then Microsoft shipped Windows 95 and inverted the control structure. The computer then ran whatever Microsoft wanted it to run, and you could get a terminal "window" to run DOS commands. But you had ceded control of your computer to Microsoft and the gods of complexity.
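For anyone who never saw that era, a minimal sketch of what those two files might have looked like (exact paths and drivers varied by machine and are illustrative here). The point is that every line was something you put there yourself, with "win" as the literal last step:

    REM CONFIG.SYS -- device drivers loaded at boot
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\MOUSE\MOUSE.SYS
    FILES=30

    REM AUTOEXEC.BAT -- startup commands and TSRs
    @ECHO OFF
    PATH=C:\DOS;C:\WINDOWS
    C:\DOS\DOSKEY.COM
    WIN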
bitwize | a day ago
See the sibling comment to yours; I generally agree with you. The Mac had been booting straight into a GUI since 1984, though as of 1987 it bundled HyperCard, which ameliorated the programmability situation somewhat. With the advent of the GUI, the tide seems to have shifted from "computers are for programming" to "computers are for running normie productivity apps". Nothing wrong with those apps, but unless I have the means to make the machine do what I want, I don't have a computer. I have an appliance: the dream of Gates and Jobs all along. Providing even the means to program the thing, or to control it at a low level, became sort of anathema. Which is kind of ironic, considering that modern GUIs were inspired by Smalltalk, whose programmability ran bone-deep.