phkahler 6 days ago
Good points! I was going to say I think 12 bits would have been a nice choice, but yeah, optimizing for circuits is kind of important.
ForOldHack 6 days ago | parent
Brilliant, so 36 bits would be three 12-bit bytes.

"DEC's 36-bit computers were primarily the PDP-6 and PDP-10 families, including the DECSYSTEM-10 and DECSYSTEM-20. These machines were known for their use in university settings and for pioneering work in time-sharing operating systems. The PDP-10, in particular, was a popular choice for research and development, especially in the field of artificial intelligence."

"Computers with 36-bit words included the MIT Lincoln Laboratory TX-2, the IBM 701/704/709/7090/7094, the UNIVAC 1103/1103A/1105 and 1100/2200 series, the General Electric GE-600/Honeywell 6000, the Digital Equipment Corporation PDP-6/PDP-10 (as used in the DECsystem-10/DECSYSTEM-20), and the Symbolics 3600 series. Smaller machines like the PDP-1/PDP-9/PDP-15 used 18-bit words, so a double word was 36 bits."

Oh wait. It's already been done.
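For concreteness, here is a minimal sketch of the arithmetic being discussed: three 12-bit "bytes" packed into one 36-bit word. (The function names and field layout are made up for illustration; historical 36-bit machines more often packed six 6-bit or five 7-bit characters per word.)

```python
MASK12 = (1 << 12) - 1  # 0o7777, the largest 12-bit value

def pack36(b0, b1, b2):
    """Pack three 12-bit values into one 36-bit word, b0 most significant."""
    for b in (b0, b1, b2):
        assert 0 <= b <= MASK12, "each field must fit in 12 bits"
    return (b0 << 24) | (b1 << 12) | b2

def unpack36(word):
    """Split a 36-bit word back into its three 12-bit fields."""
    return (word >> 24) & MASK12, (word >> 12) & MASK12, word & MASK12

w = pack36(0o7777, 0o0001, 0o1234)
assert w < (1 << 36)                              # fits in 36 bits
assert unpack36(w) == (0o7777, 0o0001, 0o1234)    # round-trips cleanly
```

Three such fields exactly fill the word with no bits left over, which is part of why 12-bit bytes look tidy on a 36-bit machine.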