xdennis 6 days ago
This is what happens when you write articles with AI (the article specifically mentions ChatGPT). The article says:

> A number of 70s computing systems had nine-bit bytes, most prominently the PDP-10

This is false. If you ask ChatGPT "Was the PDP-10 a 9-bit computer?" it says "Yes, the PDP-10 used a 36-bit word size, and it treated characters as 9-bit bytes." But if you ask any other LLM or look it up on Wikipedia, you see that:

> Some aspects of the instruction set are unusual, most notably the byte instructions, which operate on bit fields of any size from 1 to 36 bits inclusive, according to the general definition of a byte as a contiguous sequence of a fixed number of bits.

-- https://en.wikipedia.org/wiki/PDP-10

So the PDP-10 didn't have 9-bit bytes, but it could support them. Characters were typically 6 bits, but 7-bit and 9-bit characters were also sometimes used.
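To make that concrete, here's a rough Python sketch of what those byte instructions did (LDB and DPB are the real PDP-10 mnemonics; the position/size convention follows the hardware's byte-pointer P and S fields, but this is an illustration, not actual PDP-10 code):

    # Extract or deposit an s-bit "byte" whose low-order end sits
    # p bits from the right of a 36-bit word, for any s from 1 to 36.
    WORD_MASK = (1 << 36) - 1

    def ldb(word, p, s):
        """Load Byte: fetch the s-bit field at position p."""
        return (word >> p) & ((1 << s) - 1)

    def dpb(byte, word, p, s):
        """Deposit Byte: store an s-bit field at position p,
        leaving the rest of the word unchanged."""
        mask = ((1 << s) - 1) << p
        return (word & ~mask & WORD_MASK) | ((byte << p) & mask)

    # e.g. five 7-bit ASCII characters packed at p = 29, 22, 15, 8, 1:
    # ldb(word, 1, 7) fetches the fifth (rightmost) character.

Nothing in the machine fixes s at 9; a "byte" is whatever field width the byte pointer says it is.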
vincent-manis 6 days ago
Actually, the PDP-10 didn't have any byte size at all; it was a word-addressed machine. (An early attempt to implement C on this machine came a cropper because of this.) It did have Load Byte and Deposit Byte instructions, which allowed you to select the byte size. Common formats were Sixbit (self-explanatory), ASCII (five 7-bit bytes and an unused bit), and, more rarely I think, 9-bit bytes.

My first machines were the IBM 7044 (36-bit word) and the PDP-8 (12-bit word), and I must admit to a certain nostalgia for that style of machine (as well as the fact that a 36-bit word gives you some extra floating-point precision), but as others have pointed out, there are good reasons for power-of-2 byte and word sizes.
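For anyone who hasn't seen the ASCII format: here's a minimal Python sketch of that packing, assuming the usual PDP-10 convention of the leftmost character in the high-order bits with bit 35 (the low-order bit) left unused. An illustration of the layout, not how you'd program the machine:

    def pack_ascii(s):
        """Pack up to five 7-bit ASCII characters into a 36-bit
        word, leftmost character in the high-order bits."""
        word = 0
        for ch in s[:5].ljust(5, '\0'):
            word = (word << 7) | (ord(ch) & 0x7F)
        return word << 1  # shift past the unused low-order bit

    def unpack_ascii(word):
        """Recover the five characters, dropping trailing NULs."""
        word >>= 1  # skip the unused low-order bit
        chars = [(word >> (7 * i)) & 0x7F for i in range(4, -1, -1)]
        return ''.join(chr(c) for c in chars).rstrip('\0')

    # pack_ascii("HELLO") -> one 36-bit word; unpack_ascii round-trips it.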