| ▲ | Rochus 13 hours ago |
| Eniac was indeed impressive and an important milestone. I recommend the 1999 book "ENIAC - The triumphs and tragedies of the world's first computer" by Scott McCartney, which is both interesting to read and very informative. Also the review of the book by the late Jean Bartik, one of the "computers" and thus an eyewitness, is very interesting: https://web.archive.org/web/20221101120020/https://www.amazo.... The article is very US-focused, though, keeping quiet about the fact that German engineer Konrad Zuse completed the Z3 in May 1941, five years before ENIAC, effectively creating the world's first working programmable, fully automatic digital computer. While ENIAC required days of manual cable patching to program, the Z3 was quickly programmed via punched tape ("Lochstreifen"), and Zuse also invented Plankalkül between 1942 and 1945, which is widely recognized as the world's first high-level programming language. The cooperation between Zuse and ETH Zurich eventually led to the first self-compiling compiler and, later, Algol 60 (see "The European Side of the Last Phase of the Development of ALGOL 60" by Peter Naur in ACM SIGPLAN "History of Programming Languages" from 1978). And there was also the British Colossus, which was also a "programmable computer" and successfully utilized vacuum tubes for code-breaking by early 1944. |
|
| ▲ | jcranmer 8 hours ago | parent | next [-] |
| There are three main problems with trying to offer a simple answer to the question of "what is the first computer?" The most obvious of the problems is that a computer isn't a singular technology that springs up de novo, but something that develops from antecedents over a long, messy transition that requires a judgement call as to when the proto-computer becomes an actual computer. That judgement call is obviously going to be biased by the other considerations. Consider, for a more contemporary example, what you would argue is the "first smartphone" or the "first LLM." Personally, I think the ENIAC is still somewhat too proto-computer for my tastes: I'd prefer a "first" that uses binary arithmetic and has stored programs, neither of which is true for the ENIAC. The second major issue is that it's also instructive to look at the candidates' influence on later development. Among the contenders for "first computer," it's unfortunately kinda clear that ENIAC has the most lasting influence. ENIAC's development produced the papers that directly inspired the next generation of machines. Colossus is screwed here because of the secrecy of the code-breaking effort. Meanwhile, Zuse and the Z3 suffer from being on the losing end of WW2. The ABC has a claim here, but it's not clear whether the developers of ENIAC drew influence from the ABC. The final major issue isn't so much an issue by itself but rather something that colors the interpretation of the first two issues: national pride. An American is far more likely to weight the influence and ingenuity of the ENIAC and similar machines to label one of them the "first computer." A UK person would instead prefer to crown Colossus or the Manchester Baby. A German would prefer the Z3. |
| |
| ▲ | alephnil 6 hours ago | parent [-] | | In many ways the ENIAC was more like an FPGA than a computer. It was programmed with patch cables connecting the different computational units, as well as switches, and had no CPU as such. The cables had to be physically rerouted when changing to a new program, which took weeks. My understanding is that it was eventually programmed to emulate a von Neumann machine around 1948/49. As far as I understand, this was done mainly by Jean Bartik based on von Neumann's ideas. If this is correct, it was not a von Neumann machine originally, but it eventually became one, and at approximately the same time as the Manchester Baby. |
|
|
| ▲ | mrob 12 hours ago | parent | prev | next [-] |
| The Z3 was only general purpose by accident, and this was only discovered in 1997 (published 1998). [0] It's only of theoretical interest because the technique required is too inefficient for real-world applications. ENIAC is notable because it was the first intentionally general purpose computer to be built. [0] https://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/documents... |
| |
| ▲ | adrian_b 12 hours ago | parent | next [-] | | I do not think that it is right at all to say "intentionally general purpose computer". ENIAC was built for a special purpose, the computation of artillery tables. It was a bespoke computer built for a single customer: the United States Army's Ballistic Research Laboratory. This is why it was designed as the digital electronic equivalent of the analog mechanical computers that were previously used by the Army, and why it does not resemble at all what is now meant by "general-purpose computer". The computers of Aiken and Zuse were truly intentionally general-purpose; their designers did not have any specific computation in mind, which is why the machines were controlled by a program memory, not by a wiring diagram. What you claim about the Z3 being general purpose by accident does not refer to the intention of its designer, but only to the fact that its instruction set happened to be powerful enough, because at that early time it was not understood which kinds of instructions are necessary for completeness. All the claims made now about ENIAC being general-purpose are retroactive. Only after the war ended and the concept of a digital computer became well understood was ENIAC repurposed to do tasks other than those originally planned. The first truly general-purpose electronic digital computers that were intentionally designed to be so were those based on the von Neumann report. Before the completion of the first of those, there were general-purpose hybrid electronic-electromechanical digital computers, the IBM SSEC being the most important of them, which solved a lot of scientific and technical problems before electronic computers became available. | | |
| ▲ | rootbear 5 hours ago | parent [-] | | A counter-argument is that Mauchly was actually interested in using computers for weather modeling, and I’m sure that influenced the design of ENIAC. He could only get ENIAC funded if it was valuable to the war effort. I’ve read quite a lot about that machine and I’m not aware of any architectural features that were specific to ballistics calculations. This is unlike the British Colossus, another early computer, which was specifically designed for code breaking and wasn’t general purpose. As for the objection that it wasn’t a stored-program machine, I was interested to learn that it was converted to stored-program operation after only two years or so of operation, using the constant table switches as the program store. But the Manchester Baby, which used the same memory for code and data, was more significant in the history of stored-program machines. On the general question of “first computer”, I think the answer is whatever machine you want it to be, if you heap enough conditional adjectives on it. | | |
| ▲ | Rochus 5 hours ago | parent [-] | | > Mauchly was actually interested in using computers for weather modeling and I’m sure that influenced the design of ENIAC True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, Eniac was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties. |
|
| |
| ▲ | Rochus 10 hours ago | parent | prev | next [-] | | > The Z3 was only general purpose by accident ... ENIAC [..] was the first intentionally general purpose computer That's a pretty academic take. Neither Eckert, nor Mauchly, nor Zuse knew about Alan Turing’s 1936 paper when they designed their machines. The classification of ENIAC (and the Z3) as a "universal Turing machine" is entirely a retroactive reinterpretation by later computer scientists. John von Neumann knew the paper and was aware of its significance, but he only turned up in the ENIAC project when the design was complete. By that time, Eckert and Mauchly were already well aware of ENIAC's biggest flaw (the massive effort to reprogram the machine), and in fact they had come up with the stored-program concept which von Neumann later formalized. ENIAC’s funding and primary justification were for the very specific purpose of calculating artillery firing tables for the military. The machine was built for this purpose, which included the features that retroactively led to the mentioned classification. | |
| ▲ | ahartmetz 12 hours ago | parent | prev [-] | | Still feels like history written by the victors (of WW2 and computing, eventually) in this case. If you want to be mathematically precise, it's been proven to be Turing-complete. If you want to use common sense (IMO better), it was one of the most significant leaps in automated computation and simply didn't need to do more for its intended applications. For conditional branches to make sense, you also need a fast temporary storage array (since it would be awfully slow running directly off tape like a Turing machine), and to realize that all that effort makes sense, you first need to play with a computer for a while and discover the new possibilities. |
|
|
| ▲ | 13 hours ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | yaakov34 9 hours ago | parent | prev [-] |
| The Z3 was not a general purpose computer; it was a calculator that performed a predetermined sequence of operations that was written to its tape. It was remarkable for being all-binary in an era when differential gears and cams were very common in computing devices, and it had some other advanced features. But the 1990s article that declared it Turing-complete is just silly. The argument would apply to every four-function calculator that supports rounding, and programming a computer like that is not just "impractical" - both the tape and the execution time would grow exponentially in the number of branches - and it is not the model that Turing proposed. The whole point of Turing's (theoretical) device is that a short program using the abilities of that device could perform unlimited computations; if you make the program length unlimited instead, that's a much less interesting model of computation. The problem is that anything that gets into Wikipedia becomes ingrained in the Internet's collective mind, which then can't be changed. |
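For readers wondering what "Turing-complete without conditional branches" even means here: the core trick in Rojas's argument is that a straight-line program can evaluate *both* arms of a conditional and then select one result with plain arithmetic. A minimal sketch in Python (my own illustration of the idea, not Rojas's exact Z3 encoding; the 0/1 flag computation is a stand-in for the arithmetic sign test his construction uses):

```python
def select(flag, a, b):
    # flag must be 0 or 1; returns a if flag == 1 else b,
    # using only multiplication and addition - no branch instruction.
    return flag * a + (1 - flag) * b

def abs_branchfree(x):
    # Both "branches" are always evaluated (this is why the technique
    # is so inefficient: program length and run time grow with every
    # possible path, exponentially in the number of nested branches).
    flag = int(x < 0)            # 1 if negative, 0 otherwise
    return select(flag, -x, x)   # arithmetic selection of one arm
```

This is why the parent comments call the result "only of theoretical interest": the machine never skips work, it just arithmetically discards the arm it didn't need.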
| |
| ▲ | ogogmad 6 hours ago | parent [-] | | Would it not have been easy to add branch instructions to it? Just rewind the instruction tape however many places. It seems 99% of the job was done. |
|