| ▲ | On the Design of Programming Languages (1974) [pdf](web.cs.ucdavis.edu) |
| 67 points by jruohonen 3 days ago | 23 comments |
|
| ▲ | carlos256 5 hours ago | parent | next [-] |
| He has some good points. This one is from a different paper (Good Ideas, Through the Looking Glass): Designers had ignored both the issue of efficiency and that a language serves the human reader, not just the automatic parser. If a language poses difficulties to parsers, it surely also poses difficulties for the human reader. Many languages would be clearer and cleaner had their designers been forced to use a simple parsing method. |
|
| ▲ | wolvesechoes 5 hours ago | parent | prev | next [-] |
Who are the Wirths, Dijkstras, Hoares, McCarthys, and Kays of today? I mean: who represents the current generation of such thinkers? Genuinely asking. Most of what I see here and elsewhere is blog posts, videos, and rants by contemporary "dev influencers" and bloggers (some of them very skilled and capable of course, often more so than I am), but I would like to be in touch with something more thoughtful and challenging. |
| ▲ | cmontella 2 hours ago | parent | next [-] | | Contemporary PL designers who have inspired my programming language design journey the most are people like Chris Granger (Eve), Jamie Brandon (Eve/Imp/others), Bret Victor (Dynamicland), Chris Lattner (Swift / Mojo), Simon Peyton Jones (GHC/Verse), Rich Hickey (Clojure), and Jonathan Edwards (Subtext). My favorite researcher is Amy J. Ko for her unique perspective on the nature of languages. Check out her language "Wordplay" which is very interesting. | | | |
| ▲ | artemonster 4 hours ago | parent | prev [-] | | very hot and edgy take: theoretical CS is vastly overrated and useless. As someone who actively studied the field, worked on contemporary CPU architectures, and still does some casual PL research: aside from VERY FEW instances of theoretical CS about graphs/algorithms, there has been little to zero impact on practical developments in the overall field since the 80s. All modern-day Dijkstras produce slop research about weaving dynamic context into Java programs, converting funds into garbage papers. Deeper CS research is totally lost in type gibberish or nonsense formalisms. IMO research and science overall are in a deep crisis, and I can clearly see it from the CS perspective | | |
| ▲ | cjfd 2 hours ago | parent | next [-] | | Well, I think there is something to it. Computers were at some point newly invented so research in algorithms suddenly became much more applicable. This opened up a gold mine of research opportunities. But like real life mines at some point they get depleted and then the research becomes much less interesting unless you happen to be interested in niche topics. But, of course, the paper mill needs to keep running and so does the production of PhDs. | |
| ▲ | rramadass 35 minutes ago | parent | prev | next [-] | | > theoretical CS is vastly overrated and useless

> as someone who actively studied the field

Does not compute. Your comment is mere empty verbiage with no information. | |
| ▲ | adrian_b 3 hours ago | parent | prev | next [-] | | I assume that you are talking about modern "theoretical CS", because among the "theoretical CS" papers from the fifties, sixties, seventies, and even some more recent ones, I have found a lot that remains very valuable. I have also seen many modern programmers who either make avoidable mistakes or implement very suboptimal solutions just because they are no longer aware of old research results that were well known in the past. I especially hate those who attempt to design new programming languages today but demonstrate a complete lack of awareness of the history of programming languages, introducing into their languages many design errors that had been discussed decades ago and for which good solutions had been found at the time. Those solutions, however, were implemented in languages that never reached the popularity of C and its descendants, so few know about them today. | |
| ▲ | pjmlp 2 hours ago | parent | prev [-] | | Indeed, we don't really need affine type systems, what use could we get for them in the industry. /s |
|
|
|
| ▲ | notarobot123 3 hours ago | parent | prev | next [-] |
> The key, then, lies not so much in minimising the number of basic features of a language, but rather in keeping the included facilities simple to understand in all their consequences of usage and free from unexpected interactions when they are combined. A form must be found for these facilities which is convenient to remember and intuitively clear to a programmer, and which acts as a natural guidance in the formulation of [their] ideas.

We've successfully found some strong patterns for structuring programs that transform data in various ways, for the kinds of programs Wirth was imagining. The best patterns have proven themselves by being replicated across languages (for example, discriminated unions and pattern matching) and the worst have died away (things like goto and classical inheritance). There's still work to do to find better languages, though. A language is good if it fits the shape of the problem, and while we've found some good patterns for some shapes of problems, there are a lot more problems without good patterns. I had hoped there'd be more languages for everyday end-user problems by now. At the start of the SaaS era it seemed like a lot of services were specific solutions that might fit into a more general modelling language. That hasn't happened yet, but maybe a programming language at just the right level of abstraction could make that possible. |
| ▲ | wolvesechoes 3 hours ago | parent [-] | | > and the worst have died away (things like goto and classical inheritance)

What's so wrong with classical inheritance, and how has it died away while being well supported in most popular programming languages of today (Python, C++, Java, C#, TS, Swift)? | |
| ▲ | Someone 2 hours ago | parent [-] | | Inheritance has its uses, but is easily overused. In a sense, it’s like global variables. About every complex program [1] has a few of them, so languages have to support them, but you shouldn’t have too many of them, and people tend to say “don’t use globals”.

[1] Some languages, such as classical Java, made it technically impossible to create them, but you can effectively create one with:

    class Foo {
        public static int bar;
    }

If you’re opposed to that, you’ll end up making that field non-static and introducing a singleton instance of “Foo”, again effectively creating a global. In some Java circles, programmers will also wrap access to that field in getters and setters, and then use annotations to generate those methods, but that doesn’t make such fields non-global. | | |
| ▲ | wolvesechoes 2 hours ago | parent [-] | | > Inheritance has its uses, but is easily overused.

This I can agree with, but it is far from being the "worst pattern". Everything can be like salt. |
|
|
|
|
| ▲ | augustk 2 hours ago | parent | prev | next [-] |
It's also worth noting that statements like

    for (i = 1; i <= 100; i++) {
        S;
        if (P) {
            break;
        }
    }

are just as bad, since `break' (and `continue' and early `return') are just gotos in disguise. |
| ▲ | matthewkayin 13 minutes ago | parent [-] | | They are just gotos, but does that mean that they are bad (along with their friend try/catch, who is also a goto?), or does that mean that gotos can be useful when used with restraint? Gotos get a bad rep because they become spaghetti when misused. But there are lots of cases where using gotos (or break/continue/early return/catch) makes your code cleaner and simpler. Part of a programmer's job is to reason about code. By creating black-and-white rules like "avoid gotos", we attempt to outsource the thinking required of us to some religious statement. We shouldn't do that. Gotos can be useful and can lead to good code. They can also be dangerous and lead to bad code. But no "rule of thumb" or "programming principle" will save you from bad code. |
|
|
| ▲ | palad1n 7 hours ago | parent | prev | next [-] |
I think the legend goes that Wirth created the Pascal language to be the most easily compilable. To show my age: I recall a class that used Modula-2 when I was in college, also from Wirth and very Pascal-like. |
| ▲ | tristramb an hour ago | parent | next [-] | | I seem to remember (but I can't find the source) that Wirth initially had three aims in designing Pascal:

1. To use it in teaching a structured programming course to new students. Since in the late '60s all student programming was batch mode (submit your program to an operator to run, and pick up the printout the following day), this meant the compiler had to be single-pass and give good error messages.

2. To use it in teaching a data structures course involving new data structures worked out by Wirth and Hoare.

3. To use it in teaching a compilers course. This meant the compiler code had to be clean and understandable. Being single-pass helped in this. | |
| ▲ | pjmlp 5 hours ago | parent | prev | next [-] | | Nowadays you can enjoy it on GCC, as it is now an officially supported frontend, after GNU Modula-2 got merged into it: https://gcc.gnu.org/onlinedocs/gcc-15.2.0/gm2 It is even available on Compiler Explorer to play with: https://godbolt.org/z/ev9Pbxn9K Yes, that was a common trend across all programming languages designed by him. That is also how P-Code came to be: he didn't want to create a VM for Pascal; rather, the goal was to make porting easier. Since only a basic P-Code interpreter was required, it was very easy to port Pascal, a design approach he kept for Modula-2 (M-Code) and Oberon (Slim binaries). | |
| ▲ | zabzonk 6 hours ago | parent | prev [-] | | > most easily compilable I think it was more that it would be easy to write a compiler for, which meant that CS students could write one. Don't have a source for this that I can remember, though. |
|
|
| ▲ | rramadass 38 minutes ago | parent | prev | next [-] |
Related must-read: Wirth's Turing Award lecture, "From Programming Language Design to Computer Construction" (pdf) - http://pascal.hansotten.com/uploads/wirth/TuringAward.pdf |
|
| ▲ | vintagedave 5 hours ago | parent | prev | next [-] |
| I saw on page 25 (the third PDF page) a nice argument against variable shadowing. I can think of a couple of modern languages I wish had learned this ;) |
|
| ▲ | medi8r 5 hours ago | parent | prev [-] |
| Looks like AI slop to me :) |