| ▲ | pjmlp 3 hours ago |
| Interesting read; however, as someone from the same age group as Casey Muratori, this does not make much sense. > The "immediate mode" GUI was conceived by Casey Muratori in a talk over 20 years ago. He may have made it known to people not old enough to have lived through the old days, but this is how we used to program GUIs on 8- and 16-bit home computers, and it has always been a thing on game consoles. |
|
| ▲ | JimDabell an hour ago | parent | next [-] |
| I think this is the source of the confusion: > To describe it, I coined the term “Single-path Immediate Mode Graphical User Interface,” borrowing the “immediate mode” term from graphics programming to illustrate the difference in API design from traditional GUI toolkits. — https://caseymuratori.com/blog_0001 Obviously it’s ludicrous to attribute “immediate mode” to him. As you say, it’s literally decades older than that. But it seems like he used immediate mode to build a GUI library and now everybody seems to think he invented immediate mode? |
|
| ▲ | kllrnohj an hour ago | parent | prev | next [-] |
| There are also good reasons that immediate mode GUIs are largely only ever used by games: they are absolutely terrible for regular UI needs. Since Rust gaming is still largely non-existent, it's hardly surprising that things like 'egui' are similarly struggling. That isn't (or shouldn't be) any reflection on whether or not Rust GUIs as a whole are struggling. Unless, that is, the Rust ecosystem made the easily predicted terrible choice of rallying behind immediate mode GUIs for generic UIs... |
|
| ▲ | Jtsummers 2 hours ago | parent | prev | next [-] |
| It's like the common claim that data-oriented programming came out of game development. It's ahistorical, but a common belief. People can't see past their heroes (Casey Muratori, Jonathan Blow) or the past decade or two of work. |
| |
| ▲ | dijit an hour ago | parent | next [-] | | I partly agree, but I think you're overcorrecting. Game developers didn't invent data-oriented design or performance-first thinking. But there's a reason the loudest voices advocating for them in the 2020s come from games: we work in one of the few domains where you literally cannot ship if you ignore cache lines and data layout. Our users notice a 5ms frame hitch, while web developers can add another React wrapper and still ship. Computing left game development behind. Whilst the rest of the industry built shared abstractions, we worked in isolation with closed tooling. We stayed close to the metal because there was nothing else. When Casey and Jon advocate for these principles, they're reintroducing ideas the broader industry genuinely forgot, because for two decades those ideas weren't economically necessary elsewhere. We didn't preserve sacred knowledge. We just never had the luxury of forgetting performance mattered, whilst the rest of computing spent 20 years learning it didn't. | | |
| ▲ | Jtsummers an hour ago | parent [-] | | > I think you're overcorrecting. I don't understand this part of your comment; it seems like you're replying to some other comment, or to something that isn't in mine. How am I overcorrecting? A statement of fact, that game developers didn't invent these things even though that's a common belief, is not an overcorrection. It's just a correction. | | |
| ▲ | dijit 22 minutes ago | parent [-] | | Ah, I read your comment as "game devs get too much credit for this stuff and people are glorifying Casey and Jon" and ran with that, but you were just correcting the historical record. My bad. I think we're aligned on the history; I was making a point about why they're prominent advocates today (and why people are attributing invention to them) even though they didn't invent the concepts. |
|
| |
| ▲ | moregrist 32 minutes ago | parent | prev [-] | | It clearly didn’t come out of game dev. Many people doing high-performance work on either embedded or “big silicon” (amd64) in that era were fully aware of the importance of locality, branch prediction, etc. But game dev, and in particular Mike Acton, did an amazing job of making it more broadly known. His CppCon talk from 2014 [0] is IMO one of the most digestible ways to start thinking about performance in high-throughput systems. In terms of heroes, I’d place Mike Acton, Fabian Giesen [1], and Bruce Dawson [2] at the top of the list. All solid performance-oriented people who’ve taken real time to explain how they think and how you can think that way as well. I miss being able to listen in on gamedev Twitter circa 2013 before all hell broke loose. [0] https://youtu.be/rX0ItVEVjHc?si=v8QJfAl9dPjeL6BI [1] https://fgiesen.wordpress.com/ [2] https://randomascii.wordpress.com/ |
|
|
| ▲ | piker 3 hours ago | parent | prev | next [-] |
| I mean, fair enough, but [at least] Wikipedia agrees with that take. > Graphical user interfaces traditionally use retained mode-style API design,[2][5] but immediate mode GUIs instead use an immediate mode-style API design, in which user code directly specifies the GUI elements to draw in the user input loop. For example, rather than having a CreateButton() function that a user would call once to instantiate a button, an immediate-mode GUI API may have a DoButton() function which should be called whenever the button should be on screen.[6][5] The technique was developed by Casey Muratori in 2002.[6][5] Prominent implementations include Omar Cornut's Dear ImGui[7] in C++, Nic Barker's Clay[8][9] in C and Micha Mettke's Nuklear[10] in C. https://en.wikipedia.org/wiki/Immediate_mode_(computer_graph... [Edit: I'll add an update to the post to note that Casey Muratori simply “coined the term” but that the technique predates his video.] |
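To make the quoted CreateButton()/DoButton() contrast concrete, here is a rough sketch of the two API shapes in Rust. Everything in it (RetainedUi, ImmediateUi, do_button, and so on) is a hypothetical illustration of the pattern, not any real toolkit's API:

```rust
// Retained mode: widgets are objects created once, stored by the
// toolkit, and mutated or destroyed later. `Button` and `RetainedUi`
// are hypothetical illustrations, not a real library's API.
struct Button {
    label: String,
    on_click: Box<dyn FnMut()>, // callback invoked by the toolkit
}

struct RetainedUi {
    buttons: Vec<Button>, // persistent widget tree owned by the toolkit
}

impl RetainedUi {
    fn create_button(&mut self, label: &str, on_click: Box<dyn FnMut()>) {
        self.buttons.push(Button { label: label.to_string(), on_click });
    }
}

// Immediate mode: no widget objects outlive the frame. `do_button`
// draws the button *and* reports clicks in a single call, every frame.
struct ImmediateUi {} // would hold the per-frame draw list and input state

impl ImmediateUi {
    fn do_button(&mut self, _label: &str) -> bool {
        // a real version would lay out and draw `_label`, then return
        // whether the mouse clicked inside its rect this frame
        false
    }
}

fn frame(ui: &mut ImmediateUi, count: &mut i32) {
    // called once per frame; the `if` itself is the event handler
    if ui.do_button("increment") {
        *count += 1;
    }
}

fn main() {
    let mut ui = ImmediateUi {};
    let mut count = 0;
    frame(&mut ui, &mut count); // one frame of the loop
    println!("count = {count}");
}
```

The practical difference: immediate-mode code has no callbacks or widget lifetimes to manage, because the `if` around `do_button` is the entire event handler. The cost is that layout and hit-testing are redone every frame.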
| |
| ▲ | pjmlp 2 hours ago | parent | next [-] | | Dig out the source code for any Atari, Spectrum, or Commodore 64 game written in assembly, or for an early PC game, and you will see which information is more accurate. | |
| ▲ | piker 2 hours ago | parent [-] | | Yeah no doubt you're correct. I wasn't disagreeing - just establishing the reasonableness of my original statement. I must have read it in the Dear ImGui docs somewhere. |
| |
| ▲ | vodou 2 hours ago | parent | prev | next [-] | | I am pretty sure there are people here qualified enough to edit that Wikipedia page in a proper way. | |
| ▲ | arandomhuman 2 hours ago | parent | prev [-] | | Wikipedia clearly has never been shown to have faults regarding accuracy. | | |
|
|
| ▲ | PKop 2 hours ago | parent | prev [-] |
| > He may have made it known to people Yes, he coined the term rather than inventing the technique. |
| |
| ▲ | adastra22 an hour ago | parent | next [-] | | He definitely did not name it. IRIS GL was termed “immediate mode” back in the ’80s. | |
| ▲ | andypants 39 minutes ago | parent [-] | | He coined the term in the context of UI by borrowing an existing term from graphics programming. Drawing that parallel was the point. | |
| ▲ | adastra22 9 minutes ago | parent [-] | | It might be more accurate to say that he repopularized the term among a new generation of developers. Immediate vs retained mode UI was just as much a thing in early GUIs. It was a swinging pendulum. At first everything was immediate mode because video RAM was very scarce. Initially there was only enough VRAM for the frame buffer, and hardly any system RAM to spare. But once both categories of RAM started growing, there was a movement to switch to retained mode UI frameworks. It wasn’t until the early ’00s that GPUs and SIMD extensions tipped the scales in the other direction: it was faster to just re-render as needed rather than track all these cached UI buffers, and it allowed for dynamic UI motifs “for free.” My graying beard is showing, though, as I did some game dev in the late ’90s on 3Dfx hardware, and learned UI programming on Win95 and System 7.6. Get off my lawn. |
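A rough sketch of the caching tradeoff described above, again in Rust with hypothetical types rather than any particular toolkit's API:

```rust
// Retained style: cache each widget's rasterized pixels and redraw
// only what is marked dirty; saves redraw work, costs memory and
// invalidation bookkeeping. All types here are hypothetical.
struct CachedWidget {
    pixels: Vec<u8>, // cached rasterization of this widget
    dirty: bool,     // set whenever the widget's state changes
}

fn retained_redraw(widgets: &mut [CachedWidget]) {
    for w in widgets.iter_mut() {
        if w.dirty {
            rasterize(&mut w.pixels); // re-render only what changed
            w.dirty = false;
        }
        blit(&w.pixels); // cheap copy of the cached buffer
    }
}

// Immediate style: no caches, no dirty flags; re-render everything
// each frame and let fast hardware absorb the cost.
fn immediate_redraw(widget_count: usize) {
    let mut scratch = Vec::new();
    for _ in 0..widget_count {
        rasterize(&mut scratch);
        blit(&scratch);
    }
}

fn rasterize(pixels: &mut Vec<u8>) {
    pixels.clear(); // stand-in for actual drawing into the buffer
}

fn blit(_pixels: &[u8]) {
    // stand-in for copying the buffer to the screen
}

fn main() {
    let mut widgets = vec![
        CachedWidget { pixels: Vec::new(), dirty: true },
        CachedWidget { pixels: Vec::new(), dirty: false },
    ];
    retained_redraw(&mut widgets);   // only the dirty widget re-rasterizes
    immediate_redraw(widgets.len()); // everything re-rasterizes
}
```

When memory was the scarce resource, the retained path's bookkeeping paid for itself; once GPUs could re-rasterize the whole screen every frame, the immediate path's simplicity won out again.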
|
| |
| ▲ | pjmlp 2 hours ago | parent | prev [-] | | I can't be bothered to go hunting for digital copies of 1980s game development books, but I have my doubts about that. |
|