godelski 4 days ago

There's tons of backlash here as if people think better performance requires writing in assembly.

But to anyone complaining, I want to know: when was the last time you pulled out a profiler? When was the last time you saw anyone use a profiler?

People asking for performance aren't pissed you didn't write Microsoft Word in assembly; we're pissed it takes 10 seconds to open a fucking text editor.

I literally timed it on my M2 Air. 8s to open and another 1s to get a blank document. Meanwhile it took (neo)vim 0.1s and it's so fast I can't click my stopwatch fast enough to properly time it. And I'm not going to bother checking because the race isn't even close.

I'm (we're) not pissed that the code isn't optimal, I'm pissed because it's slower than dialup. So take that Knuth quote you love about optimization and do what he actually suggested. Grab a fucking profiler, it is more important than your Big O

nwallin 4 days ago | parent | next [-]

Another datapoint that supports your argument is the Grand Theft Auto Online (GTAO) thing a few months ago.[0] GTAO took 5-15 minutes to start up. Like you click the icon and 5-15 minutes later you're in the main menu. Everyone was complaining about it for years. Years. Eventually some enterprising hacker disassembled the binary and profiled it. 95% of the runtime was in `strlen()` calls. Not only was that where all the time was spent, but it was all spent `strlen()`ing the exact same ~10MB resource string. They knew exactly how large the string was because they allocated memory for it, and then read the file off the disk into that memory. Then they were tokenizing it in a loop. But their tokenization routine didn't track how big the string was, or where the end of it was, so for each token it popped off the beginning, it had to `strlen()` the entire resource file.

The enterprising hacker then wrote a simple binary patch that reduced the startup time from 5-15 minutes to like 15 seconds or something.

To me that's profound. It implies that not only was management not concerned about the start up time, but none of the developers of the project ever used a profiler. You could just glance at a flamegraph of it, see that it was a single enormous plateau of a function that should honestly be pretty fast, and anyone with an ounce of curiosity would be like, ".........wait a minute, that's weird." And then the bug would be fixed in less time than it would take to convince management that it was worth prioritizing.

It disturbs me to think that this is the kind of world we live in. Where people lack such basic curiosity. The problem wasn't that optimization was hard, (optimization can be extremely hard) it was just because nobody gave a shit and nobody was even remotely curious about bad performance. They just accepted bad performance as if that's just the way the world is.

[0] Oh god it was 4 years ago: https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...

godelski 4 days ago | parent | next [-]

I just started getting back into gaming and I'm seeing shit like this all the time. It's amazing that stuff like this is so common while the Quake fast inverse square root algo is so well known.

How is it that these companies spend millions of dollars to develop games, and yet modders can fix bugs in a few hours with patches that never get merged upstream? Not some indie games, but AAA titles!

I think you're right, it's on both management and the programmers. Management only knows how to rush but not what to rush. The programmers fall for the trap (afraid to push back) and never pull up a profiler. Maybe they're overworked and overstressed, but those problems never get solved if no one speaks up and everyone stays quiet and buys into the rushing-for-rushing's-sake mentality.

It's amazing how many problems could be avoided by pulling up a profiler or analysis tool (like Valgrind).

It's amazing how many millions of dollars are lost because no one ever used a profiler or analysis tool.

I'll never understand how their love for money makes them waste so much of it.

bigstrat2003 4 days ago | parent [-]

AAA games are, largely, quite bad in quality these days. Unfortunately, the desire to make a quality product (from the people who actually make the games) is overruled by the desire to maximize profit (from the people who pay their salaries). Indie games are still great, but I barely even bother to glance at AAA stuff any more.

godelski 4 days ago | parent | next [-]

  > by the desire to
An appropriate choice of words.

I'm just wondering if/when anyone will realize that often desire gets in the way of achieving. ̶T̶h̶e̶y̶ ̶m̶a̶y̶ ̶b̶e̶ ̶p̶e̶n̶n̶y̶ ̶w̶i̶s̶e̶ ̶b̶u̶t̶ ̶t̶h̶e̶y̶'̶r̶e̶ ̶p̶o̶u̶n̶d̶ ̶f̶o̶o̶l̶i̶s̶h̶.̶ Chasing pennies with dollars

pjmlp 3 days ago | parent | prev [-]

That has been like that since there have been publishers in the games industry.

Back then, indie stuff only existed if you happened to live near someone you knew doing bedroom coding and distributing tapes at school, or if they got lucky and landed their game on one of those shareware tape collections.

Trying to actually get a publisher deal was really painful, and if you did, they really wanted their money back in sales.

versteegen 3 days ago | parent [-]

Shareware tapes collection? Was there really such a thing? If so I would imagine it would be one or two demos per tape?

pjmlp 3 days ago | parent [-]

Yes there was such a thing, for those of us that lived through the 1980s.

There are tons of games that you can fit onto 60-, 90-, or 180-minute tapes when 48 KB/128 KB is all you've got.

More like 20 per tape, or something.

Magazines like Your Sinclair and Crash would have such cassette tapes:

https://archive.org/details/YourSinclair37Jan89/YourSinclair...

https://www.crashonline.org.uk/

They would be glued into the magazine with adhesive tape, and later on, to avoid them being stolen, the whole magazine plus tape would be sealed in plastic.

bakugo 3 days ago | parent | prev | next [-]

> To me that's profound. It implies that not only was management not concerned about the start up time, but none of the developers of the project ever used a profiler.

Odds are that someone did notice it during profiling and filed a ticket with the relevant team to have it fixed, which was then set to low priority because implementing the latest batch of microtransactions was more important.

I feel like this is just a natural consequence of the metrics-driven development that is so prevalent in large businesses nowadays. Management has the numbers showing them how much money they make every time they add a new microtransaction, but they don't have numbers showing them how much money they're losing due to people getting tired of waiting 15 minutes for the game to load, so the latter is simply not acknowledged as a problem.

skeaker 3 days ago | parent | prev | next [-]

iirc this bug existed from release but didn't impact the game until years later, after a sizable number of DLCs were added to the online mode, since the function only got slower with each one added. Not that it's fine that the bug stayed in that long, but you can see how it would be missed: when they had actual programmers running profilers at development time, it wouldn't have raised any red flags after completing in ten seconds or whatever.

tinyhitman 3 days ago | parent [-]

I don't know. As a developer, there would be even more reason to be curious as to why the release binary is an order of magnitude slower than what is seen in development.

p_l 2 days ago | parent [-]

At release it was "working fine, same as in dev".

It slowed down gradually as the JSON manifest of optional content grew.

mschuster91 3 days ago | parent | prev | next [-]

> It disturbs me to think that this is the kind of world we live in. Where people lack such basic curiosity. The problem wasn't that optimization was hard, (optimization can be extremely hard) it was just because nobody gave a shit and nobody was even remotely curious about bad performance. They just accepted bad performance as if that's just the way the world is.

The problem is, you don't get rewarded for curiosity, for digging down into problem heaps, or for straying out of line. To the contrary, you'll often enough get punished for not fulfilling your quota.

LarMachinarum 3 days ago | parent | prev | next [-]

> and anyone with an ounce of curiosity would be like, ".........wait a minute"

I see what you did there ;)

Ygg2 3 days ago | parent | prev | next [-]

> Another datapoint that supports your argument is the Grand Theft Auto Online (GTAO) thing a few months ago.[0] GTAO took 5-15 minutes to start up. Like you click the icon and 5-15 minutes later you're in the main menu. Everyone was complaining about it for years.

I see this is a datapoint, but not for your argument. This thing sat in the code base, didn't cause problems, and didn't affect sales of the game pre- or post-GTAO launch.

This sounds a lot like selection bias: you want to reinforce the airplanes that flew and returned, rather than those that didn't come back.

Let's say they did the opposite and focused on improving this over a feature or a level from GTA. What level or what feature that you liked could you remove to make way for investigating and fixing this issue? Because at the end of the day - time is zero-sum. Everything you do comes at the expense of everything you didn't.

pjc50 3 days ago | parent | next [-]

This is the sort of thing that, if fixed early enough in the development cycle, actually speeds up development on net, because every time someone needs to test the game they hit the delay.

(which makes it all the more strange that it wasn't fixed)

Ygg2 3 days ago | parent [-]

> This is the sort of thing that, if fixed early enough in the development cycle

Is it? It didn't become noticeable until GTA got a bunch of DLCs.

Sure someone might have spotted it. But it would take more time to spot it early, and that time is time not spent fixing bugs.

godelski 3 days ago | parent | prev [-]

I think you have the logic backwards. You are saying it didn't cause problems, right? Well that's the selection bias. You're basing your assumption on what is more easily measurable. It's "not a problem" because it got sales, right? Those are the planes that returned.

But what's much harder to measure is the number of sales you missed. Or where the downed planes were hit. You don't have the downed planes, you can't see where they were hit! You just can't have that measurement, you can only infer the data through the survivors.

  > Because at the end of the day - time is zero-sum
Time is a weird thing. It definitely isn't zero sum. There's an old saying from tradesmen "why is there always time to do things twice but never time to do things right?" Time is made. Sometimes spending less time gives you more time. And all sorts of other weird things. But don't make the classic mistake of rushing needlessly.

Time is only one part of the equation, and just like the body, the mind has stamina. Any physical trainer will tell you you're going to get hurt if you just keep working one muscle group and keep lifting just below your limit. It's silly that the idea is that we'd go from sprint to sprint. The game industry is well known for being abusive to its developers, and that's on top of a baseline for developers that isn't a great place to start from.

Ygg2 3 days ago | parent [-]

> But what's much harder to measure is the number of sales you missed. Or where the downed planes were hit. You don't have the downed planes, you can't see where they were hit! You just can't have that measurement, you can only infer the data through the survivors.

Not really. There are about 300 million gamers [1] if you exclude Androids and iPhones. How many sales units did GTA V make? 215 million[2]. It's a meteoric hit. They missed a sliver (35%) of their target audience.

You could argue that they missed the mobile market. But the biggest market - Android is a pain to develop for; the minimum spec for GTA V to have parity on phones would exclude a large part of the market (most likely), and the game itself isn't really mobile-friendly.

Ok, but we have a counterexample (pun intended): Counter-Strike. Similarly multiplayer, mostly targeting PCs, developed by Valve, and similarly excellent and popular to boot. However, it's way faster and way better optimized. So how much did it "sell", according to [3]? 70 million. 86 if you count Half-Life 1 and 2 as its single-player campaign.

I'm not sure what the deciding factor for people is, but I can say it's not performance.

> Time is a weird thing. It definitely isn't zero sum.

If you are doing thing X, you can't do another thing Y, unless you are multitasking (if you are a time traveler, beware of paradoxes). But then you are doing two things poorly, and even then, if you do X and Y, adding other tasks becomes next to impossible.

It definitely is. Tim Cain had a video[4] about how they spent man-months trying to find the cause of a weird foot-sliding bug that was barely noticeable, which they managed to solve. And at that time Diablo came out and was a massive success with foot sliding up the wazoo. So just because it bugs you doesn't mean others will notice.

> "why is there always time to do things twice but never time to do things right?"

Because you're always operating with some false assumption. You can't do it right, because "right" isn't fixed and isn't always known; nor is it specified: right for whom?

[1]https://www.pocketgamer.biz/92-of-gamers-are-exclusively-usi...

[2]https://web.archive.org/web/20250516021052/https://venturebe...

[3]https://vgsales.fandom.com/wiki/Counter-Strike

[4]https://youtu.be/gKEIE47vN9Y?t=651

godelski 2 days ago | parent [-]

  > They missed a sliver (35%) of their target audience.
Next time you're at a party go take a third of the cake and then tell everyone you just took "a sliver". See what happens...

Honestly, there's no point in trying to argue with you. Either you're trolling, you're greatly disconnected from reality, or you think I'm brain dead. No good can come from a conversation with someone that is so incorrigible.

Ygg2 2 days ago | parent [-]

> Next time you're at a party go take a third of the cake and then tell everyone you just took "a sliver". See what happens...

Fine, I'll concede it's the wrong word used. But:

> Honestly, there's no point in trying to argue with you. Either you're trolling, you're greatly disconnected from reality

Wait. I'm disconnected? Selling millions of units (Half-Life) is an amazing success, and tens of millions is a stellar success by any measure (Baldur's Gate, Call of Duty, Skyrim). But selling hundreds of millions (Minecraft, GTA V)? That's a top-10 most popular game of all time.

So according to you, one of the top 5 best-selling games in history is somehow missing a huge part of the market? You can argue a plethora of things, but you can't prop up speculation that GTA V could have done much better with "you're trolling"/"no point arguing".

And saying that optimizing the DLC JSON loader could have given them a bigger slice of the pie is implausible at best.

You're extrapolating your preferences to 6 billion people. It's like watching a designer assume everyone will notice they used soft kerning, with dark grey font color on a fishbone paper background for their website. And that they cleverly aligned the watermark with the menu elements.

Sophira 3 days ago | parent | prev [-]

Honestly the GTA5 downloader/updater itself has pretty bad configuration. I wrote a post about it on Reddit years ago along with how to fix it.

I don't know if it's still applicable or not because I haven't played it for ages, but just in case it is, here's the post: https://www.reddit.com/r/GTAV/comments/3ysv1d/pc_slow_rsc_au...

jmmv 3 days ago | parent | prev | next [-]

> Grab a fucking profiler, it is more important than your Big O

This is exactly why I wrote https://jmmv.dev/2023/09/performance-is-not-big-o.html a few years back. The focus on Big O during interviews is, I think, harmful.

godelski 3 days ago | parent | next [-]

I think you're right. Early on I did HPC and scientific computing. No one talked about Big O. Maybe that's because a lot of people were scientists, but still, there was a lot of focus on performance. Really, the way people optimized was with a profiler. You'd talk about the type of data being processed and how, and look for the right way to do things based on that; people didn't do the reductions and simplifications of Big O.

Those simplifications are harmful when you start thinking about parallel processing. There are things you might want to do that would look silly in a serial program. O(2n) can be better than O(n) because you care about the actual functions. Say you have a loop that does y[i] = f(x[i]) + g(x[i]). If f and g are heavy, then you may want to split this into two loops, y[i] += f(x[i]) and y[i] += g(x[i]), since the additions are associative (so the passes are non-blocking).

Most of the work was really about I/O. Those were almost always the bottlenecks. Your Big O won't help there. You gotta write things with awareness about where in memory it is and what kind of memory is being used. All about how you break things apart, operate in parallel, and what you can run asynchronously.

Honestly, I think a big problem these days is that we still operate as if a computer has 1 CPU and only one location for memory.

mjevans 2 days ago | parent | prev [-]

What's more harmful is probably not having a set of guilds/unions (that work together and share the same collective bargains, but also compete for members) to cut out a lot of the interview process that is annoying for ALL sides involved.

Why do they ask about Big O? Because it works as a filter. That's how bad some of the candidates are.

What would I rather they do? Pose a non-trivial but obviously business-unrelated puzzle that happens to include design flaws, and give the interviewee enough time and latitude to "fulfill the interface, but make this the best it can be".

AdieuToLogic 3 days ago | parent | prev | next [-]

> People asking for performance aren't pissed you didn't write Microsoft Word in assembly; we're pissed it takes 10 seconds to open a fucking text editor.

It could be worse I suppose...

Some versions of Microsoft Excel had a flight simulator embedded in them[0]!

:-D

0 - https://web.archive.org/web/20210326220319/https://eeggs.com...

genewitch 3 days ago | parent | prev | next [-]

is there a windows profiler that i can use on microsoft binaries to see what the hold-up is? I think 15 years ago i used valgrind and i cannot remember for what. Either way, there's a ton of stuff i want to report, if not to microsoft (they won't care), then to the internet, which might.

i've managed to track down powershell in windows terminal taking forever to fully launch down to "100% gpu usage in the powershell process", but i'd really like to know what it thinks it's doing.

also: 4 seconds to blank document in Word. the splash screen is most of that. notepad++ ~2 seconds. notepad.exe slightly over 1 second before the text content loads. Edge: 2 seconds to page render from startup. Oh and powershell loads instantly if i switch to the "old" terminal, but the old terminal doesn't have tabs, so that's a non-starter. "forever" above means 45-60 seconds.

p_l 2 days ago | parent [-]

The splash screens usually "hide" various loading steps. In Excel it's often loading and initialization of various extensions, for example.

saagarjha 3 days ago | parent | prev | next [-]

Your computer is broken. My M1 Pro launches it to user interactive in less than two seconds. And, to be clear, I launched it in a profiler. I suggest you do the same on your machine and find out why it's taking that long.

tom_ 3 days ago | parent | next [-]

Maybe it's phoning home to verify the app, or whatever it is it does? Launch times for MS Word on my 11 year old Macbook Pro, approx time to the opening dialog:

First run since last reboot: 19 seconds

Second run: 2.5 seconds

Third run after sudo purge: 7 seconds

Maybe it's an artefact of where I live, but the verify step always takes ages. First run of anything after a reboot takes an outlandish amount of time. GUI stuff is bad; Unix-type subprocess-heavy stuff is even worse. Judging by watching the Xcode debugger try to attach to my own programs, when they are in this state, this is not obviously something the program itself is doing.

godelski 3 days ago | parent [-]

I think you're right. I rarely use word and so it was definitely running "cold"

I went ahead and did another run and it was much faster, about 2 seconds, so things are definitely being cached. I did a trace on it (Instruments) and there's a lot of network activity. It took double the time after sudo purge: 2 seconds of network time where the previous run only spent 1 second. It ran a tad faster when I turned the network off, though it ended up using more CPU power.

FWIW, it looks to be only using 4 of my 8 cores, all of which are performance cores. It also looks fairly serialized, as there's no high activation on any 2 cores at the same time: I'll see one core spike, drop, and then another core spike. If I'm reading the profiler right, those belong to the same subprocess and are just handing work over to a different thread.

For comparison, I also ran ghostty and then opened vim. Ghostty uses the performance cores but with very low demand. vim runs on the efficiency cores and I don't see anything hit above 25%, and anytime there's a "spike" there are 2, appearing across 2 cores. Not to mention that ghostty is 53MB and nvim is more than an order of magnitude smaller. Compared to Word's 2.5GB...

I stand by my original statement. It's a fucking text editor and it should be nearly instantaneous. As in <1s cold start.

saagarjha 2 days ago | parent [-]

I think even 1s is generous, of course. I'm just saying it doesn't actually take 10.

tengwar2 2 days ago | parent | prev [-]

Are we talking about Word, or a text editor? They seem to be saying the latter, particularly given the comparison with vi. I consistently get about half a second to open TextEdit on an M1, and that seems to be due to the opening animation.

tekknik 2 days ago | parent | prev | next [-]

you’re asking people to care about something they do a few times a day, and further asking them to devote time to it. it’s ok if you feel this is important, but as a developer for over 15 years i don’t care if my text editor takes 10 seconds to start, as i have other things starting at the same time that take longer.

or put another way: if you care about text editor performance, or are hyper-focused on performance in all cases, you miss the point of software development

1vuio0pswjnm7 3 days ago | parent | prev [-]

"I literally timed it on my M2 Air."

I bet it opens faster on a Surface Pro

justsid 3 days ago | parent | next [-]

It does not. In fact, it crashes roughly every 4th or so startup.

1vuio0pswjnm7 3 days ago | parent [-]

Yikes. I'm glad I do not use Windows anymore

3 days ago | parent | prev | next [-]
[deleted]
godelski 3 days ago | parent | prev [-]

I mean we're talking about a fucking text editor here. A second to load is a long time even if it was on an intel i3 from 10 years ago. Because... it is a text editor... Plugins and all the fancy stuff is nice, but those can be loaded asynchronously and do not need to prevent you from jumping into a blank document.

But the god damn program is over 2GB in size... like what the fuck... There's no reason an app that I open a few times a year, that has zero plugins, and that ONLY does text editing should even be a gig.

Seriously, get some context before you act high and mighty.

I don't know how anyone can look at Word and think it is anything but the accumulation of bloat and tech debt piling up. With decades of "it's good enough" compounding and shifting the bar lower as time goes on.

versteegen 3 days ago | parent | next [-]

As a long time emacs user, all of that criticism hits uncomfortably close to home, much as I would like to diss Word...

sintax 3 days ago | parent [-]

Try doom emacs, that loads super fast.

fkyoureadthedoc 3 days ago | parent | prev | next [-]

Nobody gives a shit because apparently MS is the only company that can make a "fucking text editor" that people actually want to use.

godelski 3 days ago | parent [-]

  > people actually want to use
I think this is an incorrect assumption.

Just because people use it doesn't mean they want to use it. We're in a bubble here and most people are pretty tech illiterate. Most people don't even know there are other options.

Besides, it also misses a lot. Like how there's a lot of people that use Google Docs. Probably the only alternative an average person is aware of. But in the scientific/academic community nearly everyone uses LaTeX. They might bemoan and complain but there's a reason they're using it over Word and TeX isn't 2.5GB...

1vuio0pswjnm7 3 days ago | parent | prev [-]

In earlier times, before Google or OS X even existed, long before "automatic updates", it used to be my own experience that Microsoft's pre-installed Windows programs would run generally faster and with fewer issues (read: none) than third-party software that a Windows user would install. This was also the case with software downloaded from Microsoft that a Windows user might install. Hence I thought perhaps MS Word today might run smoother on a Microsoft laptop than on an Apple laptop. I'm not a Windows user anymore, so I have no idea.

For reading, editing, creating Word documents, the TextMaker Android app seems to work. Size of recent version I tried was 111MB, startup is quick. Paid features are non-essential IMHO.

https://www.softmaker.net/down/tm2024manual_en.pdf

A personal favourite program for me is spitbol, a SNOBOL interpreter written in a portable assembly language^3 called MINIMAL. I'm using a 779k static binary. SNOBOL is referenced here:

1. https://borretti.me/article/you-can-choose-tools-that-make-y...

The Almquist shell is another favourite of mine. It's both the "UI" and the language I use every day to get stuff done on the computer. Just as Microsoft Word makes some HN commenters unhappy, it seems that the UNIX shell makes some "developers" unhappy.^2

But the shell is everywhere and it is not going away.

IME, old software from an era before so-called "tech" companies funded by VC and "ad services", still works and I like it. For example, sed is still the favourite editor for me despite its limitations, and it is close to the shell in terms of ubiquity. Following current trends requires devoting more and more storage, memory and CPU in order to wait for today's programs to compile, start, or "update". As a hobbyist who generally ignores the trends I am experiencing no such resource drains and delays.

For every rare software user complaining about bloat, there is at the same time a "developer", e.g., maybe one writing a Javascript engine for a massive web browser controlled by an advertising company, who is complaining about the UNIX shell.

Developers like to frame software as a popularity contest. The most popular software is declared to have "won". (Not sure what that means about all the other software. Maybe not all software authors are interested in this contest.) To me, ubiquity is more important than "popularity":

2. https://borretti.me//article/shells-are-two-things

"There are 5,635 shell scripts on my humble Ubuntu box."

This makes me happy.

On Linux, I use vim 4.6 from 1997, a 541k static-pie binary. I use ired, bvi and toybox hexedit as hex editors, 62k, 324k and 779k static-pie binaries, respectively. If I dislike something I can change it in the source code. If I find other software I like better I can switch. No closed source and proprietary file formats like MS Word. The most entertaining aspect of the cult of Microsoft is that the company is so protective of software that it tells the public is "obsolete" or otherwise not worth using anymore, old versions of software or minimal versions that have few "features".

https://ftp.nluug.nl/pub/vim/unix/vim-4.6.tar.gz

https://codeload.github.com/radare/ired/zip/refs/heads/maste...

https://codeload.github.com/johnsonjh/bvi-lf/zip/refs/heads/...

https://www.landley.net/toybox/downloads/toybox-0.8.9.tar.gz

https://www.landley.net/toybox/downloads/binaries/latest/toy...

3. The topic of this thread is assembly language. Makes me happy

The appeal of smaller, faster software to me is not that this stuff is so _good_. It is that the alternatives, software like MS Word, are so _bad_.

1vuio0pswjnm7 3 days ago | parent [-]

The character editor TECO-C is another program I like that was originally written in assembly. This is a 153k static-pie binary ("non-video")

https://codeload.github.com/blakemcbride/tecoc/zip/refs/head...

Also forgot to mention that TextMaker on Android contains networking code and will try to connect to the internet. This can be blocked with the Netguard app, GrapheneOS, netfilter/pf/etc. on a gateway controlled by the user, or whatever other solution for blocking connections the user prefers.