wat10000 | 21 hours ago
Programs can be very close to 100% reliable when made well. In my life, I've never seen `sort` produce output that wasn't properly sorted. I've never seen a calculator come up with the wrong answer when adding two numbers. I have seen filesystems fail to produce the exact same data that was previously written, but that happens once in a blue moon, and the process is done probably millions of times a day on my computers. There are bugs, but bugs can be reduced to a very low level with time, effort, and motivation. And technically, most bugs are predictable in theory; they just aren't known ahead of time. There are hardware issues, but those are usually extremely rare. Nothing is 100% predictable, but software can get to a point that's almost indistinguishable.
stavros | 20 hours ago | parent | next
> Programs can be very close to 100% reliable when made well.

This is a tautology.

> I've never seen a calculator come up with the wrong answer when adding two numbers.

> And technically, most bugs are predictable in theory, they just aren't known ahead of time.

When we're talking about reliability, it doesn't matter whether a thing can be reliable in theory; it matters whether it's reliable in practice. Software is unreliable, humans are unreliable, LLMs are unreliable. To claim otherwise is just wishful thinking.
mrguyorama | 15 hours ago | parent | prev
> I've never seen a calculator come up with the wrong answer when adding two numbers.

Intel once made a CPU that got some math very slightly wrong, in ways that probably would not affect the vast majority of users. The backlash from the industry was so strong that Intel spent half a billion (1994) dollars replacing all of them.

Our entire industry avoids floating-point numbers for some types of calculations because, even though they are mostly deterministic with minimal constraints, that mental model is so hard to manage that you are better off avoiding it entirely and removing an entire class of errors from your work.

But now we are just supposed to do everything with a slot machine that WILL randomly do the wrong thing some unknowable percentage of the time, and that wrong thing follows no logic? No, fuck that. I don't even call myself an engineer, and such frivolity is still beyond the pale. I didn't take four years of college and ten years of hard-earned experience to build systems that will randomly fuck people over with no explanation or rhyme or reason.

I DO use systems that are probabilistic in nature, but we use rather simple versions of those, because when I tell management "We can't explain why the model got that output", they rightly refuse to accept that answer. Some percentage of orders getting mispredicted is fine. Orders getting mispredicted in ways that cannot be explained entirely from their data is NOT. When a customer calls us, we cannot tell them "Oh, that's just how neural networks are; you were unlucky."

Notably, those in the industry that HAVE jumped on the neural-net/"AI" bandwagon for this exact problem domain have not demonstrated anything close to seriously better results. In fact, one of our most DRAMATICALLY effective signals is a third-party service that has been around for decades, and we were using a legacy integration that hadn't been updated in a decade.
Meanwhile, Google's equivalent product/service couldn't even match the results of internally developed random-forest models from data science teams that were... not good. It didn't even match the service Microsoft recently killed, which was similarly braggadocious about "AI" and similarly trash. All that panopticon's worth of data, all that computing power, all that supposed talent, all that lack of privacy and tracking, and it was almost as bad as a coin flip.
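(The floating-point pitfall mentioned a few paragraphs up is easy to demonstrate. A minimal Python sketch, not from the original thread, showing why monetary code typically uses exact decimal types instead of binary floats:)

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so adding
# "ten cents" ten times drifts away from the intended value.
total = sum(0.1 for _ in range(10))
print(total == 1.0)  # False
print(total)         # 0.9999999999999999

# Decimal keeps base-10 quantities exact, removing this entire
# class of errors for money-like calculations.
dtotal = sum(Decimal("0.1") for _ in range(10))
print(dtotal == Decimal("1.0"))  # True
```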