zelphirkalt 2 days ago
> Mathematicians do care about how much "black magic" they're invoking, and like to use simple constructions where possible (the field of reverse mathematics makes this its central object of study). For example, Wiles' initial proof of Fermat's last theorem used quite exotic machinery called "inaccessible cardinals", which lie outside of ZFC. Subsequent work showed they weren't needed.

In a way, mathematicians can afford to do this more readily than people in software development: if something is actually proven, you can rely on it completely. With software, not so much. Or rather: software is usually not proven correct, because doing so is usually expensive.

In mathematics, you don't have to consider the runtime of an algorithm when you "merely" need to prove its correctness. The time it needs to run is irrelevant to its correctness. So mathematicians can stack and stack and stack, and provided that each piece is proven correct, there are no negative consequences. Well, almost: there is the negative consequence that another human being who wants to understand a proof may first need to know many concepts and other proofs. But that is probably the only reason to pursue simplicity in mathematics.
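To make that separation concrete, here is a minimal Python sketch (my own toy illustration, not from the thread): a naive recursive Fibonacci that is easy to prove correct by induction, yet takes exponential time. The correctness argument never mentions the runtime.

    def fib(n: int) -> int:
        """Return the n-th Fibonacci number.

        Correct by induction on n: the base cases match the definition,
        and the recursive case is exactly the defining recurrence
        F(n) = F(n-1) + F(n-2). The runtime is exponential (roughly
        phi^n calls), but that has no bearing on the proof of correctness.
        """
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)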
almostgotcaught a day ago | parent
> The time it needs to run is irrelevant for its correctness. And so they can stack and stack and stack

This is a very naive take: the direct analogue of the thing you're saying doesn't happen does in fact happen in analysis all the time. There are many inequalities that can be "stacked" to prove a bound on something, but whose constant factors are too large, so you cannot just stack them if you need a fixed bound for your proof to go through. Unsurprisingly, this is exactly how actual runtime analysis works as well (unsurprising because both are literally math).
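A standard toy calculation showing the blow-up (my own illustration, not from the thread): chained bounds multiply their constants.

    If $|f(x)| \le C_1\,|g(x)|$ and $|g(x)| \le C_2\,|h(x)|$, then stacking the
    two gives $|f(x)| \le C_1 C_2\,|h(x)|$. After $n$ such steps the constant is
    $\prod_{i=1}^{n} C_i$; with each $C_i = 2$, that is $2^n$, which is useless
    if the proof needs a fixed bound like $|f(x)| \le 10\,|h(x)|$.

Each inequality in the chain is individually correct, but the stacked result can still be too weak to use, which is the point being made about stacking.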