| ▲ | bestham 13 hours ago |
| “A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.”
Gall’s Law |
|
| ▲ | jandrewrogers 2 hours ago | parent | next [-] |
| Some systems require a total commitment to the complexity because it is intrinsic. There is no "simple" form that also works, even if poorly. In many contexts, "systems thinking" is explicitly about the design of systems that are not reducible to simpler subsystems, which does come up in many types of engineering. Sometimes you have to eat the whole elephant. There is a related phenomenon in some types of software where the cost of building an operational prototype asymptotically converges on the cost of just writing the production code. (This is always a fun one to explain to managers who think building a prototype massively reduces delivery risk.) |
| |
| ▲ | pardon_me 28 minutes ago | parent [-] | | This is the point we are at now with wide-scale societal technologies: the need for network effects combines with the product being the prototype, leaving no option but to work on the system live. Some projects have been forced this far by diverting resources (either public funding or not-yet-profitable VC money), but these efforts have not proven self-sustaining. Humans will be perpetually stuck where we are as a species if we cannot integrate the currently opposing ideas of up-front planning vs. move fast and break things. Society is slowly realizing the step change in difficulty between projects in controlled conditions, which can have simplified models, and these irreducibly complex systems. Western doctors face an interesting parallel, becoming more aware that human beings must be treated the same way: we emerge as a result of parts that can individually be simplified and understood, but which could never describe the overall system behavior. We are good examples of the intrinsic fault tolerance required for such systems to remain stable. |
|
|
| ▲ | nasretdinov 10 hours ago | parent | prev | next [-] |
| I think the important part here is "from scratch". Typically, when you're designing a new (second, third, whatever) system to replace the old one, you take the good and the bad parts of the previous design into account, so it's no longer from scratch. That's what allows it to succeed (at least in my experience it usually does). |
| |
| ▲ | bluGill 4 hours ago | parent [-] | | These days, most software has been done before. You should be able to find others who have done similar things and learn lessons from them. Considering microservices? There are lots of people who have done them and can tell you what worked well and what didn't. Considering using Qt? Lots of others have, and they can give you ideas. Considering writing your own framework? There are lots of others: look at what they do well and badly. If you are doing a CRUD web app for a local small business, there are thousands of examples. If you are writing control software for a space station, you may not have access to code from NASA/Russia/China, but you can at least look at generic software that does the things you need and learn some lessons. |
|
|
| ▲ | codeflo 10 hours ago | parent | prev | next [-] |
| This is often quoted, but I wonder whether it's actually strictly true, at least if you keep to a reasonable definition of "works". It's certainly not true in mechanical engineering. |
| |
| ▲ | bestham 9 hours ago | parent | next [-] | | The definition of a complex system is the qualifier for the quote. Many systems that are designed, implemented, and found working are not complex systems; they may be complicated systems. To paraphrase Dr. Richard I. Cook's "How Complex Systems Fail": complex systems are inherently hazardous, operate near the edge of failure, and cannot be understood by analyzing individual components. These systems are not just complicated (like a machine with fixed parts) but dynamic, constantly evolving, and prone to multiple, coincidental failures. A system of interacting services, where many of them depend on each other in informal ways, may be a complex system, especially if humans are also involved. Such a system is not something you design; you just happen to find yourself in it. Like the road to hell, the road to a complex system is paved with good intentions. | | |
| ▲ | codeflo 9 hours ago | parent [-] | | Then what precisely is the definition of complex? If "complex" just means "not designed", then the original quote that complex systems can't be designed is true but circular. If the definition of "complex" is instead something more like "a system of services that interact", "prone to multiple, coincidental failures", then I don't think it's impossible to design them. It's just very hard. Manufacturing lines would be examples; they are certainly designed. | | |
| ▲ | marcosdumay an hour ago | parent | next [-] | | A complex system is one that has chaotic behavior. (And no, this is not "my" definition, it's how it's defined in the systems-related disciplines.) | |
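The chaotic-behavior definition can be illustrated with a standard textbook example (a sketch added here, not from the thread): the logistic map, whose trajectories show the sensitive dependence on initial conditions that characterizes chaotic systems.

```python
# Logistic map x_{n+1} = r * x * (1 - x) in the chaotic regime (r = 4).
# Two trajectories that start almost identically end up far apart: this
# "sensitive dependence on initial conditions" is the hallmark of chaos.

def logistic(x, r=4.0, steps=30):
    """Iterate the logistic map `steps` times from initial value x."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic(0.2)
b = logistic(0.2 + 1e-10)  # perturb the start by one part in ten billion
print(abs(a - b))          # the gap is no longer tiny after 30 iterations
```

With a perturbation of one part in ten billion, the two trajectories become macroscopically different after a few dozen iterations, which is one way to see why such a system's behavior cannot be predicted by analyzing its pieces in isolation.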
| ▲ | estearum 6 hours ago | parent | prev [-] | | The manufacturing lines would be designed, and they'd be designed in an attempt to affect the "design" of the ultimate resulting supply chain they're a part of. But the relationship between the design of some lines and the behavior of the larger supply chain is non-linear, hard to predict, and ultimately undesigned, and therefore complex. The design of the manufacturing lines and the resulting supply chain are not independent of each other: you can trace features from one to the other, but you cannot take apart the supply chain, analyze the designs of its constituent manufacturing lines, and actually predict the behavior of the larger system. AFAIK there's not a great definition of a complex system, just a set of traits that tend to indicate you're looking at one: non-linearity, feedbacks, lack of predictability, resistance to analysis (the "you can't take it apart to reason about the whole" characteristic mentioned above). All of these traits are kind of the same thing... they tend to come bundled with one another. |
|
| |
| ▲ | narag 7 hours ago | parent | prev | next [-] | | IMHO, the key is where you add complexity. In software you have different abstraction layers. If you make a layer too fat, it becomes unwieldy. A simple system evolves well if you add the complexity in the right layer, avoiding making a layer responsible for tasks outside its scope. It still "works" if you don't, but it becomes increasingly difficult to maintain. The law is maybe a little too simplistic in its formulation, but it's fundamentally true. | |
| ▲ | direwolf20 7 hours ago | parent | prev | next [-] | | You built this gear using the knowledge from your last gear. You didn't start with no knowledge, read a manual on operating a lathe, grab a hunk of metal and make a perfect gear the first time. | |
| ▲ | laserlight 7 hours ago | parent | prev [-] | | > It's certainly not true in mechanical engineering. Care to exemplify? |
|
|
| ▲ | jackblemming 10 hours ago | parent | prev | next [-] |
| People misinterpret this and think they can incrementally build a skyscraper out of a shed. |
| |
| ▲ | sph 6 hours ago | parent | next [-] | | That's exactly why software is so bad. No one ever knows their shed would ultimately have to become a skyscraper, and management doesn't allocate any budget to lay stronger foundations when expectations change; you make do with what you have. See also: "there is nothing more permanent than a temporary solution" | |
| ▲ | saulpw 7 hours ago | parent | prev | next [-] | | That's what happened though? First humans built sheds, then we built 2-story buildings, then taller and taller, until we built skyscrapers. Obviously it wasn't a single structure, but we did have to evolve our thinking on how to build things, we didn't just start building a skyscraper before we built a shed. | | |
| ▲ | bluGill 4 hours ago | parent | next [-] | | You can't do that. A small bike shed often just means putting some concrete blocks on the ground and building on top of them with wood. A proper house needs a stronger foundation at a higher cost (sheds larger than a bike shed are built the same way), but is still made of wood. A skyscraper is built on a very different foundation and needs a steel frame that would not be affordable in a house. In between the two there are also buildings made of brick, which allow building taller than wood. (And there are lots of other options with different costs: engineered wood is different.) The point, though, is that eventually a system runs out of ability. It works differently in programming than in physical construction, but the concept is the same: eventually you can't make a bad early design work anymore. | |
| ▲ | GuinansEyebrows 14 minutes ago | parent | prev | next [-] | | to put it another way than the other replies: you will get 100x more pushback on an arguably necessary ground-up rewrite than on "just add this new feature to the existing codebase", even when you (as an engineer) know full well why "just adding a feature" is probably a bad idea. | |
| ▲ | speed_spread 7 hours ago | parent | prev [-] | | But you didn't upgrade the shed into a skyscraper. The iterative process you describe involves a human respecifying from scratch, using the knowledge developed building the previous instance and seeing its limitations first hand. That part can't be automated; no LLM is going to challenge your design assumptions by itself. Hence people pushing agent-built projects way past what their inherent architecture should support, delivering unmaintainable spaghetti code. |
| |
| ▲ | narag 7 hours ago | parent | prev | next [-] | | You can't physically, but the logic is the same: you need beams, foundations, walls, and roofs, with strengths adjusted for scale. Software mindset :-) In this sense, web applications haven't changed much in the last twenty years: client, server, database... | |
| ▲ | cpursley 8 hours ago | parent | prev [-] | | It's actually the opposite: you can. The feeling I'm getting reading anti-AI sentiment is that people expect one-shot results out of limited context. | | |
| ▲ | WJW 8 hours ago | parent [-] | | I'm pretty sure that you can't gradually upgrade a shed into a skyscraper unless you pour a skyscraper-ready foundation before even starting on the shed. But if you're doing that, why start with a shed and not with a skyscraper? Not sure why you're trying to bring AI development into this. |
|
|
|
| ▲ | McGlockenshire 12 hours ago | parent | prev | next [-] |
| Ah, the Second System Effect, and the lesson learned from it. |
| |
| ▲ | jeffreygoesto 12 hours ago | parent [-] | | But this is about the first systems? I tend to tell people the fourth try usually sticks. The first is too ambitious and ends in an unmaintainable pile around a good core idea. The second tries to "get everything right" and suffers second-system syndrome. The third gets it right, but now only for a bunch of central business needs; you learned, after all. It is good exactly because it does not try to get _everything_ right like the second did. The fourth patches up some more features to scoop up B and C prios and calls it a day. Sometimes (often in BigCorp): creators move on and it slowly deteriorates from lack of maintenance... |
|
|
| ▲ | YZF 13 hours ago | parent | prev | next [-] |
| So true. |
|
| ▲ | smitty1e 8 hours ago | parent | prev [-] |
| Came here to add this very comment. https://en.wikipedia.org/wiki/John_Gall_(author)#Gall's_law |