| ▲ | xg15 6 days ago |
| I'm starting to wonder if some of those bad design decisions are symptoms of a larger "cultural bias" at Google. Specifically the "No Compositionality" point: it reminds me of similar bad designs in Go, CSS and the web platform at large. The pattern seems to be that generalized, user-composable solutions are discouraged in favor of a myriad of special constructs that satisfy whatever concrete use cases seem relevant to the designers at the moment. This works for a while and reduces the complexity of the language upfront while delivering results - but over time, the designs devolve into a rat's nest of hyperspecific design features with awkward and unintuitive restrictions. Eventually, the designers might give up and add more general constructs to the language - but those feel tacked on and have to coexist with the specific features, which can't be removed anymore. |
|
| ▲ | senorrib 6 days ago | parent | next [-] |
| It works both ways: general constructs tend to become overly abstract, and you end up with sneaky errors in faraway places after a minor change to an abstraction. In the end, this is just a matter of preference. Good software engineering requires, first and foremost, great discipline, regardless of the path or tool you choose. |
| ▲ | gettingoverit 6 days ago | parent [-] |
| If there are errors in the implementation of general constructs, they tend to be visible at every use and get fixed rapidly. Some general constructs are better than others because they have an algebraic theory behind them - sometimes a theory that has already been studied for a few hundred years. For example, the product/coproduct types mentioned in the article are quite close to the addition and multiplication we all learned in school, and obey the same laws. So there are several levels on which the choice of ad-hoc constructs is wrong, and in the end the only valid reason to choose them is time constraints. If they had 24 years to figure out how to do it properly but didn't, the technology is just dead. |
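To make the arithmetic analogy concrete, here is a minimal TypeScript sketch (illustrative only; the type names are made up, not from the comment). Counting possible values, a union behaves like addition and a tuple like multiplication, and the familiar identity laws carry over:

    // Sum (coproduct) and product types, counted by number of inhabitants.
    type Bit = 0 | 1;            // 2 possible values
    type Tri = "a" | "b" | "c";  // 3 possible values

    type Sum  = Bit | Tri;       // union: 2 + 3 = 5 possible values
    type Prod = [Bit, Tri];      // tuple: 2 * 3 = 6 possible values

    // x + 0 = x: a union with `never` (the empty type) is unchanged.
    type PlusZero = Bit | never; // collapses to just Bit
    // x * 1 = x: pairing with a one-value type adds no information.
    type TimesOne = [Bit, null]; // isomorphic to Bit

    const s1: Sum  = 1;          // enters the sum via the Bit side
    const s2: Sum  = "b";        // enters the sum via the Tri side
    const p:  Prod = [0, "c"];   // one value from each component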
| ▲ | sdenton4 6 days ago | parent [-] |
| Hm, that's idealistic... I've certainly run into cases where small changes in general systems led to hard-to-detect bugs that took a great deal of investigation to figure out. Not all failures are catastrophic. The technology is quite alive, which is why it hasn't been "fixed" - changing the wheels on a moving car, and all that. The actual disappointment is that a better alternative hasn't taken off in the six years since this post was written... If it's so easy, where are the alternatives? |
| ▲ | gettingoverit 2 days ago | parent [-] |
| That's not idealistic - that's how arithmetic works. The more often you use the same generic thing, the higher the chance of discovering that it's broken. The fact that you've run into such cases just means the chance is never zero, which is irrelevant to the discussion. As the article already mentions, PB solves a problem that likely only Google has, if even that. The state of the art nowadays is JSON/JSONL; if it grows too large, gzip it. When someone uses third-party closed proprietary technologies to be "not like the rest", it usually doesn't work out well for their business. The technology is "alive" only until it follows the path of Closure, GWT, and the rest of the "we use it on the most loaded page in the world" technologies. PB will be in the same graveyard soon. |
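A sketch of that suggested alternative (assumptions: Node built-ins only; the filename and records are made up for illustration). Each record is serialized as one JSON object per line, and the stream is gzipped on its way to disk:

    import { createWriteStream } from "node:fs";
    import { createGzip } from "node:zlib";

    // Hypothetical records; in practice these would be your messages.
    const records = [
      { id: 1, kind: "click" },
      { id: 2, kind: "view" },
    ];

    const gzip = createGzip();
    gzip.pipe(createWriteStream("events.jsonl.gz")); // assumed filename

    for (const r of records) {
      gzip.write(JSON.stringify(r) + "\n"); // JSONL: one object per line
    }
    gzip.end();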
|
| ▲ | lelanthran 5 days ago | parent | prev [-] |
| > This works for a while and reduces the complexity of the language upfront while delivering results - but over time, the designs devolve into a rat's nest of hyperspecific design features with awkward and unintuitive restrictions. That's true of almost anything, though. |