dfabulich · 3 days ago
No, seriously, you literally never need to understand the operations as algebra. You just need to know how to use your language's type system. None of the popular languages call them sum types, product types, or quotient types in their documentation. In TypeScript, it's called a "union type," and it uses the | operator. In Python, `from typing import Union`. In Kotlin, use sealed interfaces. In Rust, Swift, and PHP, they're enums.

There are subtle differences between each language. If you switch between languages frequently, you'll need to get used to the idiosyncrasies of each one. All of them can implement the pattern described in the OP article.

Knowing the language of algebraic types doesn't make it easier to switch from one popular language to another; in fact, it makes it harder, because instead of translating between Python and TypeScript or Python and Rust, you'll be translating from Python to the language of ADTs (sum types, tagged unions, untagged unions) and then from the language of ADTs to TypeScript. People screw up translating from TypeScript to ADTs constantly, and then PL nerds argue about whether the translation was accurate. ("Um, actually, that's technically a tagged union, not a sum type.")

The "competitor" is to simply not use the language of algebraic data types when discussing working code. That competitor has won conclusively in the marketplace of ideas. ADT lingo has had decades to get popular, and it never will. ADT lingo expresses no deeper truth. It's just another language, one with no working compiler, no automated tests, and no running code. Use it if you enjoy it for its own sake, but you've gotta accept that this is a weird hobby (just like designing programming languages) and it's never going to get popular.
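For concreteness, here's a minimal sketch of that pattern in TypeScript (assuming the OP article's pattern is the usual discriminated union; Shape and area are made-up names, not from the article):

    // A "union type" built with the | operator. The `kind` field is
    // what PL people would call the tag or discriminant.
    type Shape =
      | { kind: "circle"; radius: number }
      | { kind: "rect"; width: number; height: number };

    function area(s: Shape): number {
      switch (s.kind) {
        case "circle":
          return Math.PI * s.radius ** 2;
        case "rect":
          return s.width * s.height;
      }
    }

You can write the same thing with Kotlin's sealed interfaces or Rust's enums without ever saying "sum type."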
dkarl · 3 days ago
If you're so confident that algebraic data types will never become popular, I don't understand why you feel it's so important to convince a few nerds not to talk about them. I don't think anyone here is advocating making the topic compulsory. There will always be some people who are interested in theory and others who aren't.
WorldMaker · 2 days ago
> None of the popular languages call them sum types, product types, or quotient types in their documentation.

It seems like you may want to spend a bit more time with the ML family of languages. You can argue about what counts as "popular," but OCaml and F# routinely show up in GitHub and Stack Overflow statistics. The ML family represents tuples as products directly (`A * B`), and sometimes uses plus for sum types too (though yes, `|` is more popular even within the ML family), and their documentation does call them sum types.

> Knowing the language of algebraic types doesn't make it easier to switch between one popular language to another

The point of ADTs is not to build a universal type system but to generalize types into set thinking. That does add some universals, namely that a sum type should act like a sum type and a product type should act like a product type, but not all types in a type system are just sums or just products. ADTs don't say anything about the individual types in a type system, only that there are standard ways to combine them: a couple of reusable operators to "math" types together.

> ADT lingo expresses no deeper truth.

Haskell has been exploring the "deeper truths" of higher-level type math for a long while now. GADTs, for instance, explore what happens when you apply Category Theory back on top of algebraic data types: because you have a + monoid and a * monoid, you can ask what sorts of Monads those in turn describe. Admittedly, a lot of those explorations still feel more academic than practical (though I've seen some very practical uses of GADTs), but just because the insights mostly benefit "the Ivory Tower" and "Arch Haskell Wizards" for now doesn't mean they don't exist.

The general trend of this sort of thing is that first academia explores it, then compilers start to use it under the hood, then the compiler writers find practical versions of it to include inside their own languages. That trend is already in motion here. And just because it's a relatively small audience building compilers doesn't mean we don't all benefit from the things they explore, learn, abstract, and generalize. We might not care to know the full "math" of it, but we still get practical benefits from using that math. If a compiler author does it right, you don't need to know mathematical details and jargon like "what is a product type"; you just use the fruits of that math, like tuples. The math improves the tools, and if the language is good to you, the math provides you with more tools and more freedom to generalize your own algorithms beyond simple types and type constructs, whether or not you care to learn the math. (That said, it does seem like a good idea to learn the type math here. ADTs are less confusing than they sound; the sketch below shows the whole idea in a few lines.)
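A minimal sketch of why the names fit, in TypeScript terms since that's the language upthread (the type names are illustrative): a tuple is a product type because the counts of possible values multiply, and a union is a sum type because they add.

    // Product: a tuple of A and B has |A| * |B| possible values.
    type Flag = boolean;          // 2 possible values
    type Pair = [Flag, Flag];     // 2 * 2 = 4 possible values

    // Sum: a union of disjoint A and B has |A| + |B| possible values.
    type Tri = Flag | null;       // 2 + 1 = 3 possible values

That counting argument is the entire "algebra"; everything else is vocabulary on top of it.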