exe34 | 6 hours ago
Some of us think in those terms and daily have to fight those who want 20 different objects, each 5-10 levels deep in inheritance, to achieve the same thing. I wouldn't go as far as 100 functions over one data structure, but in Python, for example, I prefer a few data structures like dictionaries and arrays, with 10-30 top-level functions that operate over them. If your requirements are fixed, it's easy to go nuts and design all kinds of object hierarchies; but if your requirements change a lot, I find it much easier to stay close to the original structure of the data that lives in the many files, and to operate on those structures.
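A minimal Python sketch of that style, with plain dicts in a list and a handful of top-level functions over them (the record shape and function names here are illustrative, not from any particular codebase):

```python
# Plain data: a list of dicts, mirroring the rows in the source files.
records = [
    {"name": "widget", "price": 3.5, "qty": 10},
    {"name": "gadget", "price": 7.0, "qty": 4},
]

# Top-level functions that operate directly on that structure.
def total_value(records):
    """Sum of price * qty over all records."""
    return sum(r["price"] * r["qty"] for r in records)

def filter_by_min_qty(records, min_qty):
    """Keep only records with at least min_qty on hand."""
    return [r for r in records if r["qty"] >= min_qty]

print(total_value(records))          # → 63.0
print(filter_by_min_qty(records, 5))
```

When requirements change, you add or adjust a top-level function; the data keeps the shape it had in the files.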
TuringTest | 2 hours ago | parent
Seeing that diamond metaphor, and then learning how APL treats "operators" as building "functions that are variants of other functions" (1), made me think of currying and higher-order functions in Haskell. The high regularity of APL operators, which work the same way for all functions, forces the developer to represent business logic in different parts of the data structure. That was a good approach when APL was created, but modern functional programming offers other tools. Building pipelines from functors, monads, arrows... allows the programmer to move some of that business logic back into generic functions, retaining the generality and ease of refactoring without forcing the structure of the data to carry meaning. Modern PL design has built on those early insights to provide new tools for the same goal.

(1) https://secwww.jhuapl.edu/techdigest/content/techdigest/pdf/...
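A rough Python analogue of that pipeline idea, using plain higher-order functions (the `compose` helper is hypothetical, and Haskell's functors/monads carry much more structure than this sketch):

```python
from functools import reduce

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

# Generic pipeline stages: the business logic lives in small reusable
# functions, not in the shape of the data they pass along.
strip = str.strip
lower = str.lower
words = str.split

normalize = compose(words, lower, strip)
print(normalize("  Modern PL Design  "))  # → ['modern', 'pl', 'design']
```

Refactoring then means rearranging or swapping stages in the pipeline, while the data itself stays a plain string or list.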