dominicrose 2 days ago
I've used Clojure/ClojureScript in the past, and it's good that they enforce working with immutable data. Immutable.js has similar data structures, but it isn't the standard way of doing things in JS and is uglier to debug. With plain objects, immutability isn't enforced in JS, and sprinkling in a few Object.freeze calls won't change that (freeze is shallow anyway), so we lose half the benefits Clojure would bring: parallel/concurrent programming, easier deep-equality checks, performance...

If the code isn't performance-sensitive and can run in a single thread, simply "not mutating some of the mutable data" is a start for someone interested in immutability. That's what Ramda does: it doesn't invent new data structures or freeze objects, it simply returns new ones. Only a few Ramda functions are still genuinely useful in 2025, since JS has gained things like array/object destructuring and the spread operator ("..."), but it's an inspiring library either way.
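For what it's worth, a minimal sketch of both points (freeze being shallow, and the Ramda-style "return a new object instead of mutating" done with plain spread). The names here are made up for illustration:

    const user = Object.freeze({ name: "Ada", prefs: { theme: "dark" } });

    // user.name = "Bob";       // no effect (throws in strict mode): the top level is frozen
    user.prefs.theme = "light"; // still allowed: Object.freeze is shallow

    // Ramda-style update, returning a new object and leaving the input alone:
    const withTheme = (u, theme) => ({ ...u, prefs: { ...u.prefs, theme } });

    const updated = withTheme(user, "solarized");
    console.log(user.prefs.theme);    // "light"  (withTheme didn't touch the original)
    console.log(updated.prefs.theme); // "solarized"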
css_apologist 2 days ago
I've used these things. IME, nowadays it's not worth the bundle size to bring in these libraries. map/filter seems quite optimized, and {...obj} is fast since it's only a shallow copy. Yes, it does come down to practices vs. better libraries, but in practice it's fine, IME.

One thing to note: I was very excited that we now have a bunch of lazy methods on the Iterator protocol, but they were slow as shit as of earlier this year. I wrote a parser a year ago with extreme use of .map (every step of processing cloned the tokens), and I thought, let's migrate it to lazy iterators; it got ~10x slower :(. The compile time was fine for my use cases, so I didn't give Immutable.js a try, but it was a surprise.

I did some benchmarks of map+filter vs. a mutable for..of + push, and on Firefox, up to 200-300 elements, map+filter is actually faster (unfortunately not on Chrome). Of course it's not the best, but my experience is that modern JS engines are fast enough for most use cases that you don't need to bring in any libraries anymore.
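In case it helps, here's roughly the shape of what I was comparing. Treat it as a sketch, not a benchmark harness: the numbers vary by engine and input size, and the iterator-helper variant needs a recent engine (it's only just shipped everywhere):

    const data = Array.from({ length: 300 }, (_, i) => i);

    // 1) map + filter (allocates an intermediate array)
    const a = data.map(x => x * 2).filter(x => x % 3 === 0);

    // 2) mutable for..of + push
    const b = [];
    for (const x of data) {
      const y = x * 2;
      if (y % 3 === 0) b.push(y);
    }

    // 3) lazy iterator helpers (Iterator.prototype.map/filter)
    const c = [...data.values().map(x => x * 2).filter(x => x % 3 === 0)];

    console.log(a.length === b.length && b.length === c.length); // true

Wrap each variant in console.time/console.timeEnd (or performance.now) over many iterations if you want to reproduce the comparison; the relative results differed between Firefox and Chrome for me.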