| ▲ | MrBuddyCasino 6 days ago |
Simpler algorithms are usually faster for small N, and N is usually small. The Big O assumption is a fantasy world where N is always large and the constant factors that slow an algorithm down can be ignored.
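A quick way to test the constant-factor claim rather than argue about it is to time something dead simple against something asymptotically better at a few sizes. The sketch below is my own illustration, not from the thread: the sizes, the iteration count, and the list-vs-set comparison are arbitrary choices, and where (or whether) the crossover happens depends on the language, runtime, and data. It just measures both so you can check on your machine instead of guessing.

```python
# Hypothetical micro-benchmark: O(n) linear scan vs. O(1)-average hash lookup.
# For tiny n the scan's trivial constant factor can keep it competitive.
import timeit

for n in (4, 16, 10_000):
    items = list(range(n))
    as_set = set(items)
    needle = n - 1                      # worst case for the linear scan

    scan = timeit.timeit(lambda: needle in items, number=100_000)
    lookup = timeit.timeit(lambda: needle in as_set, number=100_000)
    print(f"n={n:>6}  list scan: {scan:.4f}s  set lookup: {lookup:.4f}s")
```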
|
| ▲ | nottorp 6 days ago | parent | next [-] |
https://news.ycombinator.com/item?id=26296339 Small N, right?
|
| ▲ | MattPalmer1086 6 days ago | parent | prev | next [-] |
Yeah, constant factors are not represented and often make a big real-world difference. Good point.
|
| ▲ | coldtea 6 days ago | parent | prev [-] |
> Simpler algorithms are usually faster for small N, and N is usually small.

This mentality is how we ended up with CPU- and memory-hogging Electron apps...
| ▲ | JohnKemeny 6 days ago | parent | next [-] |
That's not an accurate description. For example, it is common in sorting algorithms to use an n² algorithm like bubble sort when the list to sort is small (e.g. < 50) and an n log n algorithm like merge sort when it is larger. The issue is that merge sort uses recursion, which comes with some extra cost, so an n² algorithm beats an n log n algorithm provided n is small. It has nothing to do with your criticism.
| ▲ | Al-Khwarizmi 5 days ago | parent [-] |
You can (and probably should) add a threshold to recursive algorithms like mergesort so that they don't end up making recursive calls on very small arrays. For arrays smaller than the threshold, insertion sort is faster than bubble sort. And if you really don't want recursion for large arrays to begin with, you have heapsort or Shell sort. I don't think there is any practical case where using bubble sort makes sense, except "efficiency doesn't matter for my use case and this is the code I already have".
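A minimal sketch of that threshold idea, assuming plain Python lists and an arbitrary cutoff of 32 elements; real libraries (CPython's Timsort, for instance) tune the cutoff empirically and do considerably more:

```python
def insertion_sort(a, lo, hi):
    """Sort a[lo:hi] in place; cheap for short runs."""
    for i in range(lo + 1, hi):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key


def merge_sort(a, lo=0, hi=None, cutoff=32):
    """Sort a[lo:hi] in place, switching to insertion sort below `cutoff`."""
    if hi is None:
        hi = len(a)
    if hi - lo <= cutoff:               # small subarray: skip the recursion
        insertion_sort(a, lo, hi)
        return
    mid = (lo + hi) // 2
    merge_sort(a, lo, mid, cutoff)
    merge_sort(a, mid, hi, cutoff)
    merged, i, j = [], lo, mid          # standard two-way merge
    while i < mid and j < hi:
        if a[i] <= a[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(a[j])
            j += 1
    merged.extend(a[i:mid])
    merged.extend(a[j:hi])
    a[lo:hi] = merged


data = [5, 2, 9, 1, 5, 6] * 10
expected = sorted(data)
merge_sort(data)
assert data == expected
```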
| ▲ | graycat 5 days ago | parent [-] |
As in one of the volumes of Knuth's The Art of Computer Programming, heapsort is in place and achieves the Gleason bound, so it can be regarded as not beaten by quicksort.
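For reference, a compact in-place heapsort along those lines: no recursion and O(1) extra space. This is a generic textbook formulation, not Knuth's exact presentation.

```python
def heapsort(a):
    """Sort the list a in place using an implicit binary max-heap."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until a[root:end] satisfies the max-heap property.
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1              # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for start in range(n // 2 - 1, -1, -1):   # heapify
        sift_down(start, n)
    for end in range(n - 1, 0, -1):           # move the current max to its final slot
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)


xs = [3, 1, 4, 1, 5, 9, 2, 6]
heapsort(xs)
assert xs == [1, 1, 2, 3, 4, 5, 6, 9]
```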
| ▲ | MrBuddyCasino 6 days ago | parent | prev [-] |
Electron a) isn't really slow for what it does, b) introduces many layers of abstraction, leading to larger memory consumption compared to "native" apps, and c) is certainly not algorithmically unoptimized: it runs on a software stack that has been tuned like few others, using billions of dollars.

You just loosely associated words and concepts that occupy a similar emotional niche.
|