hvb2 4 days ago

This feels backwards. When you have a good understanding of data structures you have the luxury of testing.

If you focus on testing over data structures, you might end up testing something that you didn't need to test because you used the wrong data structures.

IMHO, too often people don't consider big O because it works fine with their 10-row test case... and then it grinds to a halt when given a real problem.
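A minimal sketch of the "works on 10 rows, dies on real data" pattern. The function names are my own for illustration; the point is that a list membership check is an O(n) scan, so the loop is accidentally quadratic, while swapping in a set makes the same loop linear:

```python
# Accidentally quadratic: `row not in seen` scans the whole list,
# so a 10-row test passes instantly but 1,000,000 rows crawl.
def dedupe_slow(rows):
    seen = []
    out = []
    for row in rows:
        if row not in seen:   # O(n) scan per row -> O(n^2) total
            seen.append(row)
            out.append(row)
    return out

# Same behavior, right data structure: set membership is O(1) average.
def dedupe_fast(rows):
    seen = set()
    out = []
    for row in rows:
        if row not in seen:   # O(1) average lookup -> O(n) total
            seen.add(row)
            out.append(row)
    return out
```

Both return the same result; only the scaling differs, which is exactly what a small test case won't show.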

cogman10 4 days ago | parent | next [-]

That wasn't the thrust of the article.

The article is saying that it's more important to write tests than to learn how to implement data structures. It specifically says you should learn which data structures to use, but not focus on knowing how to implement all of them.

It calls out, specifically, that you should know that `sort` exists but you really don't need to know how to implement quicksort vs selection sort.

hvb2 4 days ago | parent [-]

No, it says learn data structures first, then focus on testing.

You don't have to go super deep on all the sort algorithms, sure. That's like saying learning testing implies writing your own mocking library.

MrJohz 4 days ago | parent [-]

I think the issue is that most DSA curricula do go super deep on things like sort algorithms or linked lists or whatever else. Whereas testing is usually barely taught in universities or colleges, and when it is taught, it's usually very lightweight.

jancsika 4 days ago | parent | prev | next [-]

> IMHO, too often people don't consider big O because it works fine with their 10-row test case... and then it grinds to a halt when given a real problem.

Not if the user can, say, farm 1,000,000 different rows 100 times over an hour and a half while gossiping with their office mates. I offer Excel as Exhibit A.

matheusmoreira 4 days ago | parent | prev [-]

> too often people don't consider big O because it works fine with their 10-row test case... and then it grinds to a halt when given a real problem

The reverse also happens frustratingly often. One could spend a lot of time obsessing over theoretical complexity only for it to amount to nothing. One might carefully choose a data structure and algorithm based on these theoretical properties and discover that in practice they get smoked by dumb contiguous arrays just because they fit in caches.

The sad fact is that the sheer brute force of modern processors is enough in the vast majority of cases, so long as people avoid accidentally making things quadratic.
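A classic instance of "accidentally quadratic" that brute force won't save you from (the helper names are mine, for illustration): building a string by repeated concatenation copies the whole accumulated string each iteration, while `str.join` does one pass.

```python
# Accidentally quadratic: each `s + p` copies everything built so far,
# so total work grows as O(n^2) in the number of characters.
# (CPython sometimes optimizes `+=` in place, but you can't rely on it.)
def join_slow(parts):
    s = ""
    for p in parts:
        s = s + p
    return s

# Linear: str.join measures the pieces and builds the result once.
def join_fast(parts):
    return "".join(parts)
```

Both produce identical output; only the second stays fast when `parts` is millions of items.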

Sometimes people don't even do that and we get things such as the GTA5 dumpster fire.

https://news.ycombinator.com/item?id=26296339

janalsncm 4 days ago | parent [-]

Slightly related, in ML I write a lot of code which will be executed exactly once. Data analysis, creating a visualization, one-off ETL tasks.

There are a lot of times when I could spend mental energy writing "correct" code that trades off space for time, etc. Sometimes it's worth it, sometimes not. But it's better to spend an extra 30 seconds of CPU time running the code than an extra 10 minutes carefully crafting a function that no one will ever see again, or that someone will see but will find harder to understand. Simpler is better sometimes.

What Big O gives you is an ability to assess the tradeoffs. Computers are fast so a lot of times quadratic time doesn’t matter for small N. And you can always optimize later.
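The "assess the tradeoffs" point can be made with a back-of-envelope calculation. Assuming a loose, illustrative figure of roughly 10^8 simple operations per second (not a benchmark of any particular machine):

```python
# Rough estimate of when quadratic time starts to hurt,
# assuming ~1e8 simple operations per second (illustrative only).
OPS_PER_SEC = 1e8

def quadratic_seconds(n):
    """Estimated wall time for an O(n^2) pass over n items."""
    return n * n / OPS_PER_SEC

print(quadratic_seconds(1_000))    # ~0.01 s: nobody notices
print(quadratic_seconds(100_000))  # ~100 s: now it matters
```

That's the whole value of the notation here: a ten-second estimate tells you whether quadratic is fine for your N or whether you need to optimize now rather than later.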