zeta0134 | 8 hours ago
The need to use optimal patterns didn't go away, but the techniques certainly did. Just as a quick example, it's usually a bad idea now to use lookup tables to accelerate small math workloads. The lookup table creates memory pressure on the cache, which ends up degrading performance on modern systems. Back in the 1980s, lookup tables were by far the dominant technique because math was *slow.*
zozbot234 | 3 hours ago
> Back in the 1980s, lookup tables were by far the dominant technique because math was slow.

This actually generalizes in a rather clean way: compared to the 1980s, you now want to cheaply compress data in memory and use succinct representations as much as practicable, since the extra compute involved in translating a more succinct representation into real data is practically free compared to even one extra cacheline fetch from RAM (which is now hundreds of cycles latency, and in parallel code often has surprisingly low throughput).
jacquesm | 3 hours ago
The way to approach this is to benchmark and then pick the best solution.