chlobunnee | 4 days ago
I built a calculator to help researchers and engineers pick the right GPUs for training and inference workloads! It compares GPU options by taking in simple parameters (number of transformer layers, token size, etc.) and telling users which GPUs are compatible with their workload, plus how efficient each one is for training vs. inference.

The idea came from talking with ML researchers frustrated by slow cluster queues or by wasting money on overkill GPUs.

Some things I'm thinking about incorporating next:

> Letting users directly compare 2 GPUs and their specs

> Letting users see whether a fraction of a GPU can handle their workload

I'd love feedback on what you feel is missing or confusing. Thanks!
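For readers curious what a compatibility check like this might look like under the hood, here is a minimal sketch. It is not the author's actual tool: the memory formula, function names, and GPU list are all illustrative assumptions (fp16 weights for inference; weights + gradients + fp32 Adam optimizer states for training, a common rough rule of thumb).

```python
# Illustrative sketch of a GPU-compatibility check, NOT the linked tool.
# Assumption: VRAM need is dominated by weights (inference) or by
# weights + gradients + Adam optimizer states (training).

def estimate_vram_gb(n_params_b: float, training: bool,
                     bytes_per_param: int = 2) -> float:
    """Rough VRAM need in GB for a model with n_params_b billion params."""
    weights_gb = n_params_b * bytes_per_param  # 1e9 params * bytes / 1e9
    if not training:
        return weights_gb
    # gradients (same size as weights) + fp32 Adam moments (~8 bytes/param)
    return weights_gb + weights_gb + n_params_b * 8


def compatible_gpus(required_gb: float, gpus: dict[str, float]) -> list[str]:
    """Return the GPUs whose VRAM meets or exceeds the estimate."""
    return [name for name, vram in gpus.items() if vram >= required_gb]


gpus = {"RTX 4090": 24, "A100 80GB": 80, "H100 80GB": 80}
need = estimate_vram_gb(7, training=False)  # 7B model, fp16 inference ~14 GB
print(compatible_gpus(need, gpus))
```

Activation memory, batch size, and sequence length would shift these numbers a lot in practice, which is presumably where the calculator's extra parameters (layer count, token size) come in.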