hchja a day ago:
This is pretty useless in any case that doesn’t involve BFloat16 models.
spindump8930 21 hours ago:
bf16 is the de facto default datatype and distribution format for LLM weights, which are then often eagerly quantized by users with more limited hardware. See the recent Llama releases and, e.g., the H100 spec sheet (the advertised FLOPS and metrics target bf16).
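A minimal sketch of the workflow described above, assuming the Hugging Face transformers and bitsandbytes APIs; the checkpoint name and 4-bit settings are illustrative, not taken from the thread:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Hypothetical checkpoint chosen for illustration; recent Llama releases
# ship their weights in bf16.
model_id = "meta-llama/Llama-3.1-8B"

# Load the model as distributed: bf16 weights, ~2 bytes per parameter.
model_bf16 = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16
)

# What users with limited hardware often do instead: quantize to 4-bit on
# load, while keeping bf16 as the compute dtype for the matmuls.
quant_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model_4bit = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=quant_cfg
)
```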
throwaway314155 21 hours ago:
So an increasingly small number of cases?