polotics 18 hours ago:
As a sizable share of the market is going to want to use this for local LLMs, I do not think this is that misleading.
bigyabai 7 hours ago (in reply):
Most people I know are not using TinyGrad for inference, but CUDA or Vulkan (neither of which are provided here).