I mean, if you’re buying it just as an LLM inference server, it’s not; but most people already have laptops, in which case it’s practically free.