dragonwriter 2 days ago

> Trying to run open weight models to do inference is something 99% of people around the world can't do because it's expensive and technically challenging and the results are poor compared to the main companies.

Just because a model is open doesn't mean there aren't services that will run it for you. Such services also won't share the limits that the commercial model vendors impose to fight distillation, because neither the host nor the model creator cares if you are using the service to distill the model.

Many users of open models, particularly the larger ones, now use such services rather than running the models on their own local or cloud compute.
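For concreteness, here is a minimal sketch of what calling such a hosted open-weight model typically looks like. Many hosts expose an OpenAI-compatible `/v1/chat/completions` endpoint; the base URL, API key, and model name below are placeholders, not any specific provider's real values.

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Assemble the URL, headers, and JSON body for an
    OpenAI-style chat completion request to a hosting service."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "https://example-host.invalid",    # placeholder host
    "sk-placeholder",                  # placeholder key
    "some-open-weight-model",          # placeholder model id
    [{"role": "user", "content": "Hello"}],
)
# Send with urllib.request or any HTTP client; responses follow the
# familiar {"choices": [{"message": {...}}]} convention.
```

The point is that, from the user's side, this is indistinguishable from calling a commercial vendor's API: no local GPUs, no model weights on disk, just an HTTP request to whoever is hosting the open model.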