lovelearning 2 hours ago

It's _relatively_ democratic when compared to these counterfactual gatekeeping scenarios:

- What if these centralized providers had restricted their LLMs to a small set of corporations / nations / qualified individuals?

- What if Google, which invented the core transformer architecture, had kept the research paper to itself instead of publishing it openly?

- What if the universities and corporations that developed concepts like the attention mechanism, so essential to Google's paper, had instead kept them to themselves?

- What if the base models, recipes, datasets, and frameworks for training our own LLMs had never been open-sourced and published by Meta, Alibaba, DeepSeek, Mistral, and many others?