ares623 · 3 hours ago:
How is having 2-3 centralized providers of this new technology "democratization"?
lovelearning · 2 hours ago:
It's _relatively_ democratic when compared to these counterfactual gatekeeping scenarios:

- What if these centralized providers had restricted their LLMs to a small set of corporations, nations, or qualified individuals?
- What if Google, which invented the core transformer architecture, had kept the research paper to itself instead of publishing it openly?
- What if the universities and corporations that worked on concepts like the attention mechanism, so essential to Google's paper, had instead gatekept them?
- What if the base models, recipes, datasets, and frameworks for training our own LLMs had never been open-sourced and published by Meta, Alibaba, DeepSeek, Mistral, and many more?
satvikpendem · 3 hours ago:
There are lots of open-weight models.