xg15 7 hours ago

> However, this technology is far too important to be in the hands of a few companies.

I worry less about model access and more about the hardware required to run those models (i.e. do inference). If a) the only way to compete in software development in the future is to outsource the entire implementation process to one of a few frontier models (Chinese, US or otherwise) and b) only a few companies worldwide have the GPU power to run inference with those models in a reasonable time, then don't we already have a massive amount of centralization?

That is also something I keep wondering about with agentic coding - being able to realize the epic fantasy hobby project you've been thinking about on and off for the last few years in a couple of afternoons is absolutely amazing. But if you do the same with work projects, how do you solve the data protection issues? Will we all now just hand our entire production codebases to OpenAI or Anthropic etc. and hope their pinky promises hold? Or will there be a race for medium-sized companies to build their own GPU datacenters, not for production but solely for internal development and code generation?
> However, this technology is far too important to be in the hands of a few companies. I worry less about the model access and more about the hardwire required to run those models (i.e. do inference). If a) the only way to compete in software development in the future is to outsource the entire implementation process to one of a few frontier models (Chinese, US or otherwise) and b) only a few companies worldwide have the GPU power to run inference with those models in a reasonable time then don't we already have a massive amount of centralization? That is also something I keep wondering with agentic coding - being able to realize your epic fantasy hobby project you've on and off been thinking about for the last years in a couple of afternoons is absolutely amazing. But if you do the same with work projects, how do you solve the data protection issues? Will we all now just hand our entire production codebases to OpenAI or Anthropic etc and hope their pinky promises hold? Or will there be a race for medium-sized companies to have their own GPU datacentets, not for production but solely for internal development and code generation? | ||