lagosfractal42 5 hours ago

GPUs have massive applications such as AlphaFold, CRISPR, medical imaging, and meteorology.

The massive planetary investment is not to make more AI chats that summarize text. That's just short-sighted.

counters 4 hours ago | parent | next [-]

> Meteorology

It seems like that at first glance. But in reality, GPUs have had extremely slow adoption for real-world operational meteorology applications. Because of the fundamental design and architecture of most numerical weather prediction (NWP) systems, it was very difficult to leverage GPUs as compute accelerators; most efforts barely eked out any performance gains once you account for host/device memory transfers. It really wasn't until some groups started to design new weather modeling systems from the ground up that they could architect things in such a way that GPUs made a significant difference.
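
To make the transfer-overhead point concrete, here's a minimal CUDA sketch (illustrative only, not from any real NWP codebase; the grid size, kernel, and launch parameters are assumptions): it offloads one small relaxation step and copies the field to and from the device around it, the pattern a lot of legacy hotspot-offload efforts used, then reports where the milliseconds go. On typical hardware the two PCIe copies dwarf the kernel itself, which is why the ground-up redesigns keep model state resident on the GPU instead of shuttling it back every step.

    // Illustrative sketch: one "offloaded timestep" = copy in, compute, copy out.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // Toy 3-point relaxation, standing in for a small offloaded hotspot.
    __global__ void relax(const float* in, float* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i > 0 && i < n - 1)
            out[i] = 0.25f * in[i - 1] + 0.5f * in[i] + 0.25f * in[i + 1];
    }

    int main() {
        const int n = 1 << 22;                       // ~4M grid points (assumed size)
        std::vector<float> h(n, 1.0f);
        float *d_in, *d_out;
        cudaMalloc((void**)&d_in,  n * sizeof(float));
        cudaMalloc((void**)&d_out, n * sizeof(float));

        cudaEvent_t t0, t1, t2, t3;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventCreate(&t2); cudaEventCreate(&t3);

        // Time each phase of the host -> device -> host round trip.
        cudaEventRecord(t0);
        cudaMemcpy(d_in, h.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaEventRecord(t1);
        relax<<<(n + 255) / 256, 256>>>(d_in, d_out, n);
        cudaEventRecord(t2);
        cudaMemcpy(h.data(), d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaEventRecord(t3);
        cudaEventSynchronize(t3);

        float h2d, kern, d2h;
        cudaEventElapsedTime(&h2d,  t0, t1);
        cudaEventElapsedTime(&kern, t1, t2);
        cudaEventElapsedTime(&d2h,  t2, t3);
        printf("H2D %.3f ms | kernel %.3f ms | D2H %.3f ms\n", h2d, kern, d2h);
        // On most hardware the two copies take far longer than this memory-bound
        // kernel, so the "accelerated" step is no faster end-to-end.

        cudaFree(d_in); cudaFree(d_out);
        return 0;
    }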

Obviously AI / ML weather modeling is a different story.

munk-a 4 hours ago | parent | prev | next [-]

As someone working in a field that has used NLP for quite some time - yeah, I generally agree that those investments are worth their weight in gold... which is unfortunate, because before ChatGPT came along they were viewed as niche, unprofitable money-sinks. The astronomical investments we've seen lately have been in general models, which can be leveraged to outperform some of our older models; but had we wanted purely to improve those models, there were much more efficient ways to do so.

Hopefully we can retain a lot of this value when the bubble bursts, but I just haven't seen any really good success stories of converting these models into businesses. If you try to build as a middleman who leverages a model to solve someone's problem, they can always just go to the model runner and get the same results for cheaper - and the model runners seem (so far - this may change) to be unable to price model usage at a level that actually makes it sustainable.

Those older models running specialized tasks seem to be trucking along just fine for now - but I remain concerned that when the bubble bursts it's going to starve these necessary investments of capital.

foobarian 4 hours ago | parent [-]

> converting these models into businesses.

I think it's pretty clear to all the big operators that they will need to go whole hog into ads and take some of the Google/Meta pie. It's just a matter of time.

KalMann 5 hours ago | parent | prev | next [-]

You're missing the point. Those kinds of narrow AI applications are not the motivation for the trillions of dollars being poured into AI. Of course AI has a variety of applications across many disciplines, as it has for decades. The motivation behind the massive investment in AI is, as forgetfulness said, to reap the benefits of "revolutionizing the workplace".

ares623 5 hours ago | parent | prev | next [-]

That's copium, as the kids say nowadays. The massive planetary investment is 100% for AI chats. All those other things are taking the crumbs where they can.

chickensong 4 hours ago | parent [-]

Big business and government aren't buying supercomputer clusters and licensing models to run chats.

munk-a 3 hours ago | parent [-]

The really weird thing is that Big Business actually is buying supercomputer clusters to do just that. I can't really speak to the government side, but a lot of businesses' early forays into AI were just slapping a chatbot on their product and hoping it'd attract a lot more business. I also think you'd be surprised how integrated really dumb chatbots are into business communication these days.

I think most smart people are looking seriously at different models to try to improve the accuracy of any existing ML uses they have in their business, but the new features built post-ChatGPT often tend to just be fancied-up chats.

badlogic 13 minutes ago | parent | next [-]

I can speak for the government side in my European home country: they too are buying GPUs for chat ...

chickensong 2 hours ago | parent | prev [-]

> just slapping a chatbot on their product

That's happening, of course, but it's not really the whole picture. Any org that already invests in R&D is likely considering or already integrating modern AI tech into its existing infrastructure. A big oil or pharmaceutical or materials company likely doesn't care much about chatbots, or any customer-facing tech for that matter.

refactor_master 33 minutes ago | parent [-]

Actually, big orgs are doing exactly that: slapping a chatbot onto their support ticket backlog. Being really, actually "data-driven" is hard, and must happen from the bottom up. So instead there are chatbots in their frontend and support backend, but the backend doing the actual lifting probably hasn't changed one bit.

hyperbovine 4 hours ago | parent | prev [-]

Eh, those applications (incl. protein folding) existed for a decade-plus before LLMs came onto the scene, and there was absolutely nothing like the scale of capex that we're seeing right now. It's like literally 100-1000x larger than what GPU hosting providers were spending previously.