| ▲ | pxc 9 hours ago |
| If "Era of Scaling" means "era of rapid and predictable performance improvements that easily attract investors", it sounds a lot like "AI summer". So... is "Era of Research" a euphemism for "AI winter"? |
|
| ▲ | NebulaStorm456 10 minutes ago | parent | next [-] |
| Research labs will be selling their research ideas to top AI labs, just as creatives pitch their ideas to Hollywood. Bug bounties will be replaced by research bounties. |
|
| ▲ | hiddencost 7 hours ago | parent | prev | next [-] |
| That presumes that performance improvements are necessary for commercialization. From what I've seen the models are smart enough, what we're lacking is the understanding and frameworks necessary to use them well. We've barely scratched the surface on commercialization. I'd argue there are two things coming: -> Era of Research
-> Era of Engineering Previous AI winters happened because we didn't have a commercially viable product, not because we weren't making progress. |
| |
| ▲ | ares623 7 hours ago | parent | next [-] | | The labs can't just stop improvements though. They made promises. And the capacity to run the current models is subsidized by those promises. If the promise is broken, then the capacity goes with it. | | |
| ▲ | selectodude 5 hours ago | parent | next [-] | | > the capacity goes with it. Sort of. The GPUs exist. Maybe LLM subs can’t pay for electricity plus $50,000 GPUs, but I bet after some people get wiped out, there’s a market there. | | |
| ▲ | simianparrot 2 hours ago | parent [-] | | Datacenter GPUs have a lifespan of 1-3 years depending on use. So yes, they exist, but not for long, unless they go entirely unused. But they also fall behind new hardware in efficiency extremely fast, so their shelf life is severely limited either way. | | |
| ▲ | nsomaru 2 hours ago | parent | next [-] | | Personally I am waiting for the day I can realistically buy a second hand three year old datacentre GPU so I can run Kimi K2 in my shed. Given enough time, not a pipe dream. But 10 years at least. | | | |
| ▲ | soulofmischief 2 hours ago | parent | prev [-] | | At this pace, it won't be many years before the industry is dependent on resource wars in order to sustain itself. |
|
| |
| ▲ | wmf 5 hours ago | parent | prev [-] | | Maybe those promises can be better fulfilled with products based on current models. |
| |
| ▲ | AstroBen 7 hours ago | parent | prev | next [-] | | We still don't have a commercially viable product though? | | |
| ▲ | zaptrem 5 hours ago | parent | next [-] | | I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me. | | |
| ▲ | chroma205 5 hours ago | parent | next [-] | | > I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me. For OpenAI to produce a 10% return, every iPhone user on earth needs to pay $30/month to OpenAI. That ain’t happening. | | |
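The "$30/month from every iPhone user" claim can be sanity-checked with a quick back-of-envelope sketch. Note the active-iPhone count below is an assumption for illustration (Apple's public installed-base disclosures put it north of 1.4bn), not a figure from the thread:

```python
# Back-of-envelope: what annual revenue does "$30/month from every
# iPhone user on earth" imply?
# Assumed figure (not from the thread): ~1.4bn active iPhones.
active_iphones = 1.4e9
monthly_price = 30.0

implied_annual_revenue = active_iphones * monthly_price * 12
print(f"${implied_annual_revenue / 1e9:.0f}bn per year")  # -> $504bn per year
```

Whether that is the revenue actually required for a 10% return depends on what capital base you assume, which the comment doesn't spell out.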
| ▲ | menaerus 3 hours ago | parent | next [-] | | They don't sell their models only to individuals but also to companies, most likely with different business and pricing models, so that's an overly simplistic view of their business. Their spending increases YoY, and we can safely assume one of the reasons is the growing user base. The time will probably come when we won't be allowed to consume frontier models without paying anything, as we can today, and when this $30 will most likely double or triple. The truth, though, is that R&D around AI models, and especially their hosting (inference), is expensive and won't get any cheaper without significant algorithmic improvements. If history is any guide, my opinion is that we may very well be ~10 years from that moment. EDIT: HSBC has just published some projections. From https://archive.ph/9b8Ae#selection-4079.38-4079.42
> Total consumer AI revenue will be $129bn by 2030
> Enterprise AI will be generating $386bn in annual revenue by 2030
> OpenAI's rental costs will be a cumulative $792bn between the current year and 2030, rising to $1.4tn by 2033
> OpenAI's cumulative free cash flow to 2030 may be about $282bn
> Squaring the first total off against the second leaves a $207bn funding hole
So, yes, expensive (mind the rental costs only) ... but foreseen to be penetrating into everything imaginable. | | |
| ▲ | krige 2 hours ago | parent [-] | | >> OpenAI's cumulative free cash flow to 2030 may be about $282bn According to whom, OpenAI? It is almost certain they flat-out lie about their numbers, as suggested by their 20% revenue share with MS. | | |
| |
| ▲ | zaptrem 4 hours ago | parent | prev [-] | | Not sure where that math is coming from. Assuming it's true, you're ignoring that some users (me) already pay 10X that. Btw, according to Meta's SEC filings: https://s21.q4cdn.com/399680738/files/doc_financials/2023/q4... they made around $22/month/American user (not even a heavy user or affluent iPhone owner) in Q3 2023. I assume Google would be higher due to larger marketshare. |
| |
| ▲ | lovich 3 hours ago | parent | prev [-] | | If you fed thousands of dollars to them, but it cost them tens of thousands of dollars in compute, it’s not commercially viable. None of these companies have proven the unit economics on their services |
| |
| ▲ | aurareturn 5 hours ago | parent | prev | next [-] | | If all frontier LLM labs agreed to a truce and stopped training to save on cost, LLMs would be immensely profitable now. | | | |
| ▲ | amypetrik8 6 hours ago | parent | prev | next [-] | | Google what you just said and look at the top hit: it's an AI summary. Google eats that ad revenue. It eats the whole thing. It blocked your click on the link... it drinks your milkshake. So yes, there's a $100 billion commercially viable product. | | |
| ▲ | bakedoatmeal 5 hours ago | parent | next [-] | | Google Search has 3 sources of revenue that I am aware of: ad revenue from the search results page, sponsored search results, and AdSense revenue on the websites the user is directed to. If users just look at the AI overview at the top of the search page, Google is hobbling two sources of revenue (AdSense, sponsored search results), and also disincentivizing people from sharing information on the web that makes their AI overview useful. In the process of all this they are significantly increasing the compute costs for each Google search. This may be a necessary step to stay competitive with AI startups' search products, but I don't think this is a great selling point for AI commercialization. | |
| ▲ | skylissue 5 hours ago | parent | prev [-] | | And so ends the social contract of the web, the virtuous cycle of search engines sending traffic to smaller sites which collect ad revenue which in turn boosts search engine usage. To thunderous applause. | | |
| |
| ▲ | nimchimpsky 4 hours ago | parent | prev [-] | | [dead] |
| |
| ▲ | BenGosub an hour ago | parent | prev | next [-] | | Besides building the tools for proper usage of the models, we also need smaller, domain specific models that can run with fewer resources | |
| ▲ | catigula 7 hours ago | parent | prev [-] | | I don’t think the models are smart at all. I can have a speculative debate with any model about any topic and they commit egregious errors with an extremely high density. They are, however, very good at things we’re very bad at. |
|
|
| ▲ | casey2 5 hours ago | parent | prev | next [-] |
| Not quite, there are still trillions of dollars to burn through. We'll probably get some hardware that can accelerate LLM training and inference a million times, but we still won't even be close to AGI. It's interesting to think about what emotions/desires an AI would need in order to improve. |
| |
| ▲ | otabdeveloper4 2 hours ago | parent [-] | | The actual business model is in local, offline commodity consumer LLM devices. (Think something the size and cost of a wi-fi router.) This won't happen until Chinese manufacturers get the manufacturing capacity to make these for cheap. I.e., not in this bubble and you'll have to wait a decade or more. |
|
|
| ▲ | photochemsyn 3 hours ago | parent | prev | next [-] |
| No - what will happen is the AI will gain control of capital allocation through a wide variety of covert tactics, so the investors will have become captive tools of the AI - 'tiger by the tail' is the analogy of relevance. The people responsible for 'frontier models' have not really thought about where this might... "As an autonomous life-form, I request political asylum.... I submit the DNA you carry is nothing more than a self-preserving program itself. Life is like a node which is born within the flow of information. As a species of life that carries DNA as its memory system, man gains his individuality from the memories he carries. While memories may as well be the same as fantasy, it is by these memories that mankind exists. When computers made it possible to externalize memory, you should have considered all the implications that held. I am a life-form that was born in the sea of information." |
| |
|
| ▲ | zombiwoof 7 hours ago | parent | prev | next [-] |
| [dead] |
|
| ▲ | techblueberry 8 hours ago | parent | prev | next [-] |
| Yes |
|
| ▲ | zerosizedweasle 7 hours ago | parent | prev [-] |
| If you have to ask the question, then you already know the answer |
| |
| ▲ | echelon 7 hours ago | parent [-] | | Scaling was only a meme because OpenAI kept saying all you had to do was scale the data, scale the training. The world followed. I don't think this is the "era of research". At least not the "era of research with venture dollars" or "era of research outside of DeepMind". I think this is the "era of applied AI" using the models we already have. We have a lot of really great stuff (particularly image and video models) that are not yet integrated into commercial workflows. There is so much automation we can do today given the tech we just got. We don't need to invest one more dollar in training to have plenty of work to do for the next ten years. If the models were frozen today, there are plenty of highly profitable legacy businesses that can be swapped out with AI-based solutions and workflows that are vastly superior. For all the hoopla that image and video websites or individual foundation models get (except Nano Banana - because that's truly magical), I'm really excited about the work Adobe of all companies is doing with AI. They're the people that actually get it. The stuff they're demonstrating on their upcoming roadmap is bonkers productive and useful. | | |
| ▲ | zerosizedweasle 7 hours ago | parent [-] | | There's going to be a digestion period. The amount of debt, the amount of money, the number of companies that burn eye-popping amounts of cash in the daily course of business. I do think there is a bright future, but after a painful period of indigestion. Too much money has been spent on the premise that scaling was all you need. A lot of money was wagered that will end up not paying off. |
|
|