| ▲ | jt2190 11 hours ago |
| Last paragraph is informative: > Anthropic relies heavily on a combination of chips designed by Amazon Web Services known as Trainium, as well as Google’s in-house designed TPU processors, to train its AI models. Google largely uses its TPUs to train Gemini. Both chips represent major competitive threats to Nvidia’s best-selling products, known as graphics processing units, or GPUs. So which leading AI company is going to build on Nvidia, if not OpenAI? |
|
| ▲ | paxys 11 hours ago | parent | next [-] |
| "Largely" is doing a lot of heavy lifting here. Yes Google and Amazon are making their own GPU chips, but they are also buying as many Nvidia chips as they can get their hands on. As are Microsoft, Meta, xAI, Tesla, Oracle and everyone else. |
| |
| ▲ | greiskul 10 hours ago | parent | next [-] | | But is Google buying those GPUs for their own use, or to have them in their data centers for their cloud customers? | | |
| ▲ | dekhn 10 hours ago | parent | next [-] | | Google buys Nvidia GPUs for Cloud; I don't think they use them much, if at all, internally. The TPUs are used both internally and in Cloud, and now it looks like they are delivering them into customers' own data centers. | | |
| ▲ | hansvm 9 hours ago | parent | next [-] | | When I was there a few years ago, we only got CPUs and GPUs for training. TPUs were in too high demand. | |
| ▲ | moralestapia 10 hours ago | parent | prev [-] | | I can see them being used for training if they're idle. |
| |
| ▲ | notyourwork 9 hours ago | parent | prev [-] | | Both. Internal teams are customers too. |
| |
| ▲ | bredren 9 hours ago | parent | prev [-] | | How about Apple? How is Apple training its next foundation models? | | |
| ▲ | consumer451 9 hours ago | parent | next [-] | | To use the parlance of this thread: "next" foundation models is doing a lot of heavy lifting here. Am I doing this right? My point is, does Apple have any useful foundation models? Last I checked they made a deal with OpenAI, no wait, now with Google. | | |
| ▲ | wmf 8 hours ago | parent | next [-] | | Apple does have their own small foundation models but it's not clear they require a lot of GPUs to train. | | |
| ▲ | consumer451 6 hours ago | parent | next [-] | | Do you mean like OCR in photos? In that case, yes, I didn't think about that. Are there other use cases aside from speech-to-text in Siri? | | | |
| ▲ | 6 hours ago | parent | prev [-] | | [deleted] |
| |
| ▲ | system2 9 hours ago | parent | prev [-] | | I think Apple is waiting for the bubble to deflate and will then do something different. And they have a ready-made user base to offer whatever they can make money from. | | |
| ▲ | amluto 7 hours ago | parent | next [-] | | If they were taking that approach, they would have absolutely first-class integration between AI tools and user data, complete with proper isolation for security and privacy and convenient ways for users to give agents access to the right things. And they would bide their time for the right models to show up at the right price with the right privacy guarantees. I see no evidence of this happening. | | |
| ▲ | irishcoffee 4 hours ago | parent [-] | | As an outsider, the only thing the two of you disagree on is timing. I probably side with the ‘time is running out’ team at the current juncture. |
| |
| ▲ | ymyms 8 hours ago | parent | prev | next [-] | | They apparently are working on, and are going to release, 2(!) different versions of Siri. Idk, that just screams "leadership doesn't know what to do and can't make a tough decision" to me. But who knows? Maybe two versions of Siri is what people will want. | | |
| ▲ | consumer451 8 hours ago | parent [-] | | Arena mode! Which reply do you prefer? /s But seriously, would one be for newer phone/tablet models, and one for older? | | |
| ▲ | pinnochio 7 hours ago | parent [-] | | It sounds like the first one, based on Gemini, will be a more limited version of the second ("competitive with Gemini 3"). IDK if the second is also based on Gemini, but I'd be surprised if that weren't the case. Seems like it's more a ramp-up than two completely separate Siri replacements. |
|
| |
| ▲ | aurareturn 7 hours ago | parent | prev [-] | | Apple can make more money from shorting the stock market, including their own stock, if they believe the bubble will deflate. |
|
| |
| ▲ | xvector 8 hours ago | parent | prev | next [-] | | Apple is sitting this whole thing out. Bizarre. | | |
| ▲ | vessenes 6 hours ago | parent | next [-] | | I mean, they tried. They just tried and failed. It may work out for them, though — two years ago it looked like lift-off was likely, or at least possible, so having a frontier model was existential. Today it looks like you might be able to save many billions by being a fast follower. I wouldn’t be surprised if the lift-off narrative comes back around though; we still have maybe a decade until we really understand the best business model for LLMs and their siblings. | | |
| ▲ | tonyedgecombe 5 hours ago | parent [-] | | I think you are right. Their generative AI was clearly underwhelming. They have been losing many staff from their AI team. I’m not sure it matters though. They just had a stonking quarter. iPhone sales are surging ahead. Their customers clearly don’t care about AI or Siri’s lacklustre performance. | | |
| ▲ | 9dev 3 hours ago | parent | next [-] | | > Their customers clearly don’t care about AI or Siri’s lacklustre performance. I would rather say their products didn’t lose value just for not getting an improvement there. Everyone agrees that Siri sucks, but I’m pretty sure they tried to replace it with a natural-language version built from the ground up and realised it just didn’t work out yet: yes, they have a bad, but at least kinda-working, voice assistant with lots of integrations into other apps. Replacing that with something that promises to do stuff and then does nothing, takes too long to respond, and has fewer integrations due to the lack of keywords would have been a bad idea while the technology wasn’t there yet. | |
| ▲ | irishcoffee 4 hours ago | parent | prev [-] | | Honestly, what it seems like is financial discipline. | | |
|
| |
| ▲ | catdog 3 hours ago | parent | prev | next [-] | | Well, they tried and they failed. In that case maybe the smartest move is not to play. It looks like the technology is largely turning into a commodity in the long run anyway, so sitting this out and letting others make the mistakes first might not be the worst of all ideas. |
| ▲ | runako 6 hours ago | parent | prev | next [-] | | This whole thread is about whether the most valuable startup of all time will be able to raise enough money to see the next calendar year. It's definitely rational to decide to pay wholesale for LLMs given: - consumer adoption is unclear. The "killer app" for OS integration has yet to ship from any vendor. - owning SOTA foundation models can put you into a situation where you need to spend $100B with no clear return. This money gets spent up front regardless of how much value consumers derive from the product, or whether they even use it at all. This is a lot of money! - as Apple has "missed" the last couple of years of the AI craze, there have been no meaningful ill effects on their business. Beyond the tech press, nobody cares yet. |
| ▲ | cs_sorcerer 7 hours ago | parent | prev | next [-] | | From a technology standpoint, I don’t feel Apple’s core competency is in foundation AI models |
| ▲ | random_duck 7 hours ago | parent | prev [-] | | They might know something? | | |
| ▲ | leptons 7 hours ago | parent [-] | | More like they don't know the things others do. Siri is a laughing stock. |
|
| |
| ▲ | downrightmike 8 hours ago | parent | prev [-] | | They are in-housing their AI to sell it as a secure way to do AI, which 100% puts them in the lead for the foreseeable future. |
|
|
|
| ▲ | mcintyre1994 3 hours ago | parent | prev | next [-] |
| That’s interesting, I didn’t know that about Anthropic. I guess it wouldn’t really make sense to compete with OpenAI and everyone else for Nvidia chips if they can avoid it. |
|
| ▲ | wmf 11 hours ago | parent | prev | next [-] |
| OpenAI will keep using Nvidia GPUs but they may have to actually pay for them. |
|
| ▲ | Morromist 11 hours ago | parent | prev | next [-] |
| Nvidia had the chance to build its own AI software and chose not to. It has been a good choice so far - better to sell shovels than to go to the mines - but they could still go mining if the other miners start making their own shovels. If I were Nvidia, I would be hedging my bets a little. OpenAI looks like it's on shaky ground; it might not be around in a few years. |
| |
| ▲ | vessenes 6 hours ago | parent | next [-] | | They do build their own software, though. They have a large body of stuff they make. My guess is that it’s done to stay current, inform design and performance, and to have something to sell enterprises along with the hardware; they have purposely not gone after large consumer markets with their model offerings as far as I can tell. | |
| ▲ | snypher 11 hours ago | parent | prev | next [-] | | Another comment had this: https://blogs.nvidia.com/blog/open-models-data-tools-acceler... Interesting times. | | | |
| ▲ | system2 8 hours ago | parent | prev [-] | | There is no way Nvidia could make from AI software even a fraction of what they are making now. |
|
|
| ▲ | dylan604 11 hours ago | parent | prev | next [-] |
| Would Nvidia investing heavily in ClosedAI dissuade others from using Nvidia? |
| |
|
| ▲ | raincole 10 hours ago | parent | prev | next [-] |
| Literally all the other companies that still believe they can be the leading ones one day? |
|
| ▲ | nick49488171 10 hours ago | parent | prev | next [-] |
| Maybe xAI/Tesla, Meta, Palantir |
|
| ▲ | rvz 2 hours ago | parent | prev | next [-] |
| It's almost as if everyone here was assuming that Nvidia would have no competition for a long time, but it has been known for a while that there are many competitors coming after their data center revenues. [0] > So which leading AI company is going to build on Nvidia, if not OpenAI? It's xAI. But what matters is that there is more competition for Nvidia, and they bought Groq to reduce that. OpenAI is building its own chips, as is Meta. The real question is this: What happens when the competition catches up with Nvidia and takes a significant slice out of their data center revenues? [0] https://news.ycombinator.com/item?id=45429514 |
|
| ▲ | lofaszvanitt 9 hours ago | parent | prev | next [-] |
| The moment you threaten NVDA's livelihood, your company starts to fall apart. History tells us as much. |
|
| ▲ | dfajgljsldkjag 11 hours ago | parent | prev [-] |
| The Chinese will probably figure out a way to sneak Nvidia chips around the sanctions. |
| |