elzbardico 11 hours ago

It is more like Apple has no need to spend billions on training with questionable ROI when it can just rent from one of the commodity foundation model labs.

nosman 9 hours ago | parent | next [-]

I don't know why people automatically jump to Apple's defense on this... They absolutely did spend a lot of money and hired people to try this. They 100% do NOT have the open, bottom-up culture needed to pull off large-scale AI and software projects like this.

Source: I worked there

elzbardico 9 hours ago | parent [-]

Well, they stopped.

Culture is overrated. Money talks.

They have done things far more complicated from an engineering perspective. I am far more impressed by what they accomplished alongside TSMC with Apple Silicon than by what the AI labs do.

tech-historian 7 hours ago | parent [-]

Is Apple silicon really that impressive compared to LLMs? Take a step back. CPUs have been getting faster and more efficient for decades.

Google invented the transformer architecture, the backbone of modern LLMs.

Terretta 4 hours ago | parent | next [-]

> Google invented...

"Google" did? Or humans who worked there and one who didn't?

https://www.wired.com/story/eight-google-employees-invented-...

In any case, see the section on Jakob Uszkoreit, for example, or Noam Shazeer. And then…

> In the higher echelons of Google, however, the work was seen as just another interesting AI project. I asked several of the transformers folks whether their bosses ever summoned them for updates on the project. Not so much. But “we understood that this was potentially quite a big deal,” says Uszkoreit.

Worth noting the value of “bosses” who leave people alone to try nutty things in a place where research has patronage. Places like universities, Xerox, or Apple and Google deserve credit for providing the petri dish.

xmcqdpt2 3 hours ago | parent | prev [-]

You can understand how transformers work from just reading the Attention is All You Need paper, which is 15 pages of pretty accessible DL. That's not the part that is impressive about LLMs.
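The paper's core equation, scaled dot-product attention, really does fit in a few lines. A minimal NumPy sketch (the function and variable names here are illustrative, not from any reference implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # how much each query matches each key
    weights = softmax(scores, axis=-1)   # each row is a probability distribution
    return weights @ V                   # weighted average of the value vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))  # 6 key/value positions
V = rng.normal(size=(6, 8))
out = attention(Q, K, V)
print(out.shape)  # one output vector per query: (4, 8)
```

Multi-head attention, positional encodings, and the feed-forward blocks layer on top of this, but the mechanism itself is this one matrix expression. The impressive (and expensive) part is scaling the training, not the math.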

aurareturn an hour ago | parent | prev [-]

It’s such a commodity that there are only 3 SOTA labs left and no one can catch them. I’m sure it’ll be consolidated further in the future and you’re going to be left with a natural monopoly or duopoly.

Apple has no control over the most important change to tech. They have ceded that control to Google.

qcnguy an hour ago | parent [-]

Four. You forgot xAI. And that's ignoring the Chinese labs.

aurareturn 27 minutes ago | parent | next [-]

Chinese labs aren’t SOTA due to lack of compute.

Yes, I forgot xAI. So 4 left. I'm betting that there will be one or two dominant ones in the next 10 years. Apple won't be one of them.
