Oras 4 hours ago

> But this is not an applied AI company.

There is absolutely no doubt about Yann's impact on AI/ML, but he had access to many more resources in Meta, and we didn't see anything.

It could be a management issue, though, and I sincerely wish we will see more competition, but from what I quoted above, it does not seem like it.

Understanding the world through videos (mentioned in the article) is just what video models have already done, and they are getting pretty good (see Seedance, Kling, Sora, etc). So I'm not quite sure how what he proposed would work.

torginus 2 hours ago | parent | next [-]

Most folks get paid a lot more in a corporate job than tinkering at home; by the 'follow the money' logic, it would make sense that they would produce their most inspired work as 9-5 full-stack engineers.

But passion and the freedom to explore are often more important than resources.

stein1946 3 hours ago | parent | prev | next [-]

> There is absolutely no doubt about Yann's impact on AI/ML, but he had access to many more resources in Meta, and we didn't see anything.

That's true for 99% of scientists, but dismissing their opinions because they haven't done world-shattering, groundbreaking research is probably not the way to go.

> I sincerely wish we will see more competition

I really hope we don't; science isn't markets.

> Understanding world through videos

The word "understanding" is doing a lot of heavy lifting here. I find myself prompting again and again for corrections on an image or a summary and "it" still does not "understand" and keeps doing the same thing over and over again.

nashadelic 37 minutes ago | parent | prev | next [-]

Your take is brutal but spot on

boccaff 4 hours ago | parent | prev | next [-]

Llama models pushed the envelope for a while, and having them "open-weight" allowed a lot of tinkering. I would say that most fine-tuned models evolved from work on top of Llama models.

oefrha 3 hours ago | parent [-]

Llama wasn’t Yann LeCun’s work and he was openly critical of LLMs, so it’s not very relevant in this context.

Source: himself https://x.com/ylecun/status/1993840625142436160 (“I never worked on any Llama.”) and a million previous reports and tweets from him.

alecco a few seconds ago | parent [-]

> My only contribution was to push for Llama 2 to be open sourced.

Quite a big contribution in practice.

YetAnotherNick 2 hours ago | parent | prev | next [-]

> we didn't see anything.

Is this a troll? Even if we just ignore Llama, Meta invented and released so much foundational research and open-source code. I would say that the computer vision field would be years behind if Meta hadn't published core research like DETR or MAE.

_giorgio_ 4 hours ago | parent | prev | next [-]

I can’t reconcile this dichotomy: most of the landmark deep learning papers were developed with what, by today’s standards, were almost ridiculously small training budgets — from Transformers to dropout, and so on.

So I keep wondering: if his idea is really that good — and I genuinely hope it is — why hasn’t it led to anything truly groundbreaking yet? It can’t just be a matter of needing more data or more researchers. You tell me :-D

samrus 3 hours ago | parent [-]

It's a matter of needing more time, which is a resource even SV VCs are scared to throw around. Look at the timeline of all these advancements and how long they took:

LeCun introduced backprop for deep learning back in 1989. Hinton published about contrastive divergence in next-token prediction in 2002. AlexNet was 2012. Word2vec was 2013. Seq2seq was 2014. AiAYN was 2017. UnicornAI was 2019. InstructGPT was 2022.

This makes a lot of people think that things are just accelerating and they can be along for the ride. But it's the years and years of foundational research that allow this to be done. That toll has to be paid for the successors of LLMs to be able to reason properly and operate in the world the way humans do. The sowing won't happen as fast as the reaping did. LeCun is there to plant those seeds; the others, who only want to eat the fruit, don't get that they have to wait.

_giorgio_ 16 minutes ago | parent [-]

If his ideas had real substance, we would have seen substantial results by now. He introduced I-JEPA in 2023, so almost three years ago at this point.

If he still hasn’t produced anything truly meaningful after all these years at Meta, when is that supposed to happen? Yann LeCun has been at Facebook/Meta since December 2013.

Your chronological sequence is interesting, but it refers to a time when the number of researchers and the amount of compute available were a tiny fraction of what they are today.

the_real_cher 4 hours ago | parent | prev [-]

He was suffocated by the corporate aspects of Meta, I suspect.