dzink 9 hours ago

“The long-term vision is: foundation models that acquire reasoning from fully synthetic data, then learn semantics from a small, curated corpus of natural language. This would help us build models that reason without inheriting human biases from inception.”

qsera 9 hours ago | parent [-]

I think this is a bit risky, because it assumes that all knowledge a human possesses about nature is acquired after birth.

But is that correct? I think organisms also come with a partial built-in understanding of nature at birth.

throw-qqqqq 8 hours ago | parent | next [-]

> I think organisms also come with a partial built-in understanding of nature at birth

I agree. Most organisms are quite pre-trained: they have “instincts” and natural behaviors.

E.g. newly hatched turtles know to crawl toward the ocean immediately upon hatching. They don’t learn that along the way.

It seems to me that most lifeforms come into this world pre-trained.

jamilton 8 hours ago | parent | prev [-]

I don’t think that assumption is being made; why do you think it is? In terms of the metaphor, training a model could correspond both to knowledge acquired after birth and to evolution itself. But I don’t think it’s particularly useful to keep reasoning in metaphors.