machinationu 7 hours ago

the issue is there is very little text from before the internet, so there aren't enough historical tokens to train a really big model

lm28469 24 minutes ago | parent | next [-]

> the issue is there is very little text from before the internet,

Hm, there is a lot of text from before the internet, but most of it is not on the internet. There is a weird gap in some circles because of that: people are rediscovering work from pre-1980s researchers that only exists in books that have never been reprinted and that virtually no one knows about.

concinds 2 hours ago | parent | prev | next [-]

And it's a 4B model. I worry that nontechnical users will dramatically overestimate its accuracy and underestimate its hallucinations, which makes me wonder how it could really be useful for academic research.

tgv 5 hours ago | parent | prev [-]

I think not everyone in this thread understands that. Someone wrote "It's a time machine" and followed up with "Imagine having a conversation with Aristotle."