stkai 4 hours ago

Would love to find out they're overfitting for pelican drawings.

andy_ppp 4 hours ago | parent | next [-]

Yes, Raccoon on a unicycle? Magpie on a pedalo?

throw310822 an hour ago | parent | next [-]

Correct horse battery staple:

https://claude.ai/public/artifacts/14a23d7f-8a10-4cde-89fe-0...

ta988 an hour ago | parent [-]

no staple?

iwontberude 30 minutes ago | parent [-]

it looks like a bodge wire

_kb 17 minutes ago | parent | prev [-]

Platypus on a penny farthing.

theanonymousone an hour ago | parent | prev | next [-]

Even if not intentionally, it is probably leaking into training sets.

fragmede 3 hours ago | parent | prev [-]

The estimation I did 4 months ago:

> there are approximately 200k common nouns in English, and if we square that, we get 40 billion combinations. At one second per combination, that's ~1,270 years, but if we parallelize it on a supercomputer that can do 100,000 per second, it would only take about 4.6 days. Given that ChatGPT was trained on all of the Internet and every book ever written, I'm not sure that still seems infeasible.

https://news.ycombinator.com/item?id=45455786
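The arithmetic in the quoted estimate can be checked with a quick sketch (the noun count and throughput figures are the comment's own rough assumptions):

```python
# Back-of-envelope check of the noun-pair estimate above.
common_nouns = 200_000              # rough count of common English nouns (assumed)
pairs = common_nouns ** 2           # ordered noun + noun combinations
print(pairs)                        # 40_000_000_000 -> 40 billion

seconds_per_year = 365 * 24 * 3600
print(pairs / seconds_per_year)     # ~1268 years at one combination per second

rate = 100_000                      # hypothetical parallel throughput per second
print(pairs / rate / 86_400)        # ~4.6 days, a bit more than the "3 days" quoted
```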

eli 3 hours ago | parent | next [-]

How would you generate a picture of Noun + Noun in the first place in order to train the LLM on what it would look like? What's happening during that 1 estimated second?

metalliqaz an hour ago | parent | next [-]

it's pelicans all the way down

Terretta 2 hours ago | parent | prev [-]

This is why everyone trains their LLM on another LLM. It's all about the pelicans.

AnimalMuppet an hour ago | parent | prev [-]

But you also need to include the number of prepositions. "A pelican on a bicycle" is not at all the same as "a pelican inside a bicycle".

There are estimated to be 100 or so prepositions in English. That gets you to 4 trillion combinations.
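Extending the same back-of-envelope calculation with the estimated preposition count (the figures are the thread's own rough assumptions):

```python
# noun × preposition × noun combinations, per the comment above
nouns = 200_000         # rough count of common English nouns (assumed)
prepositions = 100      # rough count of English prepositions (assumed)
combos = nouns * prepositions * nouns
print(combos)           # 4_000_000_000_000 -> 4 trillion
```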