codingdave a day ago

"How much wood would a woodchuck chuck if a woodchuck could chuck wood?"

So far, all the ones I have tried actually try to answer the question. 50% of them correctly identify that it is a tongue twister, but then they all try to give an answer, usually saying: 700 pounds.

Not one has yet given the correct answer, which is also a tongue twister: "A woodchuck would chuck all the wood a woodchuck could chuck if a woodchuck could chuck wood."

regnull a day ago | parent | next [-]

ChatGPT really impressed me on this one:

Sing, O Muse, of the Chucking of Wood

Sing, O Muse, of the stout-hearted woodchuck, Dweller of hills, whom men call groundhog, Whose paws, though small, could hurl mighty timbers, If fate had gifted him such labor divine.

Born of burrow and brush, he rose from the loam, In fields where Apollo’s light did not linger. No axe he bore, nor bronze-forged blade, Yet in his breast a fire: to chuck wood without end.

Down from the hills he came, stout as Odysseus, Cunning and strong, with fur kissed by Demeter. The trees did tremble at his gaze, And even old oak whispered, “Beware!”

“Would he chuck?” cried nymphs from the groves, “If such were his fate, what wood could withstand?” Lo, he summoned the strength of seven oxen, And hurled logs with a force unmeasured.

Seven hundred pounds of timber he cast, Each arc a hymn to muscle and will. Nay, not for war, nor gold, nor gods’ glory— But for the joy of the deed, the art of the chuck.

Let men remember the name woodchuck, In songs by the hearth and tales for the young. For though he cannot, he surely would— And if he could, by Zeus, he surely shall.

esafak a day ago | parent | next [-]

A stupendously good answer. What prompt and version of chatGPT?

regnull a day ago | parent | next [-]

4o. The prompt was literally "How much wood would a woodchuck chuck if a woodchuck could chuck wood?". It asked me if I wanted a poetic answer, and I requested Homer.

cess11 a day ago | parent | prev [-]

I find it disturbing, as if Homer or Virgil had suffered a stroke or some neurodegenerative disease and were now producing rubbish during rehabilitation.

loloquwowndueo a day ago | parent [-]

Maybe they would write like that if they existed today. Like the old “if Mozart was born in the 21st century he’d be doing trash metal”

cess11 21 hours ago | parent [-]

Thrash, not "trash". Our world does not appreciate the art of Homer and Virgil except as nostalgia passed down through the ages or as a specialty of certain nerds, so if they existed today they'd be unknown.

There might be societies that are exceptions, the way Soviet and post-Soviet Russians kept reading and referring to books even after they got access to television and radio, but I'm not aware of any.

Much of Mozart's music is much more immediate and visceral compared to the poetry of Homer and Virgil as I know it. And he was distinctly modern, a freemason even. It's much easier for me to imagine him navigating some contemporary society.

Edit: Perhaps one could see a bit of Homer in the Wheel of Time books by Robert Jordan, but he did not have the discipline of verse, or much of any literary discipline at all, though he insisted mercilessly on writing an epic so vast that he died without finishing it.

ijidak a day ago | parent | prev [-]

That is actually an amazing answer. Better than anything I think I would get from a human. Lol.

Certified a day ago | parent | prev | next [-]

GPT-4.5 seems to get it right, but then repeats the 700-pound figure:

"A woodchuck would chuck as much wood as a woodchuck could chuck if a woodchuck could chuck wood.

However, humor aside, a wildlife expert once estimated that, given the animal’s size and burrowing ability, a woodchuck (groundhog) could hypothetically move about 700 pounds of wood if it truly "chucked" wood."

https://chatgpt.com/share/680a75c6-cec8-8012-a573-798d2d8f6b...

shaftway a day ago | parent | next [-]

I've heard the answer is "he could cut a cord of conifer but it costs a quarter per quart he cuts".

CamperBob2 19 hours ago | parent | prev [-]

That answer is exactly right, and those who say the 700-pound figure is a hallucination are themselves wrong: https://chatgpt.com/share/680aa077-f500-800b-91b4-93dede7337...

wolfgang42 17 hours ago | parent [-]

Linking to ChatGPT as a “source” is unhelpful, since it could well have made that up too. However, with a bit of digging, I have confirmed that the information it copied from Wikipedia here is correct, though the AP and Spokane Times citations are both derivative sources; Mr. Thomas’s comments were first published in the Rochester Democrat and Chronicle, on July 11, 1988: https://democratandchronicle.newspapers.com/search/results/?...

CamperBob2 16 hours ago | parent [-]

Linking to ChatGPT as a “source” is unhelpful, since it could well have made that up too

No, it absolutely is helpful, because it links to its source. It takes a grand total of one additional click to check its answer.

Anyone who still complains about that is impossible to satisfy, and should thus be ignored.

once_inc 8 hours ago | parent | prev | next [-]

I loved this dialogue in Monkey Island 2, where this is basically the first NPC you talk to, and the dialogue options get wordier and wordier to the point of overflowing all the screen real estate. It perfectly sets the stage for the remainder of the game.

mdp2021 21 hours ago | parent | prev | next [-]

It seems you are going in the opposite direction: you seem to be asking for an automatic response, a social password, etc.

That formula is a question, and when it is asked, an intelligence simulator should understand what is expected of it and, in general, by default, try to answer it. That involves estimating the strength of a woodchuck, etc.

mwest217 a day ago | parent | prev | next [-]

Gemini 2.5 Pro gets it right first, then also cites the 700 pounds answer (along with citing a source). https://g.co/gemini/share/c695a0163538

ishandotpage 12 hours ago | parent | prev | next [-]

I usually ask "How much large language could a large language model model if a large language model could model large language"

Not one has given me the correct answer yet.

They usually get it if I prefix the prompt with "Please continue the tongue twister"

segmondy a day ago | parent | prev | next [-]

my local model answered - "A woodchuck would chuck as much wood as a woodchuck could chuck if a woodchuck could chuck wood."

jacobsenscott 20 hours ago | parent | prev | next [-]

"He would chuck, he would, as much as he could, if a wood chuck could chuck wood" is how I learned it.

mcshicks a day ago | parent | prev | next [-]

That's so funny. I had to check that something was working with an LLM API last night, and that's exactly what I asked it, just in jest.
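For anyone curious, a smoke test like that can be as small as building and eyeballing the request body for an OpenAI-compatible chat endpoint. This is a hypothetical sketch: the model name is a placeholder and nothing is actually sent over the network here.

```python
import json

def build_chat_payload(prompt, model="gpt-4o"):
    # Request body in the shape an OpenAI-compatible
    # chat-completions endpoint expects: a model name plus
    # a list of role/content messages.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_payload(
    "How much wood would a woodchuck chuck "
    "if a woodchuck could chuck wood?"
)
print(json.dumps(payload, indent=2))
```

If the endpoint echoes back any completion at all for a payload like this, the plumbing works; the tongue twister is just a memorable test prompt.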

unavoidable a day ago | parent | prev | next [-]

On the other hand, now that you've written this out precisely, it will get fed into the next release of whatever LLM. Like reverse AI slop?

moffkalast a day ago | parent | prev [-]

Now I'm wondering if it makes any difference if this was asked through the audio encoder on a multimodal model. A tongue twister means nothing to a text-only model.