skate 4 days ago

As others pointed out, this problem isn't special.

Grok 4 Heavy (thought for 4m 17s):

{"decoded_prefix": "nqxznhzhvqvvjddqiterrqdboctzzmoxmhyzlcfe", "last_10": "kfohgkrkoj", "vowel_counts": {"a": 7, "e": 18, "i": 7, "o": 12, "u": 6}}

It did count one extra 'e' (18 instead of 17), but that's a known point of failure for LLMs, which I assume you put in intentionally.

>Counting e's shows at least 10 more, so total e's are <at least> 17.
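Since the failure is just an off-by-one letter count, it's easy to check deterministically. A minimal Python sketch, assuming the full decoded string were available (only its 40-character prefix appears in the JSON above, so `decoded` here is a placeholder):

    # Verify the model's vowel counts programmatically.
    from collections import Counter

    decoded_prefix = "nqxznhzhvqvvjddqiterrqdboctzzmoxmhyzlcfe"  # from the JSON output
    decoded = decoded_prefix  # hypothetical: substitute the full decoded string here

    counts = Counter(c for c in decoded.lower() if c in "aeiou")
    vowel_counts = {v: counts.get(v, 0) for v in "aeiou"}
    print(vowel_counts)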

ripped_britches 3 days ago | parent [-]

I guess GPT-5 with thinking is still a bit ahead of Grok. I wonder what the secret sauce is.
