psychoslave 2 days ago
> You'll find that the resulting output becomes incoherent garbage.

I also do that kind of thing with LLMs. The other day (I don't remember the prompt, something casual really, not trying to trigger any issue) Mistral's Le Chat started to regurgitate "the the the the the...". And this morning I was trying some local models, to see if they could output some Esperanto. Well, that was really a mess of random morphs thrown together: not syntactically wrong, but out of touch with any possible meaningful sentence.
lotyrin 2 days ago
Yeah, some of the failure modes are the same. This one in particular is fun because even a human, given "the the the" and asked to predict what comes next, will probably still answer "the". How a Markov chain gets started on the "the the the" train and how an LLM does are pretty different, though; a toy sketch below shows the Markov side.
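For concreteness, here's a minimal bigram Markov chain in Python. The corpus is invented for illustration: it's rigged so that "the" is its own most likely successor, which is exactly the trap. Once the chain's one-word state lands on "the", it keeps rolling "the":

    import random
    from collections import defaultdict

    # Toy corpus, invented for illustration: "the" frequently follows
    # "the", so the bigram table makes "the" its own likeliest successor.
    corpus = "the the the cat sat on the the the mat near the the door".split()

    # Bigram table: each word maps to the list of words seen right after it.
    table = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        table[prev].append(nxt)

    random.seed(0)
    word = "the"
    out = [word]
    for _ in range(12):
        if not table[word]:                    # dead end: no observed successor
            break
        word = random.choice(table[word])      # sample in proportion to bigram counts
        out.append(word)

    print(" ".join(out))  # mostly degenerates into "the the the ..."

The Markov chain gets stuck because its entire state is the last word. An LLM conditions on the whole context window, so when it collapses into repetition it's presumably for different reasons (degenerate decoding, attention pathologies), not a one-word state.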