dwa3592 | 3 days ago
But it's a tricky question for LLMs: if something isn't in the training set, they can trip over it, which kinda shows the intelligence isn't generalized yet. I tried this with Gemini - (i am trying(something(re(a(l(ly)c)r)a)z)((y)he)re) - and it tripped.
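For what it's worth, the parentheses in that string do balance, with a maximum nesting depth of 6. A few lines of Python make the manual counting unnecessary (the function name here is just illustrative):

```python
def max_depth_and_balance(s):
    """Track parenthesis nesting with a simple counter.

    Returns (max_depth, balanced) for the string s.
    """
    depth = max_depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a close with no matching open
                return max_depth, False
    return max_depth, depth == 0  # balanced only if everything closed

example = "(i am trying(something(re(a(l(ly)c)r)a)z)((y)he)re)"
print(max_depth_and_balance(example))  # (6, True)
```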
orbital-decay | 3 days ago | parent
Intuitively this looks like an architectural artifact (like optical illusions in humans) or a natural property of learning, rather than a lack of generalization. I have issues with your example too and have to count slowly to make sure.