K0balt 6 days ago
It gave you the tallest mountain every time. You kept asking it for various numbers of “tallest mountains,” and each time it complied. You asked it to enumerate several mountains by height, and it also complied. It just didn’t understand that when you said “the 6 tallest mountains” you didn’t mean the tallest mountain, six times. When you used clearer phrasing, it worked fine. It’s 270M parameters. It’s basically a puppy: puppies can be trained to do cool tricks, bring your shoes, stuff like that.
littlestymaar 6 days ago | parent
> asking it for various numbers of “tallest mountains” and each time it complied

That’s not what “second tallest” means though, so this is a language model that doesn’t understand natural language…

> You kept asking

Gemma 270M isn’t the only one with reading issues, as I’m not the person who conducted this experiment…

> You asked it to enumerate several mountains by height, and it also complied.

It didn’t, it hallucinated a list of mountains (this isn’t surprising though, as this is the kind of encyclopedic knowledge such a small model isn’t supposed to be good at).