JambalayaJimbo | 4 days ago
LLMs do not have brains, and as far as I know there is no evidence that they "think" the way human beings do.
gnulinux | 4 days ago | parent
LLMs do not reason at all (i.e., perform deductive reasoning in a formal system). Chain-of-thought and similar techniques simulate reasoning by smoothing the path to the target tokens, adding shorter stops along the way.
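To make the "shorter stops" idea concrete, here is a toy sketch of the difference between a direct prompt and a chain-of-thought prompt. These are just prompt strings (no model call; the question and wording are made up for illustration):

```python
# Direct prompt: the model must jump straight to the target tokens.
direct_prompt = "Q: What is 17 * 24?\nA:"

# Chain-of-thought prompt: intermediate steps act as "shorter stops"
# between the question and the final answer tokens.
cot_prompt = (
    "Q: What is 17 * 24?\n"
    "A: Let's think step by step.\n"
    "17 * 24 = 17 * 20 + 17 * 4\n"
    "17 * 20 = 340\n"
    "17 * 4 = 68\n"
    "340 + 68 = 408\n"
    "So the answer is 408."
)
```

Each intermediate line is an easier next-token target than the final answer itself, which is the sense in which the path is "smoothed."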
| ||||||||
ACCount36 | 3 days ago | parent
LLMs are only capable of performing a finite amount of computation within a single forward pass. We know that much. They are also known to operate on high-level abstractions and concepts, unlike systems operating strictly on formal logic, and very much like humans.
appreciatorBus | 3 days ago | parent
That being true does not mean there are no limits to whatever it is doing, and those limits might be wasted on ambiguous naming schemes. I am far from an AI booster or power user, but in my experience I get much better results with descriptive identifier names.
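As a concrete (hypothetical) illustration of the naming point, here are two behaviorally identical functions; the second states its intent directly in the identifiers, so neither a reader nor a model has to burn effort inferring it from context:

```python
# Ambiguous names: intent must be inferred from usage elsewhere.
def f(a, b):
    return [x for x in a if x not in b]

# Descriptive names: intent is stated in the identifiers themselves.
def remove_blocked_users(users, blocked_users):
    return [user for user in users if user not in blocked_users]
```

Both return the same result for the same inputs; only the naming differs.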