gehsty 2 hours ago:
LLMs are word prediction engines. They clearly are not conscious; they are just guessing what words should come next.
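(A toy sketch of what "guessing what words should come next" means mechanically: a bigram counter that predicts the most frequent follower seen in training. This is orders of magnitude simpler than a real LLM, which predicts over learned contextual representations rather than raw counts, but it shows the same next-token framing. The corpus and function names here are made up for illustration.)

```python
from collections import Counter, defaultdict

# Tiny "training corpus" -- purely illustrative.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    # "Guess" the next word: the most frequent follower seen in training.
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice; "mat", "fish" once)
```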
thebruce87m 21 minutes ago:
> They clearly are not conscious

Consciousness is emergent. A human is not conscious by our definition until the moment they are. How will we be able to identify the singularity when it comes? I feel like this is what the article is really addressing.

> LLMs are word prediction engines

Humans can do this too, so what are the missing parts for consciousness? Close a few loops in the learning pipeline and we might be there.
charlie90 41 minutes ago:
The human brain is an electrical signal prediction machine. Anything that looks like intelligence will look like a prediction machine, because the alternative is logic being hardcoded a priori.
zwischenzug an hour ago:
How do we know that this isn't essentially how our minds work?