bell-cot 7 days ago
My interpretation: When they say "will lead to human extinction", they are trying to vocalize their existential terror that an AGI would render them and their fellow rationalist cultists permanently irrelevant - by being obviously superior to them, by the only metric that really matters to them.
lostmsu 7 days ago
You sound like you wouldn't feel existential terror if, after typing "My interpretation: " into the text field, you saw the rest of your message suggested by Copilot exactly as you wrote it, letter for letter. And the same in every other conversation. How about people interrupting you in "real" life interaction because an AI predicted your whole tirade for them, and they read it faster than you could say it, along with an analysis of it? Dystopian sci-fi for sure, but many people dismissing LLMs as not AGI do so because LLMs are just "token predictors".