IAmGraydon 13 hours ago

Things which are both powerful and possible become inevitable. We know that LLMs are powerful, but we aren't sure how powerful yet, and there's a wide range where that might eventually land. We know they're possible in their current form, of course, but we don't know whether actual GAI is possible.

At this time, humanity seems to be betting that both the power and the possibility will be off the charts. Why? Because getting this wrong could be so costly that it makes sense to move forward as if GAI will inevitably exist. Suppose you decide this will all turn out to be fluff and GAI will never work, so you stop investing in it. Now imagine what happens if you're wrong and your enemy gets it to work first.

This isn't some rhetorical device for AI inevitabilists. It's a fact of human nature, and it's a pattern that has repeated itself for millennia. If the author believes that's suddenly going to change, they really should back that up by explaining what, exactly, has changed in human nature.