Jun8 | a day ago
ACT post where Scott Alexander provides some additional info: https://www.astralcodexten.com/p/introducing-ai-2027. Manifold currently predicts 30%: https://manifold.markets/IsaacKing/ai-2027-reports-predictio...
Aurornis | a day ago
> ACT post where Scott Alexander provides some additional info: https://www.astralcodexten.com/p/introducing-ai-2027

The pattern where Scott Alexander puts forth a huge claim and then immediately hedges it backward is becoming a tiresome theme. It's the linguistic equivalent of putting claims into a superposition: the author both owns the claim and distances himself from it at the same time, leaving the writing just ambiguous enough that nobody reading it 5 years from now could pin down any claim as false, because it was hedged in both directions. Schrödinger's prediction.

> Do we really think things will move this fast? Sort of no

> So maybe think of this as a vision of what an 80th percentile fast scenario looks like - not our precise median, but also not something we feel safe ruling out.

The talk of "not our precise median" and "not something we feel safe ruling out" is an elaborate way of hedging that this isn't their actual prediction, but hey, anything can happen, so here's a wild story! When the claims don't come true, they can just point back to those hedges and say it wasn't really their median prediction (which is conveniently never stated).

My prediction: the vague claims about AI becoming more powerful and useful will come true because, well, they're vague. Technology isn't about to reverse course and get worse. The actual bold claims, like humanity colonizing space in the late 2020s with the help of AI, are where you start to realize how fanciful their predictions are. It's as if they put a couple of points of recent AI progress on a curve, assumed the exponential trajectory would continue forever, and extrapolated from that regression until AI was helping us colonize space in less than 5 years.

> Manifold currently predicts 30%:

Read the fine print. It only requires 30% of judges to vote YES for the market to resolve YES. This is one of those bets where it's more about gaming the market than being right.
leonidasv | 21 hours ago
> Do we really think things will move this fast? Sort of no - between the beginning of the project last summer and the present, Daniel's median for the intelligence explosion shifted from 2027 to 2028. We keep the scenario centered around 2027 because it's still his modal prediction (and because it would be annoying to change). Other members of the team (including me) have medians later in the 2020s or early 2030s, and also think automation will progress more slowly. So maybe think of this as a vision of what an 80th percentile fast scenario looks like - not our precise median, but also not something we feel safe ruling out.

An important disclaimer that's missing from OP's link.
whiddershins | 11 hours ago
> A rise in AI-generated propaganda failed to materialize.

Hah!
crazystar | a day ago
47% now, so it's a coin toss.