TheOtherHobbes | 5 days ago
IQ is mostly a measure of processing speed and memory, with some educational bias that's hard to filter out. You don't get useful intelligence unless the software is also fit for purpose. Slow hardware running good software can still outperform fast hardware running broken software.

Social status depends on factors like good looks, charm, connections, and general chutzpah, often with more or less overt hints of narcissism. That's a set of skills orthogonal to being able to do tensor calculus.

As for an impending AI singularity - no one has the first clue what the limits are. We like to believe in gods, and we love stories about god-like superpowers. But there are all kinds of issues that could prevent a true singularity - from stability constraints on a hypercomplex recursive system, to resource constraints, to physical limits we haven't encountered yet. Even if none of those are a problem, for all we know an ASI may decide we're an irrelevance and just... disappear.
logicchains | 5 days ago | parent
> As for an impending AI singularity - no one has the first clue what the limits are.

That's simply untrue. Theoretical computer scientists understand lower bounds for many classes of problems, and for many problems it's mathematically impossible to improve performance significantly with only a linear increase in computing power, regardless of the algorithm/brain/intelligence involved. Many problems wouldn't benefit much even from a superlinear increase in computing power, because of the nature of exponential growth.

For a chaotic system in the mathematical sense, where prediction grows exponentially harder with the time horizon, exactly predicting even one minute ahead could require more compute than you could get by turning the entire known universe into a computer.
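To make the chaos point concrete, here's a minimal sketch (my own illustration, not the commenter's example) using the logistic map x_{n+1} = r * x_n * (1 - x_n), a standard textbook chaotic system. At r = 4 its Lyapunov exponent is ln 2, so a perturbation of the initial condition roughly doubles every iteration; the values 0.2, 1e-12, and 60 steps below are arbitrary choices for the demo:

    # Sensitive dependence in the logistic map x_{n+1} = r * x_n * (1 - x_n).
    # At r = 4.0 the map is chaotic (Lyapunov exponent ln 2), so an initial
    # error eps grows roughly like eps * 2^n: predicting n steps further
    # ahead demands exponentially more precision/compute, not linearly more.

    def logistic_trajectory(x0: float, r: float, steps: int) -> list[float]:
        """Iterate the logistic map from x0 for the given number of steps."""
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    r = 4.0
    eps = 1e-12  # tiny error in the measured initial condition
    exact = logistic_trajectory(0.2, r, 60)
    noisy = logistic_trajectory(0.2 + eps, r, 60)
    for n in range(0, 61, 10):
        print(f"step {n:2d}: |difference| = {abs(exact[n] - noisy[n]):.3e}")
    # The gap roughly doubles per step until it saturates at O(1): a 1e-12
    # error swamps the prediction after ~40 iterations. Halving eps
    # (doubling precision) buys only one more accurate step.

That last comment is the whole argument in miniature: when each extra step of prediction costs a constant multiple of precision, even a universe-sized computer only pushes the horizon forward by an additive constant.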