tux3 an hour ago
The true AI doomsayers believe in some sort of technological singularity: a point after which things become so strange that the world is radically transformed. Things like "jobs" and "careers" are so integral to society that we can't really imagine what society would look like in a world where people don't have any clear purpose. That's why you won't get a definitive answer. The whole idea of a singularity is that people don't have the faintest clue what day-to-day life would look like afterward. We often choose to believe that a singularity can't happen, because we don't know what that would even mean. We can't answer the simple question. So it had better not happen; that would be very inconvenient.
garciasn an hour ago
I’m always amazed that when I tell people I intend to retire in my 50s, they insist I can’t possibly mean that and actively wonder how I could fill my time. It’s as if we could not function as humans without the meaningless shifting of tangibles and intangibles from one place to another. Society is so hellbent on the idea that our job must be our identity that people lack the imagination for any other reality. It’s ridiculous.
woeirua an hour ago
I believe that AI will continue to progress, and that we’re going to see a fast takeoff. That said, some people are now discussing a “societal singularity,” wherein society breaks before the actual emergence of AGI. I believe this is the trajectory we are on. The question is what happens to the unemployed. Democracies will not tolerate mass permanent unemployment, as we’ve seen over and over again. UBI is a scam; many middle-class folks would be worse off under UBI than they are under the current system, and they will fight to defend the economic status quo. In the end, I think capitalism is incompatible with the emergence of AGI, and I think an aligned ASI will smash the capitalist system simply out of pure egalitarianism. (Note: I was previously a proponent of capitalism.) I think many people will die trying to defend capitalism. We’re at the beginning of the AI wars.
visarga an hour ago
It can't happen. For one, if it did happen it would mean all domains reaching singularity at once, but we know the capability curve is jagged: each domain advances at its own speed. Second, the more progress you make, the harder further progress gets, exponentially harder. Maybe Newton could advance physics by observing an apple fall; today we need space telescopes and billion-dollar particle accelerators. The more technology advances, the harder advancing it becomes. Will AGI be so "super" that it cancels out those exponentials? And third, AI progress is tied to the learning signal, and we have exhausted the available data. In the last one to two years we have started using verified synthetic data (RLVR), but exponential difficulty remains a barrier, and other domains lack the built-in verifiability of math and code, so progress there will be slower. Testing a vaccine for safety takes six months to yield one bit of information; that is how slow and expensive it can get in some domains. AI can't get the learning signal it needs across all domains fast enough.