jvanderbot 5 days ago
What happens is a kind of feeling of developing a meta skill. It's tempting to believe the scope of what you can solve has expanded when you self-assess as "good" with AI. It's the same with any "general" tech. I've seen it since genetic algorithms were all the rage. Everyone reaches for the most general tool, then assumes everything that tool might be used for is now a problem or domain they are an expert in, with zero context in that domain. AI is this, times 100, plus one layer more meta, since you can optimize over approaches with zero context.
CuriouslyC 4 days ago | parent
That's an oversimplification. AI can genuinely expand the scope of things you can do. How it does this is a bit particular, though, and bears paying attention to. Normally, if you want to achieve some goal, there is a whole pile of tasks you need to be able to complete to achieve it. If you lack the ability to complete any one of those tasks, you will be unable to reach the goal, even if you can easily accomplish all the other tasks involved. AI raises your capability floor. It isn't very effective at letting you accomplish things that are meaningfully outside your capability or comprehension, but if there are straightforward knowledge or process blockers that don't involve deeper intuition, it smooths those right out.