CuriouslyC 4 days ago
That's an oversimplification. AI can genuinely expand the scope of what you can do, though how it does this is particular and worth paying attention to. Normally, if you want to achieve some goal, there is a whole pile of tasks you need to be able to complete along the way. If you lack the ability to complete even one of those tasks, you will be unable to reach the goal, even if you can easily accomplish all the others. AI raises your capability floor. It isn't very effective at letting you accomplish things meaningfully outside your capability or comprehension, but if there are straightforward knowledge or process blockers that don't require deeper intuition, it smooths those right out.
jvanderbot 4 days ago | parent | next
Normally, one would learn the missing steps, with or without AI. You're probably envisioning a more responsible use of it (floor raising, "meaningfully inside your comprehension"), but that is not what I'm referring to at all ("assumes everything that tool might be used for is now a problem or domain they are an expert in"). A meta-tech can be used in many ways, and yours is close to what I believe the right method is. But I'm asserting that the danger is massive overreliance on, and overconfidence in, the "transferability".
monkeyelite 4 days ago | parent | prev
> If you don't have the ability to complete any one of those tasks, you will be unable to complete the goal

Nothing has changed. Few projects start with you knowing all the answers. In the same way AI can help you learn, you can learn from books, colleagues, and trial and error for tasks you don't yet know how to do.