| ▲ | infecto 5 days ago |
| Everyone is different. I don't have a good grasp on the distribution of HN readers these days, but I know that as a heavy user of LLMs, I am not sold on this for myself. I am asking more questions than ever. I use it for proofreading and editing. But I can see the risk as a software engineer. I really appreciate tools like Cursor: I give it bite-size chunks and review. With tools like Claude Code, though, it becomes a black box and I no longer feel at the helm of the ship. I could see that if you outsourced all thinking to an LLM there could be consequences. That said, I am not sold on the paper and suspect it's mostly hyperbole. |
|
| ▲ | Taek 5 days ago | parent | next [-] |
| Cognitive decline is a broad term, and a research paper could claim "decline" if even a single cognitive metric loses strength. When writing was invented, societies started depending less on long-form memorization, which is a cognitive "decline". When calculators were invented, societies started depending less on mental math, which is a cognitive "decline". I'm sure LLMs are doing the same thing. People aren't getting dumber; they are just outsourcing tasks more, so that their brains spend more time on the tasks that can't be outsourced. |
| |
| ▲ | yuehhangalt 5 days ago | parent | next [-] |
| My concern is more about the tasks that can't or won't be outsourced. People who maintain a high level of curiosity or have a drive to create things will most assuredly benefit from using AI to outsource work that doesn't support those drives. It has the potential to free up more time for creative endeavors or those that require deeper thinking. Few would argue the benefit there. |
|
| Unfortunately, anti-intellectualism is rampant, media literacy is in decline, and a lot of people are content to consume content and not think unless they absolutely have to. Dopamine is a helluva drug. If LLMs reduce the cognitive effort at work, and people go home to doomscroll on social media or veg out in front of their streaming media of choice, it seems we're heading down the path of creating a society of mindless automatons. Idiocracy is cited so often today that I hate to do so myself, but it seems increasingly prescient. |
|
| Edit: I also don't think that AI will enable greater work-life harmony. The pandemic showed that a large number of jobs could effectively be done remotely. However, after the pandemic, there was a significant "Return to Office" movement that almost seemed like retribution for believing we could achieve a better balance. Corporations won't pass the time savings on to their employees and enable things like 4-day work weeks. They'll simply expect more productivity from the employees they have. |
|
| ▲ | IAmBroom 5 days ago | parent | prev | next [-] |
| Absolutely true. Also, domesticated dogs show indications of lower intelligence and memory than wolves. They don't have to plan complex strategies to find and kill food anymore. |
| |
| ▲ | Taek 5 days ago | parent [-] |
| The difference between us and dogs is that we DO still need to earn a salary. Dogs live in the lap of luxury where their needs are guaranteed to be handled. But humans need jobs, and jobs need to capture value from society. So we do actually still have to stay sharp, whatever form "sharp" takes. |
| |
| ▲ | pessimizer 5 days ago | parent [-] |
| You and dogs have the same job, which is to please the boss. The boss then takes care of you like a child, either with a paycheck (with which you can pay servants to supply your earthly needs), or directly if you're a dog and lack both thumbs and pockets to hold a wallet or a phone. A domestic dog left alone in a forest would die about two or three weeks after you would. |
|
| If you're an entrepreneur, your job is to please the customer and to squeeze your vendors and employees. You still take little to no part in directly taking care of yourself, except as a hobby. Unless you want to be congratulated for wiping your own ass or lifting a fork to your mouth. |
|
| |
| ▲ | infecto 5 days ago | parent | prev [-] |
| This is super interesting and I had not thought about it like that! |
|
|
| ▲ | ceejayoz 5 days ago | parent | prev | next [-] |
| > I am asking more questions than ever. |
|
| Wouldn't that be the expected result here? Less knowledge, more questions? |
| |
| ▲ | infecto 5 days ago | parent | next [-] |
| That's one interpretation, but I think there's a distinction between "asking more questions because I've forgotten things" and "asking more questions because I'm exploring further." When I use LLMs, it's less about patching holes in my memory and more about taking an idea a few steps further than I otherwise might. For me it's expanding the surface area of inquiry, not shrinking it. If the study's thesis were true in my case, I'd expect to be less curious, not more. That said, I keep a healthy dose of skepticism about all output, but in the general case I can at least explore my thoughts further than I might have in the past. |
|
| ▲ | rwnspace 5 days ago | parent | prev [-] |
| In my personal experience, new knowledge tends to beget questions. |
|
|
| ▲ | xnorswap 5 days ago | parent | prev [-] |
| > I am asking more questions than ever. |
|
| I don't have a dog in this fight, but "asking more questions" could be evidence of cognitive decline if you're having to ask more questions than ever! It's easy to twist evidence to fit biases, which is why I'd withhold judgment until better evidence comes through. |
| |
| ▲ | IAmBroom 5 days ago | parent | next [-] |
| Well, that's certainly a take. But if I'm teaching a class and one student keeps asking questions that they feel the material raised, I don't tend to think "brain damage". I think "engaged and interested student". |
|
| ▲ | charlie-83 5 days ago | parent | prev | next [-] |
| Not OP, but there's a difference between needing to ask more questions and asking more questions because it's easier now. Personally, I find myself often asking AI about things I wouldn't have bothered to find out about before. For example, I've always noticed these funny little grates on the outside of houses near me and wondered what they are. Googling "little grates outside houses" doesn't help at all. Give AI a vague-ish description and it instantly tells you they are old boot scrapers. |
| |
| ▲ | infecto 5 days ago | parent [-] |
| Haha, you nailed it. Walking around and experiencing the world, I can now ask a vague question and usually find an answer. Maybe there's a movie or a song in the back of my head that typical search engine queries would never find. I can give super vague references to an LLM and, with search enabled, get an answer that's correct often enough. |
| |
| ▲ | danenania 5 days ago | parent [-] |
| The ability to keep following the thread and interrogating the answers is also very valuable. You never have to accept an answer you only half understand. |
|
| |
| ▲ | infecto 5 days ago | parent | prev [-] |
| Fair point, though I think there's a difference between "questions out of confusion" and "questions out of curiosity." If I'm constantly asking "what does this mean again?" that would signal decline. But if I'm asking "what if I combine this with X?" or "what are the tradeoffs of Y?" that feels like the opposite: more engagement, not less. That's why I'm skeptical of blanket claims from one study; the lived experience doesn't map so cleanly. |
|