praveeninpublic a day ago

I can no longer do math quickly in my head; calculators killed my fast mental arithmetic. But if I have to calculate on my own, it's not fundamentally impossible. I can still do it, because it's just computation.

But LLMs help us think, which is much more than just computing. That's a deeper dependency.

aaronbaugher a day ago | parent | next [-]

I suppose that depends on what you do with them. I spent some time this weekend using Grok to work on a business plan and some other projects. I find myself using it for research, quickly winnowing information down to what's relevant to my needs, and bouncing ideas off it the way I would with another person. I always have to keep in mind that it could get something wrong, but then again, so could a person.

I don't think it's helping me think; it's more that it's helping me organize my thoughts and find inspiration. I suppose others might use it in a more dependent way.

praveeninpublic a day ago | parent [-]

Fair point. As a software engineer using Cursor, I’ve noticed it writes most of the code now. It’s easy to accept without review, which builds dependency. My role feels less like just coding and more like PM, tester, and reviewer combined.

We’ve started trusting AI the way we trust calculators: assuming it's right without checking. But LLMs can be confidently wrong, and once that habit sets in, even the “ChatGPT can make mistakes” warning fades into the background.

temp0826 a day ago | parent | prev [-]

My calculator doesn't hallucinate (said another way: barring input error, I can blindly trust it, which is something that would get me in trouble with an LLM).