NegativeK 3 hours ago

I think you're interpreting the commenter's/article's point in a way that they didn't intend. At all.

Assume the LLM has the answer a student wants. Instead of just blurting it out to the student, the LLM can:

* Ask the student questions that encourage them to think about the overall topic.

* Ask the student what they think the right answer is, and then drill down on the student's incorrect assumptions so that they arrive at the right answer.

* Ask the student to come up with two opposing positions and explain why each would _and_ wouldn't work.

Etc.
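To be concrete, even a plain system prompt gets partway there. A rough sketch, assuming the OpenAI Python SDK; the prompt wording and model name are placeholders I made up, not anything from the article:

    # Sketch of a "Socratic tutor" system prompt, assuming the OpenAI Python SDK.
    # Model name and prompt text are placeholders.
    from openai import OpenAI

    TUTOR_PROMPT = (
        "You are a tutor. Do not state the final answer outright. "
        "First ask the student what they think the answer is, then ask "
        "questions that probe their assumptions, and have them argue both "
        "for and against their own position before confirming anything."
    )

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def tutor_reply(student_message: str) -> str:
        """Return the tutor's next question/response for one student turn."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": TUTOR_PROMPT},
                {"role": "user", "content": student_message},
            ],
        )
        return response.choices[0].message.content

    print(tutor_reply("Why does the sky look blue?"))

A real tutoring product would obviously need more than one prompt (tracking what the student has already been asked, deciding when to finally confirm the answer), but the point is that "don't blurt out the answer" is a behavior you can ask for.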

None of this has to get anywhere near politics or whatever else conjured your dystopia. If the student asked about politics in the first place, this type of pushback doesn't have to be any different than current LLM behavior.

In fact, I'd love this type of LLM -- I want to actually learn. Maybe I can instruct one to actually try.

ForceBru 2 hours ago

In fact, I agree with the article! For instance, many people do offload their thinking to LLMs, potentially "leading to the kind of cognitive decline or atrophy more commonly associated with aging brains". It also makes sense that students who use LLMs are not "learning to parse truth from fiction ... not learning to understand what makes a good argument ... not learning about different perspectives in the world".

Somehow "pushing back against preconceived notions" is synonymous to "correcting societal norms by means of government-approved LLMs" for me. This brings politics, dystopian worlds and so on. I don't want LLMs to "push back against preconceived notions" and otherwise tell me what to think. This is indeed just one sentence in the article, though.