gonzobonzo 4 days ago

> I’ve found ChatGPT and other LLMs can struggle to evaluate evidence - to understand the biases behind sources - i.e. taking data from a sketchy think tank as gospel.

This is what I keep finding: it mostly repeats surface-level "common knowledge." It usually takes a few back-and-forths to get to whether or not something is actually true - asking for the numbers, asking for the sources, asking for the excerpt from the sources where they actually provide that information, verifying that it's not hallucinating, etc. A lot of the time, it turns out its initial response was completely wrong.

I imagine most people just take the initial (often wrong) response at face value, though, especially since it tends to repeat what most people already believe.

athrowaway3z 4 days ago | parent [-]

> It usually takes a few back-and-forths to get to whether or not something is actually true

This cuts both ways. I have yet to find an opinion or fact I could not make ChatGPT agree with as if it were objectively true. Knowing how to trigger (im)partial thought is a skill in and of itself, and something we need to be teaching in school ASAP. (Which some already are, in one way or another.)

gonzobonzo 4 days ago | parent | next [-]

I'm not sure teaching it in school is actually going to help. Most people will tell you that of course you need to check primary sources to verify claims - and then turn around and believe the first thing they hear from an LLM, a Redditor, a Wikipedia article, etc. Even worse, many people get openly hostile to the idea that claims should be verified at all: "what, you don't believe me?"/"everyone here has been telling you this is true, do you have any evidence it isn't?"/"oh, so you think you know better?"

There was a discussion about Wikipedia here recently where a lot of people who are active on the site argued against taking its claims with a grain of salt and verifying their accuracy for yourself.

We can teach these things until the cows come home, but it's not going to make a difference if people say it's a good idea and then immediately do the opposite.

Kim_Bruning 4 days ago | parent [-]

There were actual Wikipedians arguing not to take a wiki with a grain of salt? If I was in that discussion, I must have missed those posts. Can you link an example?

If you mean that Wikipedia is unreliable? That's a different story: everything is unreliable. Wikipedia just happens to be potentially less unreliable than many other sources (typically) (if used correctly) (#include caveats.h).

Sources are like power tools. Use them with respect and caution.

eru 4 days ago | parent | prev [-]

> Knowing how to trigger (im)partial thought is a skill in and of itself, and something we need to be teaching in school ASAP.

You are very optimistic.

Look at all the other skills we try to teach in school. 'Critical thinking' has been at the top of nearly every curriculum you can point a finger at for quite a while now, to minimal effect.

Or just look at how much math we try to teach kids, and how little they actually retain.

athrowaway3z 4 days ago | parent [-]

Perhaps a bit optimistic, but here the situation, the cause, and the effect can all be demonstrated in real time.

Critical thinking is a much more general skill, applicable anywhere, and thus quicker to be 'buried' under other learned behavior.

This skill has an obvious trigger: you're using AI, which means you should be aware of it.