fauigerzigerk 2 days ago

I used to donate to Wikipedia, but it has been completely overrun by activists pushing their preferred narrative. I don't trust it any more.

I guess it had to happen at some point. If a site is used as ground truth by everyone while being open to contributions, it has to become a magnet and a battleground for groups trying to influence other people.

LLMs don't fix that of course. But at least they are not as much a single point of failure as a specific site can be.

notarobot123 2 days ago | parent | next [-]

> at least they are not as much a single point of failure

Yes, network effects and hyperscale produce perverse incentives. It sucks that Wikipedia can be gamed. That said, you'd need to be actively colluding with other contributors to maintain control.

Imagining that AI is somehow more neutral or resistant to influence is incredibly naive. Isn't it obvious that they can be "aligned" to favor the interests of whoever trains them?

fauigerzigerk 2 days ago | parent [-]

>Imagining that AI is somehow more neutral or resistant to influence is incredibly naive

The point is well taken. I just feel that at this point in time the reliance on Wikipedia as a source of objective truth is disproportionate and increasingly undeserved.

As I said, I don't think AI is a panacea at all. But the way in which LLMs can be influenced is different. It's more like bias in Google search. But I'm not naive enough to believe that this couldn't turn into a huge problem eventually.

ramon156 2 days ago | parent | prev | next [-]

Can I ask for some examples? I'm not that active on Wikipedia, so I'm curious where a narrative is being spread.

kristjank 2 days ago | parent | next [-]

The Franklin Community Credit Union scandal is a good example, well outlined in this YouTuber's (admittedly dramatized) video: https://www.youtube.com/watch?v=F0yIGG-taFI

n4r9 2 days ago | parent [-]

Is their argument documented anywhere in text, rather than an 8-minute video?

hbogert 14 hours ago | parent [-]

ask chatgpt to summarize

notarobot123 2 days ago | parent | prev | next [-]

Here are some examples from which you can extrapolate the more serious cases: https://en.wikipedia.org/wiki/Wikipedia:Lamest_edit_wars

fauigerzigerk 2 days ago | parent | prev [-]

I thought about giving examples because I understand why people would ask for them, but I decided very deliberately not to give any. It would inevitably turn into a flame war about the politics/ethics of the specific examples and distract from the reasons why I no longer trust Wikipedia.

I understand that this is unsatisfactory, but the only way to "prove" that the motivations of the people contributing to Wikipedia have shifted would be to run a systematic study for which I have neither the time nor the skills nor indeed the motivation.

Perhaps I should say that I am a politically centrist person whose main interests are outside of politics.

junek 2 days ago | parent [-]

Let me guess: you hold some crank views that aren't shared by the people who maintain Wikipedia, and you find that upsetting? That's not a conspiracy, it's just people not agreeing with you.

fauigerzigerk 2 days ago | parent [-]

Your guess is incorrect. I'm keeping well away from polarised politics as well as anti-scientific and anti-intellectual fringe views.

Panzer04 2 days ago | parent | prev [-]

Single point of failure?

Yeah, you can download the entirety of Wikipedia if you want to. What's the single point of failure?

fauigerzigerk 2 days ago | parent [-]

Not in a technical sense. What I mean is that Wikipedia is very widely used as an authoritative source of objective truth. Manipulating this single source regarding some subject would have an outsize influence on what is considered to be true.