jkingsman 2 days ago

Mmm, I see this cutting both ways -- generally, I'd agree; safety-critical things should not be left to an AI. However, cooking temperatures are information with a factual ground truth (or at least an agreed-upon one), they have VERY broad distribution on the internet, and they're generally a single, short "kernel" of information that has become subject to slop-ification -- the "here's an article when you were looking for about 30 characters of information" pattern that is prolific on the web.

So, I'd agree -- safety info from an LLM is bad. But generally, the /flavor/ (heh) of information that such data comprises is REALLY good to get from LLMs (as opposed to nuanced opinions or subjective feedback).

Velorivox 2 days ago

I don’t know. I searched Google for how many chapters a popular manga has, and it gave me the wrong answer (off by an order of magnitude). I only found out later, and it really pissed me off because I made a trek to buy something that never existed. I should’ve known better.

I don’t think this is substantively different from cooking temperature, so I’m not trusting that either.

jkingsman 13 hours ago

Eh, I think it is. Arcane things -- sure, those might be a bit of a stretch. My general rule of thumb is that if I'd expect ~10% of people to know the information off the top of their head, I can likely trust what an LLM tells me.

Velorivox 11 hours ago

Really?

[0] https://i.imgur.com/ly5yk9h.png