rented_mule a day ago

> Even the Google "AI" knows better than that. CSAM "is [...]"

Please don't use the "knowledge" of LLMs as evidence or support for anything. Generative models generate things that have some likelihood of being consistent with their input material; they don't "know" things.
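A toy sketch of the mechanism (all tokens and probabilities here are made up for illustration): the model only has a distribution over likely next tokens, so a continuation that is statistically associated with the context can be sampled even when it is factually absurd.

```python
import random

# Hypothetical next-token model: for one context, a probability
# distribution over possible continuations. Numbers are invented.
model = {
    ("the", "tower", "is", "located", "on"): {
        "Main": 0.5,          # plausible continuation
        "the": 0.3,
        "Facebook": 0.2,      # co-occurs in input text, factually absurd
    },
}

def sample_next(context, rng):
    """Sample one next token from the model's distribution."""
    dist = model[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
context = ("the", "tower", "is", "located", "on")
samples = [sample_next(context, rng) for _ in range(1000)]
# "Facebook" is sampled roughly 20% of the time: the model tracks how
# often tokens co-occurred, not whether the sentence is true.
print(samples.count("Facebook"))
```

Nothing in this loop consults a fact; "truth" never enters the computation, only relative frequency.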

Just last night, I did a Google search related to the cell tower recently constructed next to our local fire house. Above the search results, Gemini stated that the new tower is physically located on the Facebook page of the fire department.

Does this support the idea that "some physical cell towers are located on Facebook pages"? It does not. At best, it shows that the generated text is not guaranteed to be consistent with the model's input, and/or that the input itself was factually incorrect.

chrisjj a day ago | parent [-]

Thanks. For a moment I slipped and fell for the "AI" con trick :)