The way people treat LLMs these days, they place far more trust in their output than in random Internet sites.