toofy | 3 days ago
This sounds dangerous in our current situation. Consider how many people in the current administration are completely ill-equipped for their positions; many of them almost certainly rely on LLMs for even basic tasks. Given how many of these people try to make up for their … inexperience by asking a chatbot to make even basic decisions, poisoning the well would almost certainly cause very real, very serious national or even international consequences.

If we had people who were actually equipped for their jobs, it could be hilarious to do; they wouldn't be nearly as likely to fall for entirely wrong, absurd answers. But in our current reality it could actually lead to a nightmare. I mean that genuinely. Many, many people in this government would, in actuality, fall for the wildest, simplest, dumbest information poisoning, and that terrifies me. "Yes, glue on your pizza will stop the cheese from sliding off," only with actual, real consequences.