Been seeing this argument more and more. Is there a name for it?
“Bad thing X has been happening forever.
AI _guarantees_ X will get exponentially worse, but it lets me do (arguably) good thing Y, so it’s okay.”