Aurornis 18 hours ago

This fails the classic conspiracy theory test: any company practicing this would have to be large enough to afford to orchestrate a chain of illegal transactions to get the data, develop a process for using it in hiring, and routinely act on it.

The continued secrecy of the conspiracy would then depend on every person involved in orchestrating this privacy violation and illegal hiring scheme keeping it secret forever: nobody ever leaking it to the press, no disgruntled employees e-mailing their congresspeople, no concerned citizens slipping a screenshot to journalists, both during and after their employment with the company.

To even make this profitable, the data would have to be secretly sold to a lot of companies for this use, and also continuously updated to stay relevant. Giant databases of your secret ChatGPT queries would be sold continuously in volume, with all employees at the sellers, the buyers, and the users of this information keeping it perfectly quiet, never leaking anything.

drawnwren 16 hours ago

It doesn't, though. As an aside, I have been using a competitor to ChatGPT's health features (Nori) for a while now, and I have been getting an extreme number of targeted ads about HRV and other metrics that the app consumes. I have been collecting health metrics through wearables for years, so there has been no change in my own search patterns or beliefs about my health; I just thought AI + health data was cool.