cbg0 7 hours ago

> OpenAI did this with your health data in January. Now it wants your financial data too.

This is far more valuable: they can see what political affiliation you have based on your campaign donations, predict things like cheating on your wife and an impending divorce, see what vices you have, and build shadow profiles of all the people you give and receive money from, even if those people don't use the product.

lxgr 6 hours ago | parent | next [-]

I’d be willing to bet that ChatGPT will know the average user’s political affiliation and vices about three messages in.

The difference is that banking records are harder to falsify, so there’s that.

rixed 7 hours ago | parent | prev | next [-]

If all they wanted was to know more about your profile, they could already buy this information from the bank, I presume.

arrosenberg 7 hours ago | parent | prev | next [-]

Campaign donations are already public if you donate over $200 - https://www.opensecrets.org/donor-lookup

fontain 7 hours ago | parent | prev | next [-]

It is far more valuable to know the type of boring things boring people buy in their boring daily lives.

gruez 7 hours ago | parent | prev [-]

>they can see what political affiliation you have based on your campaign donations

You can get a pretty good estimate just by looking at other demographic factors like age, education level, income, and zip code. Moreover, how many people actually donate to campaigns?

>predict things like cheating on your wife & the impending divorce, what vices you have and they can also build shadow profiles of all of the people you give and receive money from even if they don't use the product.

Google has had all this capability for at least a decade. What concrete harms have actually materialized?

kridsdale1 7 hours ago | parent [-]

OpenAI is now run by former Meta executives.

gruez 6 hours ago | parent [-]

Okay, what concrete harms has Meta caused with this information? At worst you have some creeps using it to stalk their exes, which is bad, but a far cry from the AI takeover scenario implied by OP.

rurp 2 hours ago | parent | next [-]

They target toxic ads at people with poor mental health who are especially vulnerable. They do this intentionally because it's profitable.

There's plenty of reporting on this if you care to look it up. It "works" too: spending more time on Meta products results in more body-image issues, poorer self-esteem, and suicidal ideation.

But if I remember right you work for a big ad tech company and have previously gone to the mat to defend such practices, so I suspect you aren't genuinely asking.

cbg0 6 hours ago | parent | prev [-]

I haven't implied an AI takeover. This data will be repackaged into a product for military/intelligence and political applications, for insurance companies that can charge you more because they know you're willing to pay, and many more.

These things already exist and happen; what changes is the data getting better and no one having to build tools to query it and make projections, since you can just type a query into a box even if you're not a data scientist.

gruez 5 hours ago | parent [-]

>I haven't implied an AI takeover, this data will be repackaged into a product for military/intelligence, political applications, insurance companies that can charge you more because they know you're willing to pay, and many more.

Any evidence that Google or Meta actually sells customer data like that?