| ▲ | andersonpico 7 hours ago |
| this is a massive violation of trust |
| |
| > The scan doesn’t just look for LinkedIn-related tools. It identifies whether you use an Islamic content filter (PordaAI — “Blur Haram objects, real-time AI for Islamic values”), whether you’ve installed an anti-Zionist political tagger (Anti-Zionist Tag), or a tool designed for neurodivergent users (simplify). |
|
| ▲ | Aurornis 6 hours ago | parent | next [-] |
| Many extensions designed to scrape data from social media websites are disguised as simple extensions that do something else. If I had to guess: I doubt that the automatic content blurrer, neurodivergent website simplifier, or anti-Zionist tagger actually work. They’re all just piggybacking on trending topics to get users to install them and then forget about them. Then they exfiltrate the data when you visit LinkedIn. |
| |
| ▲ | cryptoegorophy 5 hours ago | parent [-] | | This. Do not install any extension unless you absolutely need it. Assume they all leak your browsing data.
Not familiar with Google, but if you can just vibe code your own extension, then do that. | | |
| ▲ | cousin_it 3 hours ago | parent | next [-] | | Vibe supply chain attacks are coming btw. | | |
| ▲ | johanyc 2 hours ago | parent [-] | | Wdym? You vibe code your software. Are you saying the LLM will spit out malware? | | |
| ▲ | GrinningFool 2 hours ago | parent [-] | | Sooner or later, yes. What stops it, other than layers of imperfect process? And it's the perfect vector to exploit anyone who doesn't review and understand the generated code before running it locally. |
|
| |
| ▲ | dcchuck 3 hours ago | parent | prev [-] | | They're also the only avenue for breaking out of the browser sandbox. |
|
|
|
| ▲ | crazygringo an hour ago | parent | prev | next [-] |
| It's for fingerprinting and possibly ad targeting. It's no different from when you visit an Islamist or anti-Zionist website that has analytics/trackers/ads on it. It's bad, but this "massive violation of trust" is happening everywhere and has been for decades. There's nothing that's unique to Microsoft here. |
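| (For context, the fingerprinting technique being described is most likely the well-known "web-accessible resource probe": a page tries to fetch a resource that an installed extension exposes at a predictable URL, and whether the fetch succeeds reveals whether the extension is present. A minimal sketch of the idea — the extension names, IDs, and resource paths below are made-up placeholders, not whatever LinkedIn actually checks: |

```javascript
// Hypothetical probe list. Real trackers ship a few thousand known
// (extension id, resource path) pairs; these entries are placeholders.
const PROBES = [
  { name: "example-content-blurrer", id: "a".repeat(32), path: "icon.png" },
  { name: "example-simplifier",      id: "b".repeat(32), path: "logo.svg" },
];

// Chromium serves an extension's web-accessible resources at this scheme.
function probeUrl({ id, path }) {
  return `chrome-extension://${id}/${path}`;
}

// If the fetch resolves, the extension is installed. If the extension is
// absent (or the resource isn't web-accessible), the fetch rejects.
async function detectExtensions(probes, fetchFn = fetch) {
  const found = [];
  for (const probe of probes) {
    try {
      await fetchFn(probeUrl(probe));
      found.push(probe.name);
    } catch {
      // Not installed (or blocked): skip.
    }
  }
  return found;
}
```

| In a real page this runs with the browser's own `fetch`, so the site learns which extensions from its list you have installed without any extension API. Manifest V3 narrows this vector by letting extensions scope `web_accessible_resources` with `matches` and `use_dynamic_url`, but many extensions don't. |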
|
| ▲ | gwerbin 6 hours ago | parent | prev | next [-] |
| Almost certainly they are using that for audience segmentation and ad targeting. Clever and disgusting. This isn't the invention of some evil moustache-twirling executive, this was the invention of an employee or group of employees who value money more than morals. We should think of such employees as henchmen. |
| |
| ▲ | luxuryballs 5 hours ago | parent [-] | | if they do a better job at showing me an ad that might be relevant to me, how is that disgusting? if I have to see an ad at all I at least want them to give it their best shot | | |
| ▲ | alt227 4 hours ago | parent | next [-] | | I can't believe that people still have the attitude that the trillions of dollars being invested in all this technology and tracking is just to give them a more relevant ad. Do people really not remember scandals like Cambridge Analytica, and realise that these ads combined with social media feeds can be used to literally control and manipulate people's decisions and behaviour? There's a reason Facebook and YouTube just got sued for being intentionally addictive attention machines. | | |
| ▲ | caminante 3 hours ago | parent | next [-] | | You're glossing over the nuance of the Cambridge Analytica scandal, or at least I don't see how it's connected here. Facebook was a party, but not the protagonist.

- a Cambridge researcher (Aleks Kogan) created a personality quiz FB app advertised as academic research
- users had to consent to download the app
- the app nefariously scraped users' friends' data (300k users unlocked 87 million users' data)
- the information was sold to Cambridge Analytica
- who then used the information to profile American voters

LinkedIn already has all of this from the information you feed it. Scanning for more information provides more refined views, but LinkedIn already has your graph. | | |
| ▲ | alt227 3 hours ago | parent [-] | | The parent post said:

> if they do a better job at showing me an ad that might be relevant to me, how is that disgusting?

To me that signalled that the author of the comment doesn't really care what is going on behind the scenes if the result is a better and more relevant ad. I see this attitude often from people who don't seem to understand the severity and seriousness of online tracking, which leads to psychological profiling, which leads to manipulation.

> who then used the information to profile American voters

You seem to have missed off the most serious bit at the end. Cambridge Analytica then used the data to profile millions of voters, and purposefully targeted divisive and inflammatory political material at specific suggestible people in order to manipulate outcomes. This same thing is done all the time by all tracking and ad companies. I think this thread has gone beyond just LinkedIn scanning your browser extensions. | | |
| ▲ | caminante 2 hours ago | parent [-] | | I agree that it could come off as gross negligence to not care about what happens with your data. My point is that LinkedIn already has enough information (we've willingly given it to them!) to manipulate outcomes, and if they're doing something nefarious, then it's already too late. Whereas Cambridge Analytica involved bad actors (not Facebook) duping customers and re-selling their data. I don't think those elements are necessarily in play here. |
|
| |
| ▲ | luxuryballs 2 hours ago | parent | prev [-] | | is the manipulation of decisions and behavior not just a way of saying sales and marketing? I agree that it def can be used for bad things, but so can most tools/systems |
| |
| ▲ | GrinningFool 2 hours ago | parent | prev | next [-] | | The rules say we should default to assuming good faith in comments. But it's hard when I see this comment in 2026. | |
| ▲ | gwerbin 5 hours ago | parent | prev | next [-] | | Imagine if someone was following you around with a clipboard writing down everything you do, then rifling through your bookshelf to make note of certain books, and then using that to target ads at you. You'd say that's a ridiculous and illegal thing to do without your explicit consent, right? Maybe you personally don't mind and would be happy to offer that consent. But they're doing it without your consent, regardless of whether you want it or not. | |
| ▲ | buellerbueller 3 hours ago | parent | prev | next [-] | | It's not just about ads. The same data and tech is also about locking you up and identifying you for deportation if this admin thinks you are in the USA without permission. | | |
| ▲ | gwerbin an hour ago | parent [-] | | And laundering responsibility. If the government uses a contractor to identify deportation candidates using this data, and they get it wrong, the government can at least try to shrug it off and blame the contractor, whose job is in part to absorb public outrage for these sorts of things. Whereas if the FBI wiretaps you and still gets it wrong, it's a lot harder to deflect blame. |
| |
| ▲ | franktankbank 4 hours ago | parent | prev [-] | | What if someone makes an ad that's not an ad at all? Maybe it's a rabbit hole designed to fuck with you. Maybe it's designed to enrage you. |
|
|
|
| ▲ | egorfine 6 hours ago | parent | prev | next [-] |
| > this is a massive violation of trust |
| |
| This is not. To violate trust, there would have to have been some in the first place. |
| |
| ▲ | chii 6 hours ago | parent [-] | | There's an implicit trust that a site won't try to racially profile you, since doing so is illegal. There may be no enforcement, but that's exactly why this is a violation of trust. | | |
| ▲ | hedora 5 hours ago | parent [-] | | It's probably not illegal for advertisers to racially profile you, but it certainly is illegal in the US to do those things as part of your hiring process: https://www.eeoc.gov/prohibited-employment-policiespractices LinkedIn's scanning for browser extensions used by protected groups allows them to provide illegal services to US-based recruiters. I have no idea if they actually do it or not, and am not a lawyer, but common sense suggests there's enough here for a class action suit to move into discovery. |
|
|
|
| ▲ | einpoklum 6 hours ago | parent | prev | next [-] |
| If you mean by the website, then - surely not. What basis do you have to trust websites you visit? Especially a social network owned by Microsoft, to boot? If you mean the _browser_, then I agree in principle, but - it is a browser offered to you by Alphabet. And they are known for mass surveillance and for using personal information for all sorts of purposes, including passing copies to US intelligence agencies. But of course, this is what's promoted and suggested to people and installed by default on their phones, so even if it's Google/Alphabet, they should be pressured/coerced into respecting your privacy. |
|
| ▲ | bethekidyouwant 6 hours ago | parent | prev | next [-] |
| It scans for thousands of extensions, so among thousands, some of them are bound to have these weird names |
|
| ▲ | cbeach 5 hours ago | parent | prev [-] |
| [flagged] |
| |