| ▲ | haswell 6 hours ago |
| > How is probing your browser for installed extensions not "scanning your computer"? I think most people would interpret “scanning your computer” as breaking out of the confines of the browser and gathering information from the computer itself. If this was happening, the magnitude of the scandal would be hard to overstate. But this is not happening. What actually is happening is still a problem. But the hyperbole undermines what they’re trying to communicate, and this is why I objected to the title. > They chose to put that particular extension in their target list, how is it not sinister? Alongside thousands of other extensions. If they were scanning for a dozen things and this was one of them, I’d tend to agree with you. But this sounds more like they enumerated known extension IDs for a large number of extensions because getting all installed extensions isn’t possible. If we step back for a moment and ask the question: “I’ve been tasked with building a unique fingerprinting capability to combat bots/scrapers/known bad actors, etc. — how would I leverage installed extensions as part of that fingerprint?”, what the article describes sounds like what many devs would land on given the browser APIs available. To reiterate, at no point am I saying this is good or acceptable. I think there’s a massive privacy problem in the tech industry that needs to be addressed. But the authors have chosen to frame this in language that is hyperbolic and alarmist, and in doing so I think they’re making people focus on the wrong things and actually obscuring the severity of the problem, which is certainly not limited to LinkedIn. |
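The enumeration approach described above can be sketched as follows. This is a minimal illustration, not LinkedIn's actual code: the extension IDs are placeholders, and the probe mechanism is injected as a callback so the logic is self-contained (in a real page it would typically attempt to load a web-accessible resource exposed by each extension and observe whether the load succeeds).

```javascript
// Sketch of probing a hardcoded list of known extension IDs, since no
// getAllExtensions() API exists. The IDs are placeholders, and the probe
// is passed in as a callback: in a browser it might try to fetch a
// web-accessible resource belonging to each extension and report whether
// the request succeeded.
async function detectExtensions(knownIds, probe) {
  const found = [];
  for (const id of knownIds) {
    if (await probe(id)) {
      found.push(id);
    }
  }
  return found; // subset of knownIds that appear to be installed
}
```

A page running this against thousands of IDs recovers an approximate extension list one guess at a time, which matches the "enumerate known extension IDs" behavior described above.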
|
| ▲ | ryandrake 5 hours ago | parent | next [-] |
| > What the article describes sounds like what many devs would land on given the browser APIs available. > To reiterate, at no point am I saying this is good or acceptable. I think there’s a massive privacy problem in the tech industry that needs to be addressed. These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects. What the article describes should not be "what many devs would land on" naturally. What many devs should land on is "scanning the user's browser in order to try to fingerprint him without consent is wrong and we cannot do it." To put it in more extreme terms: If a developer's boss said "We need to build software for a drone that will autonomously fly around and kill infants," the developer's natural reaction should not be: "OK, interesting problem. First we'll need a source of map data, and a vision algorithm that identifies infants...." Yet, our industry is full of this "OK, interesting technology!" attitude. Unfortunately, for every developer who is willing to draw the line on ethical grounds, there's another developer waiting in the recruiting pipeline more than willing to throw away "doing the right thing" if it lands him a six-figure salary. |
| |
| ▲ | turtletontine 2 hours ago | parent | next [-] | | > These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects. One reason your boss is eager to replace everyone with language models: they won’t have any “ethical backbone” :’) | | |
| ▲ | DrewADesign an hour ago | parent [-] | | Many developers overestimate how much agency they’d have without extremely high labor demand. We got a say because replacing us was painful, not because of our ethics and wisdom. Without that leverage, developers are cogs just like every other part of the machine. |
| |
| ▲ | haswell 5 hours ago | parent | prev | next [-] | | I completely agree. Fighting against these kinds of directives was a large factor in my own major burnout and, ultimately, in my quitting big tech. I was successful for a while, but it takes a serious toll if you’re an IC constantly fighting against directors and VPs concerned only with solving some perceived business problem regardless of the technical barriers. Part of the problem is that these projects often address a legitimate issue that has no “good” solution, and that makes pushing back/saying no very difficult if you don’t have enough standing within the company or aren’t willing to put your career on the line. I’d be willing to bet good money that this LinkedIn thing was framed as an anti-bot/anti-abuse initiative. And those are real issues. But too many people fail to consider the broader implications of the requested technical implementation. | |
| ▲ | jt2190 4 hours ago | parent | prev | next [-] | | > These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects. What the article describes should not be "what many devs would land on" naturally. What many devs should land on is "scanning the user's browser in order to try to fingerprint him without consent is wrong and we cannot do it." I think using LinkedIn is pretty much agreeing to participate in “fingerprinting” (essentially identifying yourself) to that system. There might be a blurry line somewhere around “I was just visiting a page hosted on LinkedIn.com and was not myself browsing anyone else’s personal information”, but otherwise LinkedIn exists as a social network/credit bureau-type system. I’m not sure how we navigate this need to have our privacy while simultaneously needing to establish our priors to others, which requires sharing information about ourselves. The ethics here is not black and white. | |
| ▲ | jcgrillo 4 hours ago | parent | prev | next [-] | | You can't actually push back as an IC. Tech companies aren't structured that way. There's no employment protection of any kind, at least in the US. So the most you can do is protest and resign, or protest and be fired. Either way, it'll cost you your job. I've paid that price and it's steep. There's no viable "grassroots" solution to the problem, it needs to come from regulation. Managers need to serve time in prison, and companies need to be served meaningfully damaging fines. That's the only way anything will get done. | | |
| ▲ | philipallstar 37 minutes ago | parent | next [-] | | > There's no viable "grassroots" solution to the problem Does something like running the duckduckgo extension not help? | |
| ▲ | worik 3 minutes ago | parent | prev [-] | | > There's no viable "grassroots" solution to the problem, it needs to come from regulation. Managers need to serve time in prison, No, yes Yes, giving these people short (or long, mēh) prison sentences is the only thing that will stop this. No, the obvious grassroots response is to not use LinkedIn or Chrome. (You mean developers, not consumers, I think. The developers in the trenches should obey if they need their jobs; they are not to blame. It is the evil swine getting the big money and writing the big cheques...) |
| |
| ▲ | mrguyorama 3 hours ago | parent | prev | next [-] | | I integrate these kinds of systems in order to prevent criminals from being able to use our ecommerce platform to utilize stolen credit cards. That involves integrating with tracking providers to best recognize whether a purchase is being made by a bot or not, whether it matches "Normal" signals for that kind of order, and importantly, whether the credit card is being used by the normal tracking identity that uses it. Even the GDPR gives us enormous leeway to do literally this, but it requires participating in tracking networks that have what amounts to a total knowledge of purchases and browsing you do on the internet. That's the only way they work at all. And they work very well. Is it Ethical? It is a huge portion of the reason why ecommerce is possible, and significantly reduces credit card fraud, and in our specific case, drastically limits the ability of a criminal to profit off of stolen credit cards. Are people better off from my work? If you do not visit our platforms, you are not tracked by us specifically, but the providers we work with are tracking you all over the web, and definitely not just on ecommerce. Should this be allowed? | | |
| ▲ | benregenspan 2 hours ago | parent | next [-] | | What I'm wondering is if this requires sending the full list of extensions straight to a server (as opposed to a more privacy-protecting approach like generating some type of hash clientside)? Based on their privacy policy, it looks like Sift (major anti-fraud vendor) collects only "number of plugins" and "plugins hash". No one can accuse them of collecting the plugins for some dual-use purpose beyond fingerprinting, but LinkedIn has opened themselves up to this based on the specific implementation details described. | | |
| ▲ | mrguyorama 44 minutes ago | parent [-] | | The SOP of this entire industry is "Include this javascript link in your tag manager of choice", and it will run whatever javascript it can to collect whatever they want to collect. You then integrate on the back end to investigate the signals they sell you. America has no GDPR or similar law, so your "privacy" never enters the picture. They do not even think about it. This includes things like the motion of your mouse pointer, typing events including dwell times, and fingerprints. If our providers are scanning the list of extensions you have installed, they aren't sharing that with us. That seems overkill IMO for what they are selling, but their business is spyware, so... On the backend, we generally get the results and some signals. We do not get the massive pack of data they have collected on you. That is the tracking company's prime asset. They sell you conclusions using that data, though most sell you vague signals and you get to make your own conclusions. Frankly, most of these providers work extremely well. Sometimes, one of our tracking vendors gets default-blackholed by Firefox's anti-tracking policy. I don't know how they manage to "fix" that, but sometimes they do. Again, to make that clear: I don't care what you think Firefox's incentives are; they objectively are doing things that reduce how tracked you are and making it harder for these companies to operate and sell their services. Use Firefox. In terms of "Is there a way to do this while preserving privacy?", it requires very strict regulation about who is allowed to collect what. Lots of data would be collected and forwarded to the payment network, who would have sole legal right to collect and use such data, would be strictly regulated in how they can use it, and the way payment networks handle fraud might change. That's the only way to maintain strong credit card fraud prevention in ecommerce, privacy, the status quo for customers, and generally easy-to-use ecommerce. It would have the added benefit of essentially banning Google's tracking. It would ban "fraud prevention as a service" though, except as sold by payment networks. Is this good? I don't know. |
| |
| ▲ | michaelt 2 hours ago | parent | prev [-] | | > Even the GDPR gives us enormous leeway to do literally this, but it requires participating in tracking networks that have what amounts to a total knowledge of purchases and browsing you do on the internet. That's the only way they work at all. That data sounds like it would be very valuable. But I think if I sell widgets and a prospective customer browses my site, telling my competitors (via a data broker) that customer is in the market for widgets is not a smart move. How do such tracking networks get the cooperation of retailers, when it’s against the retailers’ interests to have their customers tracked? | |
| ▲ | kevin_thibedeau 2 hours ago | parent [-] | | They get demographic data on their customers and can use that for marketing and setting prices. |
|
| |
| ▲ | orochimaaru an hour ago | parent | prev [-] | | One works for money. And money is important. Ethics isn’t going to pay the mortgage, send kids to university, and all that other stuff. I’m not going to do things that are obviously illegal. But if I get a requirement that needs to be met, then the company’s legal team is responsible for the outcome. In short, you are not going to solve this problem by blaming developer ethics. You need regulation. To get the right regulation we need to get rid of PACs and lobbying. | |
| ▲ | IG_Semmelweiss 21 minutes ago | parent [-] | | You are transferring moral agency from yourself to the government. Will you do the same for your kids? Would you let the government decide for you what’s right and what’s wrong? | |
| ▲ | ryandrake 13 minutes ago | parent [-] | | Regulation does not necessarily need to be about deciding what's right and what's wrong. It's about making life better for people. That's supposed to be why we have government. If they are not improving people's lives, why do we even have them? Too many people see the government doing nothing to improve their lives and think there's totally nothing wrong with that. |
|
|
|
|
| ▲ | emacdona 6 hours ago | parent | prev | next [-] |
| > I think most people would interpret “scanning your computer” as breaking out of the confines of the browser and gathering information from the computer itself. That is exactly how I interpreted it, and that is why I clicked the link. When I skimmed the article and realized that wasn't the case, I immediately thought "Ugh, clickbait" and came to the HN comments section. > To reiterate, at no point am I saying this is good or acceptable. I think there’s a massive privacy problem in the tech industry that needs to be addressed. 100% agree. So, in summary: what they are doing is awful. Yes, they are collecting a ton of data about you. But when you post with a headline that makes me think they are scouring my hard drive for data about me... and I realize that's not the case... your credibility suffers. Also, I think the article would be better served by pointing out that LinkedIn is BY FAR not the only company doing this... |
| |
| ▲ | smohare an hour ago | parent | next [-] | | [dead] | |
| ▲ | lejalv 5 hours ago | parent | prev [-] | | But LinkedIn is the one social network many people literally cannot escape if they want to put food on the table. I don't care how much spying is going on at ESPN. I can ditch it at the shadow of a suspicion. Not so with LinkedIn. This is very alarming, and pretending it's not because everyone else does it sounds disingenuous to me. | | |
| ▲ | umanwizard 40 minutes ago | parent | next [-] | | You can also just browse LinkedIn with a browser that doesn’t have extensions installed, if privacy is that important to you. Like everyone else on this thread, I’m not condoning it or saying it’s a good thing, but this post is an exaggeration. | |
| ▲ | franktankbank 4 hours ago | parent | prev [-] | | That sounds problematic and is only supported by people mindlessly agreeing to it. I know someone who got jobs at google and apple with no linkedin, and he wasn't particularly young. What do you do in the face of it? I say quit entirely. It was an easy decision because I got nothing out of it during the entire time I was on it. |
|
|
|
| ▲ | nightpool 5 hours ago | parent | prev | next [-] |
| > I think most people would interpret “scanning your computer” as breaking out of the confines of the browser and gathering information from the computer itself. Yes, but I also think that most people would interpret "Getting a full list of all the Chrome extensions you have installed" as a meaningful escape/violation of the browser's privacy sandbox. The fact that there's no getAllExtensions API is deliberate. The fact that you can work around this by scanning for extension IDs is not something most people know about, and the Chrome developers patched it when it became common. So I don't think it's correct to describe this as something everybody would expect, or as totally fine and normal for browsers to allow. |
| |
| ▲ | crazygringo an hour ago | parent | next [-] | | > Yes, but I also think that most people would interpret "Getting a full list of all the Chrome extensions you have installed" as a meaningful escape/violation of the browser's privacy sandbox. I don't think so, because most people understand that extensions necessarily work inside of the sandbox. Accessing your filesystem is a meaningful escape. Accessing extensions means they have identification mechanisms unfortunately exposed inside the sandbox. No escape needed. It's extremely unfortunate that the sandbox exposes this in some way. Microsoft should be sued, but browsers should also figure out how to mitigate revealing installed extensions. | |
| ▲ | haswell 5 hours ago | parent | prev [-] | | > I also think that most people would interpret "Getting a full list of all the Chrome extensions you have installed" as a meaningful escape/violation of the browser's privacy sandbox I think that’s a far more reasonable framing of the issue. > I don't think describing it as something everybody would expect is totally fine and normal for browsers to allow is correct. I agree that most people would not expect their extensions to be visible. I agree that browsers shouldn’t allow this. I, and most privacy/security focused people I know have been sounding the alarm about Chrome itself as unsafe if you care about privacy for awhile now. This is still a drastically different thing than what the title implies. |
|
|
| ▲ | ksymph 5 hours ago | parent | prev | next [-] |
| > Alongside thousands of other extensions. If they were scanning for a dozen things and this was one of them, I’d tend to agree with you. But this sounds more like they enumerated known extension IDs for a large number of extensions because getting all installed extensions isn’t possible. To take a step back further: what you're saying here is that gathering more data makes it less sinister. The gathering not being targeted is not an excuse for gathering the data in the first place. It's likely that the 'naive developer tasked with fingerprinting' scenario is close to the reality of how this happened. But that doesn't change the fact that sensitive data -- associated with real identities -- is now in the hands of MS and a slew of other companies, likely illegally. > But the authors have chosen to frame this in language that is hyperbolic and alarmist, and in doing so I think they’re making people focus on the wrong things and actually obscuring the severity of the problem, which is certainly not limited to LinkedIn. The article is not hyperbolizing by exploring the ramifications of this. It's true that this sort of tracking is going on everywhere, but neither is it alarmist to draw attention to a particularly egregious case. What wrong things does it focus on? |
| |
| ▲ | haswell 3 hours ago | parent [-] | | > The gathering not being targeted is not an excuse for gathering the data in the first place. I’m not saying it is. My point is that they appear to be trying to accomplish something like getInstalledExtensions(), which is meaningfully different from a small and targeted list like isInstalled([“Indeed.com”, “DailyBibleVerse”, “ADHD Helper”]). One could reasonably be interpreted as targeting specific kinds of users. What they’re actually doing, to your point, looks more like a naive implementation of a fingerprinting strategy that uses installed extensions as one set of indicators. Both are problematic. I’m not arguing in favor of invasive fingerprinting. But what one might infer about the intent of one vs. the other is quite different, and I think that matters. Here are two paragraphs that illustrate my point: > “Microsoft reduces malicious traffic to their websites by employing an anti-bot/anti-abuse system that builds a browser fingerprint consisting of <n> categories of identifiers, including browser/OS version, installed fonts, screen resolution, installed extensions, etc., and using that fingerprint to ban known offenders. While this approach is effective, it raises major privacy concerns due to the amount of information collected during the fingerprinting process and the risk that this data could be misused to profile users.” vs. > “Microsoft secretly scans every user’s computer software to determine if they’re a Christian or Muslim, have learning disabilities, are looking for jobs, are working for a competitor, etc.” The second paragraph is what the article is effectively communicating, when in reality the first paragraph is almost certainly closer to the truth. The implications inherent to the first paragraph are still critical, and a discussion should be had about them. Collecting that much data is still a major privacy issue and makes it possible for bad things to happen. But I would maintain that it is hyperbole and alarmism to present the information in the form of the second paragraph. And by calling this alarmism I’m not saying there isn’t a valid alarm to raise. But it’s important not to pull the fire alarm when there’s a tornado inbound. | | |
| ▲ | eipi10_hn 2 hours ago | parent [-] | | Calling out the fingerprinting users' extensions is not hyperbolic. Defending that action is. | | |
| ▲ | haswell an hour ago | parent [-] | | Calling out the fingerprinting of extensions is appropriate and can be achieved without hyperbole. As I’ve stated clearly throughout this thread, the fingerprinting they’re doing is a problem. Calling it “searching your computer” is also a problem. > Defending that action is Nowhere have I defended what LinkedIn is doing. |
|
|
|
|
| ▲ | Kuraj 2 hours ago | parent | prev | next [-] |
| > I think most people would interpret “scanning your computer” as breaking out of the confines of the browser and gathering information from the computer itself. If this was happening, the magnitude of the scandal would be hard to overstate. But at the end of the day, the browser is likely where your most sensitive data is. |
|
| ▲ | globular-toast an hour ago | parent | prev | next [-] |
| > I think most people would interpret “scanning your computer” as breaking out of the confines of the browser and gathering information from the computer itself. Which they would, if they could. They are scanning users' computers to the maximum extent possible. |
|
| ▲ | lejalv 5 hours ago | parent | prev | next [-] |
| > making people focus on the wrong things and actually obscuring the severity of the problem, which is certainly not limited to LinkedIn. No, LinkedIn has much more sensitive data already. Combined with the voracious fingerprinting, this stands out as a particularly dystopian instance of surveillance capitalism. |
|
| ▲ | franktankbank 4 hours ago | parent | prev [-] |
| > Alongside thousands of other extensions. If they were scanning for a dozen things and this was one of them, I’d tend to agree with you. But this sounds more like they enumerated known extension IDs for a large number of extensions because getting all installed extensions isn’t possible. If that's all it takes to fool you, then it's a pretty trivial way to hide your true intentions. |