| ▲ | ch4s3 3 hours ago |
IMO we should ban gathering this data without a warrant or a specific contractual agreement between the device owner and the entity aggregating the data. As much as Congress loves to claim the interstate commerce theory of everything, this seems like a slam dunk.
| ▲ | Dwedit 3 hours ago | parent | next [-] |
Contractual agreement? Nobody reads things like EULAs or terms of service. It's probably in there already.
| ▲ | ch4s3 3 hours ago | parent | next [-] |
I should have been a bit more clear. We should ban retention for any purposes where it is not explicitly required for the intended function and clearly agreed to by all parties. Think something like Strava or asset tracking. You know it stores GPS data, and why.
| ▲ | ryandrake 3 hours ago | parent [-] |
There is no such thing as "clearly agreed to by all parties" when it comes to end users. Companies provide a one-sided, "take it or leave it" EULA, and if you don't agree to everything in it, you don't use the product. There is no meeting of the minds, there is no negotiation, and there is no actual agreement. It's a rule book dictated by one side.
| ▲ | pocksuppet 3 hours ago | parent | next [-] |
Then it's not a valid contract and therefore does not absolve them of criminal liability for stalking you.
| ▲ | kube-system 2 hours ago | parent | next [-] |
Contracts of adhesion can be valid contracts. The ability to negotiate, or equal bargaining power, is not a required element of a contract. Furthermore, you cannot contract away criminal liability if any exists.
| ▲ | lukeschlather 2 hours ago | parent [-] |
Even attempting to use a contract of adhesion to justify selling GPS location data to a third party should be a criminal act.
| ▲ | celeritascelery an hour ago | parent | prev [-] |
You click on “accept terms and conditions”, which means you agree to the contract.
| ▲ | ch4s3 3 hours ago | parent | prev | next [-] |
You can't just bury literally anything in an EULA. There's a fair amount of case law establishing that EULA clauses that are surprising or illegal aren't enforceable.
| ▲ | pwg 3 hours ago | parent [-] |
That fact does not change the point of the individual to which you replied. Regardless of whether the clauses in the EULA are 100% legal, some mixture, or 100% illegal, the entire EULA is a "one-sided rule book dictated completely by one side". You, the person held to the EULA's rules, do not get to negotiate on the individual points. You simply have a "take it or go away" set of options.
| ▲ | nickburns 2 hours ago | parent | next [-] |
https://en.wikipedia.org/wiki/Shrinkwrap_(contract_law)
| ▲ | kube-system an hour ago | parent | prev | next [-] |
You're talking about contracts of adhesion, and they are overwhelmingly common for B2C agreements. Most red-lining of contracts only happens in high-value B2B transactions where the sums of money involved are enough that it makes sense to bring lawyers into the loop.
| ▲ | rolph 2 hours ago | parent | prev [-] |
When you already pay for the device and a contract, then surprise: now that you have skin and flesh in the game, you HAVE TO agree to this EULA or your property is a brick and we keep your money. That is defined as extortion, but labeled as onboarding.
| ▲ | kube-system an hour ago | parent [-] |
Courts do look poorly upon this -- to have a valid contract of adhesion there is some degree of advance notice required and the ability to reject it.
| ▲ | stavros 2 hours ago | parent | prev [-] |
There is the GDPR.
| ▲ | teeray an hour ago | parent | prev | next [-] |
Instead of “I accept”, you’re given a quiz.
| ▲ | toofy 3 hours ago | parent | prev | next [-] |
If it were up to me, I’d require a hand-signed contract that explicitly, up front, and in plain English gives permission and is not transferable to any “partners”.
| ▲ | rubyfan 3 hours ago | parent | prev [-] |
Right, privacy terms are written to be vague and permissive. Even if you read them, you usually can’t understand how the data will be used or opt out.
| ▲ | wakawaka28 2 minutes ago | parent | prev | next [-] |
Every EULA already covers this, basically. The real problems are: people agree to it, and the government can do an end run around the Constitution by simply purchasing data or hiring contractors.
| ▲ | rubyfan 3 hours ago | parent | prev | next [-] |
I think we should make this type of tracking opt-out by default. We should also ban its sale to third parties and its use for purposes other than the specific functionality that required it to be enabled in the first place.
| ▲ | troupo 3 hours ago | parent | prev [-] |
> IMO we should ban gathering this data without a warrant

GDPR tried. And the narrative around GDPR was deliberately and completely derailed by adtech. Lack of enforcement didn't help either.
| ▲ | ch4s3 3 hours ago | parent [-] |
GDPR, like all EU regulation, is needlessly complicated and aimed at a compliance model that seems designed for SAP.
| ▲ | microtonal 2 hours ago | parent | next [-] |
The compliance model is very simple. Do not collect data. Problem solved. If you need to collect data (e.g. because you are a webshop), only collect the minimum necessary. The problem is not the GDPR; the problem is the surveillance industry that wants to grab as much data as possible and try to do as much malicious compliance as possible.
| ▲ | jandrewrogers 2 hours ago | parent [-] |
Designing around GDPR compliance shows up all over the place in industrial data collection. It doesn't only affect surveillance webslop. The costs are often worse on the industrial side because the data is so much larger and faster than web or mobile data.
| ▲ | gwerbin an hour ago | parent [-] |
What do you mean by "industrial" in this case?
| ▲ | jandrewrogers 37 minutes ago | parent [-] |
Telemetry from machines and data from environmental sensors that is collected for operational purposes (safety, efficiency, reliability) in industrial applications. Old-school engineering systems that in modern times have expansive network-connected sensors that may even have onboard classifiers to reduce the quantity of data.

The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems, even though that isn't how we use them or what they were designed for.
| ▲ | pocksuppet 3 hours ago | parent | prev | next [-] |
Have you read it? It's not that bad, unless you're thinking like an adtech programmer trying to find the exact edge case for the maximal amount of tracking you're allowed to do -- because such a bright line does not exist, and that fact infuriates adtech professionals. It is vague because reality is vague and complex; each specific case of alleged violation has to be interpreted by multiple humans; there is no algorithm.
| ▲ | ch4s3 3 hours ago | parent | next [-] |
The law mandates a data protection officer with specific duties. It also establishes a board to "issue guidelines, recommendations, and best practices", which is where administrative complication and nonsense always creep in.
| ▲ | jandrewrogers 2 hours ago | parent | prev [-] |
It is regulation that imagines companies are a government bureaucracy. I have read GDPR and don't work in adtech. It is vague, and it is pretty easy to find pathological scenarios that don't make much sense or impose an unusually high burden for no benefit. Every European law firm seems to agree with this assessment despite what proponents assert. Consequently, it forces a lot of expensive defensive activity in practice.

To some extent, it was just a failure of imagination on the part of GDPR's authors. Many things are not nearly as simple as it seems to assume, and it bleeds into data models that have nothing to do with people. It is what it is, but no one should pretend it is not a burden for companies that have nothing to do with adtech, or even data about people.
| ▲ | troupo 3 hours ago | parent | prev [-] |
You can literally read the entire "complicated" regulation in one sitting in an afternoon. There's literally nothing complex or complicated about it. Congrats on gullibly believing the adtech narrative.
| ▲ | ryandrake 3 hours ago | parent | next [-] |
The "GDPR is complicated" meme has been circulating among software developers since probably before it was even written. It's so wild that HN dunks on it so much: here we have a societal problem in computing we've been complaining about for decades, someone offers an incremental but imperfect regulation to start taking steps to correct it, and everyone hates it!
| ▲ | pocksuppet 3 hours ago | parent [-] |
Same with the California age input box.
| ▲ | lukeschlather an hour ago | parent [-] |
The problem with the age input box is that we don't have the GDPR. We're mandating that people give accurate age information to advertisers, and it's legal for advertisers to sell detailed dossiers on people including their age and target advertising using the age. This is why Meta wrote the age input box legislation: they want to make everyone legally required to provide Meta with their age.
| ▲ | ch4s3 3 hours ago | parent | prev [-] |
Being able to read something in one sitting doesn't make it simple or obvious. The law establishes a board that gets to set new requirements.
| ▲ | stavros 2 hours ago | parent [-] |
As someone who has to implement it, it's really not bad at all: ask the user for consent to use their data, and don't be misleading about it. That's it. The rest of the "It'S So LaRgE AnD UndErSpEciFieD" is just FUD. The regulators don't just slap fines; they work with you to get you to comply, and they just want to see that you're putting in the effort instead of messing them about.

I have literally never been surprised by the GDPR. Whenever I thought "surely this is allowed", it was; whenever I thought "this can't be allowed", it wasn't. For everything in the middle, nobody will punish you for an honest mistake.
| ▲ | ch4s3 16 minutes ago | parent | next [-] |
> For everything in the middle, nobody will punish you for an honest mistake.

How do you know that? Again, the law establishes a rule-making body that can at any time change or add rules, and as far as I can tell there's no public review process.
| ▲ | redwall_hp 2 hours ago | parent | prev [-] |
Anti-GDPR people: "it's so complicated not being able to walk into someone's house and take their things! Which things can I not take? How about this? And now I need a lawyer if I take someone's things? Ridiculous!"

Just don't spy on people.
| ▲ | stavros 2 hours ago | parent [-] |
Yeah, that's pretty much what it feels like, or sometimes it's "what if someone's stuff is lying on the street? Can I take it then?" and the regulator is kind of like "look around and ask if it belongs to anyone, and if not, sure".
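[Editor's note] The consent-first model stavros describes above (ask for consent for a specific purpose, collect nothing otherwise) can be sketched roughly as follows. This is a minimal illustration, not any real library's API; the names (`ConsentLedger`, `record_event`, the purpose strings) are all hypothetical:

```python
# Minimal sketch of consent-gated data collection: nothing is stored
# unless the user has explicitly opted into that specific purpose.
# All names here are hypothetical, not a real consent-management API.
from dataclasses import dataclass, field


@dataclass
class ConsentLedger:
    """Tracks which purposes a user has explicitly opted into."""
    granted: set = field(default_factory=set)

    def grant(self, purpose: str) -> None:
        self.granted.add(purpose)

    def revoke(self, purpose: str) -> None:
        self.granted.discard(purpose)

    def allows(self, purpose: str) -> bool:
        return purpose in self.granted


def record_event(ledger: ConsentLedger, purpose: str,
                 event: dict, store: list) -> bool:
    """Store an event only if consent exists for this exact purpose."""
    if not ledger.allows(purpose):
        return False  # default is "do not collect"
    store.append(event)
    return True
```

The point of the sketch is that the default path drops the data: consent for one purpose (say, "analytics") grants nothing for another (say, "ads"), which mirrors the purpose-limitation idea being debated in this thread.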