| ▲ | bmitch3020 2 days ago |
| I don't want to stop Flock the company. I want to stop Flock the business model, along with all the other mass surveillance, and the data brokers. If the business models can't be made illegal, it should at least come with liabilities so high that no sane business would want to hold data that is essentially toxic waste. Without that, we are quickly spiraling into the dystopia where privacy is gone, and when the wrong person gets access to the data, entire populations are threatened. |
|
| ▲ | stevemk14ebr 2 days ago | parent | next [-] |
| You want to stop the source, which is that the government and other agencies can purchase surveillance data that would otherwise be disallowed by the 4th amendment. We need to end this 'laundering' of information through third parties, and enforce the constitution by its intent. |
| |
| ▲ | RHSeeger 2 days ago | parent | next [-] | | Not just the government. It shouldn't be possible for any random stalker to find someone's daily movements. | | |
| ▲ | nullc 2 days ago | parent | next [-] | | They're also one and the same, generally -- at least if the stalker has money or the right friends, most kinds of law enforcement access mean stalker access. It's not unheard of for an officer themselves to be the stalker, and there are so many people who work in law enforcement that bribing, impersonating, or persuading your way to access is not that big a deal. Not to mention that enabled stalkers can just file a federal lawsuit and issue a subpoena for records. The only safe thing is for the records to never exist in the first place. | | | |
| ▲ | jojobas 2 days ago | parent | prev | next [-] | | How is that achievable? PIs can legally do it. Random people can keep tabs on you and exchange gossip. It's the sudden scale and low cost that doesn't sit well with the freedom not to be tracked in public 24/7 that we took for granted. | | |
| ▲ | ethbr1 2 days ago | parent | next [-] | | > How is that achievable? The core ill is aggregated data, because that's what allows the mass in surveillance, data mining, etc. The collection actions are almost immaterial. Without persistence they must be re-performed for each request, which naturally provides a throughput bottleneck and makes "for everyone" untenable. If we agree the aggregated data at rest is the problem, then addressing it would look like this: 1. Classify all data holders at scale into a regulated group 2. Apply initial regulations - To respond to queries for copies of personal data held
- To update data or be liable in court for failing to do so
- To validate counterparties apply basic security due diligence before transferring data (or the transferer also faces liability)
- To maintain a *full* chain of custody of data (from originator through every intermediate party to holder) so that leaks / misuse can be traced
- To file a yearly report with the federal government, made public, covering the types and amount of data held and the counterparties it was transferred to
The initial impediment to regulatory action is Google, Meta, Equifax, etc. saying "This problem is too complex and you don't understand it." It's not. But the first step is classifying and documenting the problem. | |
| ▲ | RHSeeger 2 days ago | parent | prev | next [-] | | Sorry, I was ambiguous in what I meant. It is not realistic to say that no person is allowed to keep track of another person; watch where they go, when, with who, etc. It should not be acceptable for a company to gather information on "everyone"; where they have been going, when, with who, how often, etc. And it should not be acceptable for them to sell that information (to government agencies OR private citizens). It's a matter of scale. - Making the first one illegal/impossible would be difficult/costly; and not doing so has a limited impact (to society, not to the single person affected). - Making the second one illegal is much easier, and it's much easier to shut down a large company doing it than it is 1,000 individual stalkers. The impact of making it illegal is much wider and better for society as a whole. We don't want anyone being stalked. But in a cost/benefit analysis, we can do something about one of them but not the other. | |
| ▲ | rdevilla 2 days ago | parent | prev | next [-] | | It's not achievable. The only way is through: everybody should get into the practice of stalking and gossiping about each other in a Molochian environment, where the people who do not do so suffer from the losing side of an information asymmetry. Expect AI, especially post-Mythos, to just enable this at even further scale. Consumer-grade wireless networking gear as a whole is a very wide attack surface and is basically never updated. | |
| ▲ | buzer 2 days ago | parent | prev [-] | | If PIs can "legally" do it then it sounds like there is a law which allows them to do it. That law can be revoked (unless the power comes from the Constitution, which would make it effectively impossible to revoke). Note that PIs are effectively illegal under the GDPR by default. They would generally need to provide an Article 13 notice, i.e. you would become aware of them unless they were just asking around without actually following you. Member states can make them legal though (via Article 23) and likely in many cases they have done so. | |
| ▲ | jojobas 2 days ago | parent [-] | | In the US, PI licensing is only about PIing for hire. The actual acts of going through public records, following cars and whatnot do not require a license; you can spy on anyone without a license as long as you don't get paid for it. The EU is more complicated, but Article 14.5.b allows withholding notice if it would impair/defeat the purpose of processing. The PI must however apply "safeguards", whatever that could mean. | | |
| ▲ | fc417fc802 2 days ago | parent | next [-] | | > following cars and whatnot do not require a license, you can spy on anyone without a license as long as you don't get paid for it. Pretty sure that would be considered stalking and is broadly illegal in the US, PIs being an exception. | |
| ▲ | buzer 2 days ago | parent | prev [-] | | Article 14(5)(b) does, but that only applies to the Article 14 notice (personal data not directly obtained from the data subject). Article 13 (personal data obtained directly from the data subject) does not have such an exception in the GDPR itself. This becomes extremely relevant when you read it in the light of the C-422/24 decision. In that case, personal data collected via body-worn cameras were determined to be "directly obtained". Paragraph 41 from the judgement: > If it were accepted that Article 14 of the GDPR applies where personal data are collected by means of a body camera, the data subject would not receive any information at the time of collection, even though he or she is the source of those data, which would allow the controller not to provide information to that data subject immediately. Therefore, such an interpretation would carry the risk of the collection of personal data escaping the knowledge of the data subject and giving rise to hidden surveillance practices. Such a consequence would be incompatible with the objective, referred to in the preceding paragraph, of ensuring a high level of protection of the fundamental rights and freedoms of natural persons. Given this, it's very unlikely that a PI observing (especially if they record) could be considered an Article 14 rather than an Article 13 type of collection, as it's exactly the "hidden surveillance practice" that the Court warned about. Member states do have a right to restrict the Article 13 disclosure obligations via an Article 23 restriction, but that requires a specific law in the member state & the law itself must fulfill the obligations that Article 23 requires. Article 23(2) essentially forbids leaving everything up to the controller. And as far as PI in the US goes, actions between stalking and PI "for self" tend to be so similar that I wouldn't necessarily recommend that anyone try it.
|
|
| |
| ▲ | NoSalt 2 days ago | parent | prev [-] | | Government ... random stalker ... same thing. |
| |
| ▲ | therobots927 2 days ago | parent | prev | next [-] | | Means of Control by Byron Tau and Surveillance Valley by Yasha Levine. Can’t recommend these books enough for anyone who is skeptical of the above claim. | |
| ▲ | caconym_ 2 days ago | parent | prev | next [-] | | Honestly it should probably just be illegal for anyone, private or public, to engage in mass surveillance (or "data gathering", whatever) of anybody who didn't expressly consent to it. As long as the data exist, they will be abused. | | |
| ▲ | nandomrumber 2 days ago | parent | next [-] | | But I did expressly consent. When I installed the SoundCloud app and it told me by continuing I agree to them sharing my data with their 954 partners.[1] 1. I’m not even joking. When I most recently installed the SoundCloud app - for the first time on a new device - that’s what it said: 954 partners. How can anyone reasonably understand what it is they’re agreeing to in that scenario? | | |
| ▲ | Perseids 2 days ago | parent [-] | | This is the important point. You need the right to not be discriminated against when you withhold your consent, otherwise your consent is effectively meaningless, as it is forced on you by your impossible bargaining position. This is one of the central pillars of the GDPR, without which it wouldn't work at all. It would also be wise to make it illegal to ask customers for consent that doesn't directly benefit them, lest we risk creating another wave of malicious cookie banners. | | |
| ▲ | Joker_vD 2 days ago | parent [-] | | > You need the right to not be discriminated when you withhold your consent, otherwise your consent is effectively meaningless, as it is forced on you by your impossible bargaining position. Which is why "we don't serve patrons without shoes and pants" policy is unconstitutional, yeah. If you don't want to agree to a business's demands — you're welcome to not deal with them and look for an alternative. All the alternatives have the same (or even worse) demands? Unless you can prove collusion, that's just how the invisible hand of the market worked its magic out. Go petition you congressman to violate laissez-faire even more than it already is, I guess. | | |
| ▲ | LadyCailin 2 days ago | parent | next [-] | | The trouble with this is that I, at least, am trying to live in a society. And society has both rights and responsibilities. Sometimes you are forced to do things, or don’t do things, contrary to your desires. Every freedom has two sides, you can’t ignore the fact that increasing some freedoms for one decreases other freedoms for others. The shirt and shoes example is a great example in fact that illustrates the point. You don’t have unlimited freedom to not wear shoes, just like a business does not have unlimited freedom to impose whatever terms it likes, just because it put it in its ToS. | | |
| ▲ | Joker_vD 2 days ago | parent [-] | | > You don’t have unlimited freedom to not wear shoes Okay, I am gonna be 100% serious here: you absolutely should have such a freedom. Just as loitering or jaywalking being a crime is inherently totalitarian, what the hell. | | |
| ▲ | alistairSH 2 days ago | parent [-] | | In this case, unlimited means literally everywhere. You do have the right to go barefoot in your own home. And in true public spaces. But, a property owner can require shoes. Do I care if somebody is barefoot in the local grocer? No, not really. But, the proprietor might because they want to limit their liability (should something fall on your foot, a cart run it over, or a loose tack/nail somehow land in an aisle, etc). |
|
| |
| ▲ | alistairSH 2 days ago | parent | prev | next [-] | | Except there are companies with which you effectively must do business. Microsoft (or Apple). Any web host, payment processor, etc. that's contracted to do work for your local government (I suppose you could try driving to the government office and paying by check, but then you need to give consent to Ford or Chevy). Short of living like a hermit, there's no practical way to avoid all ridiculous T&C. | |
| ▲ | dnnddidiej 2 days ago | parent | prev [-] | | Yes please. Your shaming didn't work. Free markets' centre of gravity is biased towards capital and land owners. We need people power to balance it back. Something we poor people are all enjoying now (pssst, me and you are poor.... kings and barons are the few and rich) | |
| ▲ | Joker_vD 2 days ago | parent [-] | | I really need to start putting /s at the ends of my comments where I merely restate the currently adopted legal theory/framework in non-sugar-coated terms, don't I? The whole liberal movement has its roots in the merchants' and industrialists' desire of having as little interference from the aristocracy-heavy governments of the yore, and it really shows even to this day. |
|
|
|
| |
| ▲ | inetknght 2 days ago | parent | prev | next [-] | | Not only that, but it should be illegal (e.g. fines for the company and potential jail time for executives) to tie consent to use/purchase of services or products. Consent should be _voluntary_, not mandatory. |
| ▲ | brazzy 2 days ago | parent | prev [-] | | You mean something like a... general regulation for data protection? |
| |
| ▲ | intended 2 days ago | parent | prev | next [-] | | A significant chunk of the infrastructure that farms data is now from private organizations, who sell that information because it is a source of revenue. Government is the bogeyman we are afraid of, but ad tech is doing the actual heavy lifting. | |
| ▲ | dzhiurgis 2 days ago | parent | prev | next [-] | | Yeah nah I’d rather stop the criminals. | | |
| ▲ | alex_young 2 days ago | parent [-] | | That’s cool. The precogs over at Flock say you drive too close to the criminals though, and you know what that means. Stay loyal, stay safe, citizens. | |
| ▲ | dzhiurgis 2 days ago | parent [-] | | Criminals are long gone baby. Stop worshipping them. | | |
| ▲ | righthand 2 days ago | parent [-] | | The Potus is literally a pedophile, criminals are here to stay and winning. Your camera company supports them as long as they have money and/or control of the system. |
|
|
| |
| ▲ | eru 2 days ago | parent | prev | next [-] | | 100 miles around your border is a constitution-free zone anyway. | | |
| ▲ | Terr_ 2 days ago | parent | prev [-] | | Necessary, but not sufficient. Even if we somehow, perhaps via magic genie-wish, made the government totally disinterested... these systems would still enable dystopian levels of private surveillance and manipulation. |
|
|
| ▲ | neya 2 days ago | parent | prev | next [-] |
This should ironically start at the VC level - and that includes YC et al. Someone comes and says "hey, we got this idea, we collect facial recognition data for training proprietary AI models", and the response from the VC should be "I'm gonna stop you right there. This is unethical." Not "Did you say I can 5x my ROI? Here, shut up and take my money!" |
| |
| ▲ | Joker_vD 2 days ago | parent | next [-] | | Capital eschews no profit, or very small profit, just as Nature was formerly said to abhor a vacuum. With adequate profit, capital is very bold. A certain 10 per cent. will ensure its employment anywhere; 20 per cent. certain will produce eagerness; 50 per cent., positive audacity; 100 per cent. will make it ready to trample on all human laws; 300 per cent., and there is not a crime at which it will scruple, nor a risk it will not run, even to the chance of its owner being hanged. If turbulence and strife will bring a profit, it will freely encourage both. Smuggling and the slave-trade have amply proved all that is here stated. We today can also add crypto schemes and mass surveillance to the examples. And mind you, VCs are people who are both pretty good at earning money and also eager for even more money. That's how they got to where they are, after all, not necessarily by being virtuous (over a certain minimally required amount, or a social signaling of possessing such an amount). | |
| ▲ | kaliqt 2 days ago | parent | prev | next [-] | | YC is responsible for Sam Altman, need I say more? | | |
| ▲ | neya 2 days ago | parent [-] | | Fair enough. But, to be fair to them, they did have a falling out. There was a story on here about how it went all the way to PG and then they asked Sam to leave (something like that). I think I saw it in a comment here, really don't remember. |
| |
| ▲ | jychang 2 days ago | parent | prev | next [-] | | When you put it that way, the answer is obvious: Assume VCs are brainless profit maximizers who don't understand ethics. How do you get them to say "I'm gonna stop you right there"? Answer: Make it unprofitable to collect this data. Change the incentives. So really, the correct answer IS on the legal level. Make a set of laws which make it burdensome at best and completely unprofitable at worst, and then the incentives within the system align. | |
| ▲ | neya 2 days ago | parent [-] | | Agree with your point and the solution. Make it risky to operate - so that most VCs would wash their hands of it due to legal risks. Kind of like what happened to the crypto space. But, it always gets worse before it gets better. Tons of rug pulls happened before the SEC took action. |
| |
| ▲ | wnc3141 2 days ago | parent | prev [-] | | It seems like in the last several years, VC has been prioritizing becoming the beneficiary of the whims of the regime. In short, I wonder if this says anything about their confidence in startups' viability in the private sector |
|
|
| ▲ | 0x10ca1h0st 2 days ago | parent | prev | next [-] |
This is a great sentiment. Companies can be stopped, and then the medusa grows another head. Kill the business model, make the brokering of data illegal, and if caught, fines would be paid directly to those affected. This would go a long way toward promoting privacy first. |
|
| ▲ | Tangurena2 2 days ago | parent | prev | next [-] |
One simple remedy would be to make companies (that collect such private data) and their directors/executives jointly & severally liable[0] for any identity theft. It should come with "forever" liability equivalent to Superfund sites.[1] Notes: 0 - Financial penalties would not be limited to "your share" of the penalty. If you have money, and the other parties don't, the plaintiffs can collect from whichever defendant has money. 1 - Everyone who ever owned the site with the toxic waste is liable for the cleanup. This is why when a gas station is sold (in the US), all of the fuel tanks are dug up and replaced - this way, none of the future leakage can be attributed to the previous owners. |
|
| ▲ | someothherguyy 2 days ago | parent | prev | next [-] |
> we are quickly spiraling into the dystopia where privacy is gone We are essentially already in that dystopia. It is now more of a question of how bad it gets, and whether the population will ever stand against it in any meaningful fashion. |
| |
| ▲ | GolfPopper 2 days ago | parent [-] | | Look at the silver lining - once the paperclip maximizers have crashed both modern civilization and the biosphere, it will be easy for any survivors to find privacy amid the metaphorical and actual ruins. |
|
|
| ▲ | 01100011 2 days ago | parent | prev | next [-] |
There is a weird fetish with Flock right now. Privacy advocates have been screaming at the public for 25 years now, and suddenly the public cares and is obsessed with this one very specific company. Never mind that license plate readers have been collecting your data for decades. Never mind that you literally carry a tracking device on your person, likely 24/7. I mean, cool, stop Flock, but don't stop there. Flock is very much not the final boss in this fight. The cynic in me says we will all get bored once Flock is off the radar though. |
| |
| ▲ | the_other 2 days ago | parent | next [-] | | It’s easier to consider responses or solutions to specific problems, rather than to solve for a broad, general principle. You can see this in many areas of life. Solving for general principles requires group effort (usually); you can get buy-in from individuals if you can focus them on specific cases or subsets. So the Flock fetish is reasonable at this point, IMO. There’s also nothing inherently wrong with carrying sensors in your pocket. The “wrong” is in the providers/manufacturers exploiting the position they put themselves in. Managing data is hard, and most people don’t want to do the necessary work most of the time, so the providers/manufacturers offer to do that for their users/customers. However, they also exploit their caretaker position by treating the data like they own it too, and extracting profit. If the solutions to the Flock problems could be framed such that other providers/manufacturers had to build systems that were “local first” or “private by default” (as in pre-internet home computing plus explicit, fine-grained sharing consents), then it would also be fine to carry sensors. I want my fitness tracker and GPS. I just don’t want the data it generates used to build advertising profiles on me such that ads (and government mass surveillance dragnets) can follow my every other move. | |
| ▲ | kortex 2 days ago | parent | prev | next [-] | | I don't think it's a "weird fetish." It's just most of the things privacy advocates have been warning about - PRISM, warrantless metadata requests, tech companies handing over data - are all largely invisible. A camera pointing at your child's playground or gymnastics class is much more salient. | |
| ▲ | nobody9999 2 days ago | parent | prev [-] | | >Nevermind license plate readers have been collecting your data for decades. Nevermind you literally carry a tracking device on your person, likely 24/7. While the above is a difference in scale, the various "credit bureaus" have been doing this stuff for much, much longer.[0][1][2] That's not to excuse the use of ALPRs and tracking on mobile devices. It's all really creepy, and collection and trade in such should have strong negative incentives (company-breaking fines, loss of corporate charter, jail time, etc.). In the meantime, one has to deal with at least some of this stuff unless you're willing to go live in a lean-to in the woods. [0] https://en.wikipedia.org/wiki/Equifax#History [1] https://en.wikipedia.org/wiki/Experian#History [2] https://en.wikipedia.org/wiki/TransUnion#History |
|
|
| ▲ | itomato 2 days ago | parent | prev | next [-] |
| Without their unique business model, what is the company? The product? What is the addressable market for ubiquitous public surveillance devices? Who is the customer? |
|
| ▲ | King-Aaron 2 days ago | parent | prev | next [-] |
| > I don't want to stop Flock the company. I want to stop Flock the business model, along with all the other mass surveillance, and the data brokers. Then you want to stop the company. Which is reasonable. |
| |
| ▲ | ceejayoz 2 days ago | parent [-] | | Flock isn’t the only company. | | |
| ▲ | jojobas 2 days ago | parent [-] | | One company getting destroyed could be a sign for others. | | |
| ▲ | NegativeK 2 days ago | parent [-] | | The sign won't be "don't do mass surveillance". It'll be "make money off mass surveillance, but don't get caught like Flock did." |
|
|
|
|
| ▲ | NegativeK 2 days ago | parent | prev | next [-] |
| Make HIPAA include PII. That hits your toxic waste goal real fast. |
| |
|
| ▲ | boriskourt 2 days ago | parent | prev | next [-] |
| If we can set a legal precedent then this can cascade into policy, or an enforceable standard much faster. |
|
| ▲ | 2 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | heyethan 2 days ago | parent | prev [-] |
| [dead] |