| |
| ▲ | ldoughty 5 hours ago | parent | next [-] | | But the data collected is property of the government, and Flock is not allowed to use that data for additional business gain (according to their statements)... So they can't sell the fact that you were at Target at 8:00 p.m. on Thursday to anybody, nor build profiles to sell to advertisers. And if that's the case, it's very similar to cloud storage vendors. If I access Hacker News and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the servers, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it's their responsibility to act on my request to delete it. AWS' stance would rightly be "talk to the website operator for CCPA requests". | | |
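A minimal sketch of that controller/operator split, assuming a hypothetical site operator that writes visit logs to its own S3 bucket under keys only it can map back to a visitor (bucket name and key scheme are made up for illustration): the CCPA deletion happens in the operator's code, while AWS only stores the bytes.

    # Hypothetical site-operator code: only the operator knows which object
    # keys belong to which visitor, so the deletion calls come from it, not AWS.
    import hashlib

    import boto3

    BUCKET = "example-site-visit-logs"  # assumption: operator-owned bucket
    s3 = boto3.client("s3")

    def visitor_prefix(visitor_id: str) -> str:
        # Assumption: visit logs are stored under a prefix derived from the
        # operator's own visitor identifier.
        return f"visits/{hashlib.sha256(visitor_id.encode()).hexdigest()}/"

    def handle_ccpa_deletion(visitor_id: str) -> int:
        """Delete every stored log object for one visitor; returns count removed."""
        deleted = 0
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=BUCKET, Prefix=visitor_prefix(visitor_id)):
            for obj in page.get("Contents", []):
                s3.delete_object(Bucket=BUCKET, Key=obj["Key"])
                deleted += 1
        return deleted

The point of the sketch is only that the party holding the key-to-person mapping is the one that can honor the request; AWS has no notion of "visitor" to resolve it against.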
| ▲ | valeriozen 3 hours ago | parent | next [-] | | The AWS analogy breaks down because AWS doesn't encourage customers to pool their S3 buckets into a nationwide searchable index. Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to? | | |
| ▲ | giancarlostoro 3 hours ago | parent [-] | | Start standing in front of the cameras looking sketchy long enough till police are sent out to ya, then ask the cop who called. | | |
| ▲ | chaps 2 hours ago | parent [-] | | Someone once dropped some fireworks not too far from me at 3am a few years back. They were loud and, yeah, cops were called. A few minutes later about five cars drove past me going about 30mph over the limit. Not sure how they didn't see me or try to see me. But I know they didn't catch the BRIGHT orange and lifted car. Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs. Instead of pulling over the easily identifiable car, they pulled over some random guy. They were behind him the whole time; five cop cars pulled in behind him thinking he had fired a gun a few minutes back. He was let go without a citation, but the official reason for the stop, despite it being paired with the dispatch log for the fireworks, was a broken headlamp. | | |
| ▲ | giancarlostoro 12 minutes ago | parent [-] | | I may or may not know a business owner who got criminals off his business' street by saying he thought he saw a gun any time criminals showed up to do things, everything from prostitution to selling drugs. Cops showed up immediately. They stopped coming by altogether; it's probably the safest street in quite a rough part of town. It's crazy how cops rush to very specific and nuanced crimes. Someone likely said they heard gunshots, and then they scrambled to find them. |
|
|
| |
| ▲ | jnovek 3 hours ago | parent | prev | next [-] | | I don’t care. I don’t care who owns the data. If I can’t easily get private information like my movements removed from a database like this, the legislation does not sufficiently protect me. It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop. | | |
| ▲ | lazide 3 hours ago | parent | next [-] | | A reasonably nuanced defense could likely claim that being able to do what you want would have much worse side effects on privacy. For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units? For them to actually be able to do that would require that they have nigh-omniscience over everything stored by/for everyone in every one of their storage units. Even inside closed boxes. Now, it's not the same thing of course - but hopefully you understand what I'm referring to? | | |
| ▲ | LadyCailin 2 hours ago | parent [-] | | Except that the analogy is that they already have, or can easily create, that list. If they couldn’t, their value proposition would be lame. “We know you’re looking for a specific license plate, here’s a million hours of footage from all over the city, have at looking through it all.” | | |
| ▲ | lazide 2 hours ago | parent [-] | | Only for paying customers, which you aren't, of course. If those customers paid Public Storage to inventory their stuff, then that inventory is their property. Surely it would be inappropriate to use their inventory data to find your naked photos. A violation of privacy, even. (/s, kinda) I was enumerating the likely defense, not saying it's valid. |
|
| |
| ▲ | tptacek 3 hours ago | parent | prev [-] | | The law cares about lots of things we don't care about. |
| |
| ▲ | tptacek 5 hours ago | parent | prev | next [-] | | This is also true according to their contracts (we were one of the first munis in the country to ostentatiously cancel our Flock contract, and the lead-up to that was a bunch of progressive legal experts poring over that contract looking for holes). | | |
| ▲ | fsckboy 4 hours ago | parent [-] | | >a bunch of progressive legal experts poring over that contract looking for holes All attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, and they can still interpret what the law means for you. If you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both conservative and progressive judges. | | |
| ▲ | shermantanktop 3 hours ago | parent [-] | | As a practical matter, this may be good advice. But it also places a demand on someone with a legitimate concern that they go find an ideological "beard" to make themselves more palatable and sympathetic. It's not hard to see how this enables an institution to gate itself from criticism. |
|
| |
| ▲ | eagleinparadise 2 hours ago | parent | prev | next [-] | | If I lease out a property to a tenant (apartment, retail, industrial use, whatever) and that tenant is committing an illegal activity on the property, would the landlord be liable for knowing about it? Or not? "Sorry FBI, the tenant renting my warehouse to manufacture cocaine is not my responsibility. I won't do anything about it. You deal with them." Nope, that's a failure of a duty to act, and aiding and abetting criminal activity, if you have constructive knowledge. | 
| ▲ | thaumaturgy 4 hours ago | parent | prev | next [-] | | Except that Flock very clearly benefits financially from having direct access to this data: owning (and in their own documentation, they very clearly do own it) a network of 80,000 surveillance devices across the country, and owning every single transit point for the data they collect, is what gets them to a $7.5 billion valuation from investors. The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data. (After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very familiar with the dirty ball they play.) Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder? | | |
| ▲ | necovek 4 hours ago | parent [-] | | That makes them a data broker in my reading, and at least in California, the Data Broker legislation should apply. The CA Data Broker registry gives me "access denied", but that could be because I am outside the US. | | |
| ▲ | ScoobleDoodle 4 hours ago | parent [-] | | I looked it up at https://cppa.ca.gov/data_broker_registry/ and didn't find Flock / Flock Safety in that list of the currently registered 566 data brokers. | | |
| ▲ | tptacek 4 hours ago | parent [-] | | Because Flock isn't a data broker. Flock's customers own their data, not Flock, and they use Flock's platform voluntarily to share data with other customers. | | |
| ▲ | tadfisher 3 hours ago | parent | next [-] | | Flock charges for access to the data that is voluntarily shared by other customers. I am struggling to see a difference between this practice and any other data brokerage service in existence. Does Flock do some kind of P2P dance to avoid the data transiting their systems? | 
| ▲ | necovek 4 hours ago | parent | prev | next [-] | | I was referring to the claim that "Flock's cameras collect more data than is provided to police agencies" — that suggests that there is data not "owned" by the customers, which implies it's Flock's data, thus it might make them liable under Data Broker legislation. | |
| ▲ | cwillu 4 hours ago | parent | prev | next [-] | | Equivocation. My stock broker doesn't own my stocks either, they merely hold my assets in a brokerage account. | | |
| ▲ | tptacek 4 hours ago | parent | next [-] | | I encourage you to present that analogy to an actual court and see how far it gets you. It's very easy to find the statutory definition of a "data broker" under California law. This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way? | |
| ▲ | jaredwiener 3 hours ago | parent | prev [-] | | And you would (rightfully) be angered if your stock broker sold your shares and pocketed the proceeds, because you own them. |
| |
| ▲ | 3 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | close04 3 hours ago | parent | prev | next [-] | | So… Flock uses their own platform and top-to-bottom tech stack to do everything, technically? Your local PD doesn't use random cameras (like Reolink), doesn't run a custom software stack (like Frigate in a container on some random VM hosted with AWS), doesn't store the data wherever (like Backblaze)? The customers just have to install the Flock cameras and "order" the subsequent data from Flock? But you say they're not at all responsible or accountable for any of it because, despite doing everything at every step, they're "just a broker"? | 
| ▲ | unethical_ban 3 hours ago | parent | prev [-] | | If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad. I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go run through their policies. Either way, ALPRs and AI facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another. |
|
|
|
| |
| ▲ | 5 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | unethical_ban 3 hours ago | parent | prev [-] | | This is worth validating independently, but to be clear: are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central data lake? That every organization's data is completely, unalterably separate from everyone else's? If so, that makes the panopticon slightly less powerful. |
| |
| ▲ | mminer237 5 hours ago | parent | prev [-] | | If you go to Rent-A-Center and rent a DSLR, that doesn't make Rent-A-Center responsible for the pictures taken by their cameras. | | |
| ▲ | yabutlivnWoods 5 hours ago | parent | next [-] | | Your example is apples and oranges. Flock maintains private infrastructure that stores data. If the DSLR uploaded the pictures to Rent-A-Center-owned/leased servers, it would in fact require Rent-A-Center to take the necessary steps. As Rent-A-Center would be the only group with proper access to the data storage, they would have inserted themselves into the chain of custody, and thereby have an obligation to ensure others' data is wiped from systems they control. | | |
| ▲ | tptacek 5 hours ago | parent [-] | | AWS also maintains private infrastructure that stores data. Go write them asking to purge data pertaining to you from S3 and see how that goes. | | |
| ▲ | itsdesmond 4 hours ago | parent | next [-] | | Flock has knowledge and use of the data. Their systems can relate photos "owned" by two different entities. They're interacting with the data and selling their access to it as a feature. That's obviously distinct from S3. But you knew that. | | |
| ▲ | tptacek 4 hours ago | parent [-] | | I know quite a bit about Flock, having been intimately involved in the process of evicting it from our municipality, and I don't think the distinction you're trying to draw here is meaningful. Flock will say they provide a service, one avidly sought by the actual owners of the data, to generate analysis based on that data. They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case. | | |
| ▲ | dureuill 2 hours ago | parent [-] | | Except their customer's data isn't actually theirs: OP requested that their private data be deleted from the system, so OP expressed a clear intent for their data not to be used by Flock's customer. We could say that the data thus becomes abusively retained on these systems. As a result, IF Flock has the technical means of performing the requested data deletion, it should be compelled to perform it. This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content. Reasonable technical feasibility for the service provider is key here, but it can be argued to exist, since the data can apparently be shared in ways that identify OP. Probably not how the law currently works (don't know, not a lawyer), but I think it should be, as otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of that data to have it removed from the platform. | | |
| ▲ | tptacek 2 hours ago | parent [-] | | I do not believe this is how the law works. Two totally different regimes. |
|
|
| |
| ▲ | yabutlivnWoods an hour ago | parent | prev | next [-] | | I don't live in a state with a law like California's, so your "gotcha" isn't relevant. Californians would have standing under the law but need expensive lawyers to litigate. AWS has employed expensive lawyers to argue semantics: they host OS VMs and databases, and this provides them legal cover for what AWS customers store. Amazon the retailer stores customer data; a non-customer would have standing under California law to litigate removal of PII should they decide to hire lawyers. Your reductionism is to law what a Linux beige box on a routable IP, no firewall, hosting a production health database with creds set to admin/pwd1234 is to software engineering. Coincidentally, 1234 happens to be the code to my luggage. | 
| ▲ | danudey 4 hours ago | parent | prev | next [-] | | If AWS maintained private infrastructure that stored and indexed data associated with people's license plates and vehicles, and then charged customers to run searches against that data, then yes, you could write them to ask them to purge data pertaining to you. If Flock were just an opaque cloud storage service for law enforcement to back up their mass surveillance to, then sure, your argument would have merit; it's not. It's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage; they're selling (leasing?) surveillance devices and tools. | | |
| ▲ | tptacek 4 hours ago | parent [-] | | The argument you're making implicates way more than just Flock, and is in a practical sense novel. If you can cite jurisprudence (or even legal experts) backing it up, I'm interested in reading it. Otherwise, I'm happy to accept that we just have premises about the law that are too far apart for an argument to be productive. My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is. | | |
| ▲ | Karrot_Kream 3 hours ago | parent [-] | | Realistically speaking, you're never going to get pro-Flock people writing comments on this site in any numbers. The anti-surveillance position's popularity when it comes to upvotes, downvotes, and flags on this site is such that antis will continue posting about what they want the law to be and pros will stay out. That's just how crowd voting dynamics shake out. |
|
| |
| ▲ | 4 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | 4 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | 4 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | Mordisquitos 4 hours ago | parent | prev [-] | | Does AWS actively and by design parse and keep track of personally identifiable information in the data that AWS customers store in their S3 buckets? If that were the case, they would absolutely be subject to CCPA (and GDPR) requests for deletion. However, I suspect that is not the case: AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it. |
|
| |
| ▲ | danielsunsu 5 hours ago | parent | prev | next [-] | | I think if it were only offline storage it would not be as big an issue. A more accurate analogy would be renting a DSLR that automatically transmits every picture to Rent-A-Center's servers. | 
| ▲ | kstrauser 5 hours ago | parent | prev [-] | | If Rent-A-Center installed the camera in a bathroom, I'd contend that it does. Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.") | | |
|
|