yabutlivnWoods 5 hours ago

Your example is apples and oranges. Flock maintains private infrastructure that stores data.

If the DSLR uploaded them to Rent-A-Center owned/leased servers, it would in fact require Rent-A-Center to take the necessary steps.

As Rent-A-Center would be the only party with proper access to the data storage, it would have inserted itself into the chain of custody, and thereby taken on an obligation to ensure others' data is wiped from systems it controls.

tptacek 5 hours ago | parent [-]

AWS also maintains private infrastructure that stores data. Go write them asking to purge data pertaining to you from S3 and see how that goes.

itsdesmond 4 hours ago | parent | next [-]

Flock has knowledge/use of the data. Their system processes can relate the photos “owned” by two different entities. They’re interacting with it and selling their access to it as a feature. That’s obviously distinct from S3.

But you knew that.

tptacek 4 hours ago | parent [-]

I know quite a bit about Flock, having been intimately involved in the process of evicting it from our municipality, and I don't think the distinction you're trying to draw here is meaningful. Flock will say they provide a service, one avidly sought by the actual owners of the data, to generate analysis based on that data.

They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.

dureuill 2 hours ago | parent [-]

Except their customer's data isn't actually theirs: OP requested their private data to be deleted from the system. So OP expressed a clear intent for their data not to be used by Flock's customer. We could say that the data thus becomes abusively retained on these systems. As a result, IF Flock has the technical means of performing the requested data deletion, it should be compelled to perform it.

This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.

Reasonable technical feasibility for the service provider is key here, but it can arguably be established, since the data can apparently be shared in ways that identify OP.

Probably not how the law currently works (don't know, not a lawyer), but I'd argue it should, as otherwise it allows creating a platform that shares abusively retained data without the subjects of that data having any reasonable recourse to remove it from the platform.

tptacek 2 hours ago | parent [-]

I do not believe this is how the law works. Two totally different regimes.

yabutlivnWoods an hour ago | parent | prev | next [-]

I don't live in a state with a law like California's so your "gotcha" isn't relevant.

Californians would have standing under the law but need expensive lawyers to litigate.

AWS has employed expensive lawyers to argue semantics; they host OS VMs and databases. This provides them legal cover for what AWS customers store.

Amazon the retailer stores customer data. A non-customer would have standing under California law to litigate removal of PII should they decide to hire lawyers.

Your reductionism is to law what a Linux beige box on a routable IP, no firewall, hosting a production health database with creds set to admin/pwd1234 is to software engineering.

Coincidentally 1234 happens to be the code to my luggage.

danudey 4 hours ago | parent | prev | next [-]

If AWS maintained private infrastructure that stored and indexed data associated with people's license plates and vehicles and then charged customers to do searches against that data then yes, you could write them to ask them to purge data pertaining to you.

If Flock were just an opaque cloud storage service for law enforcement to back up their mass surveillance to, then sure, your argument would have merit. But it's not; it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.

tptacek 4 hours ago | parent [-]

The argument you're making implicates way more than just Flock, and is in a practical sense novel. If you can cite jurisprudence (or even legal experts) backing it up, I'm interested in reading it. Otherwise, I'm happy to accept that we just have premises about the law that are too far apart for an argument to be productive.

My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.

Karrot_Kream 3 hours ago | parent [-]

Realistically speaking, you're never going to get pro-Flock people in any numbers on this site writing comments at all. The anti-surveillance position's popularity when it comes to upvotes, downvotes, and flags on this site is such that pros will continue posting about what they want the law to be and antis will stay out. That's just how crowd voting dynamics shake out.

Mordisquitos 4 hours ago | parent | prev [-]

Does AWS actively and by design parse and keep track of personally identifiable information in the data that AWS customers store in their S3 buckets? If that were the case, they would absolutely be subject to CCPA (and GDPR) deletion requests.

However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
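The responsibility split described above can be sketched as a toy model (this is illustrative Python, not real AWS code; the class and method names are invented for this example). The storage layer treats payloads as opaque bytes and keeps no index relating them to people, so a third party's deletion request cannot be routed to an object; only the customer, who knows what a key means, can issue the delete.

```python
# Toy model of a content-agnostic object store. The provider never
# inspects payloads, so it cannot know which objects contain a given
# person's PII; only the bucket owner can. Names are hypothetical.

class AgnosticStore:
    """Stores opaque bytes keyed by (bucket, key); never parses content."""

    def __init__(self):
        self._objects = {}

    def put(self, bucket, key, payload: bytes):
        self._objects[(bucket, key)] = payload

    def delete(self, bucket, key):
        # The provider executes deletes mechanically; it has no notion
        # of "all objects about person X" to act on.
        self._objects.pop((bucket, key), None)

    def contains(self, bucket, key):
        return (bucket, key) in self._objects


store = AgnosticStore()
store.put("customer-bucket", "plates/abc123.jpg", b"\x89PNG...")

# The customer, who knows the key refers to PII, performs the deletion:
store.delete("customer-bucket", "plates/abc123.jpg")
print(store.contains("customer-bucket", "plates/abc123.jpg"))  # False
```

The contrast with Flock in the thread above is that Flock's systems do relate stored records to identifiable people (plates, locations, times), so the "agnostic provider" framing arguably doesn't apply to them.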