mirzap 3 days ago
Yup, I agree. And this is why I think mass surveillance isn't just another technology to regulate. The chilling effect isn't misuse; it's the default: continuous, opaque observation changes behavior by itself. Because it's centralized and unavoidable, people self-censor and conform; you don't need arrests once everyone assumes they're being scored. We don't yet have long-run examples of fully algorithmic surveillance societies, so the outcome isn't certain. But if these dynamics scale, the risk is trading experimentation for legibility. Problems get hidden, metrics look clean, and warning signals vanish. When real stress hits, responses are late and blunt: overcorrection, cascading failures, accelerated exit. Stability holds until it doesn't.
stevenjgarner 3 days ago | parent
I think an especially heinous variant is the use of Zero-Knowledge (ZK) proof technologies in which a centralized attestation authority (e.g. a government agency) verifies compliance, and the verifier (e.g. a business that must prove compliance) relies on the ZK cryptographic proof of compliance without learning the individual's identity. This "revocable privacy" lets the authority unmask the real identity whenever activity is asserted to be "suspicious". This is the current direction of mainstream technology, and all it accomplishes is normalizing the loss of privacy and anonymity.
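To make the trust structure concrete, here is a deliberately simplified Python sketch of the revocable-privacy pattern described above. It models only the roles (authority, pseudonymous user, verifier) and the escrow that enables unmasking; an HMAC stands in for a real ZK proof, and all names (`AttestationAuthority`, `issue`, `unmask`, the example identity) are hypothetical illustrations, not any real system's API. In a real deployment the verifier would check a publicly verifiable proof or signature rather than hold the authority's key.

```python
import hashlib
import hmac
import secrets

class AttestationAuthority:
    """Toy model: issues pseudonymous compliance credentials it can revoke.

    The key point is structural: the verifier learns only "this pseudonym
    is compliant", while the authority alone keeps the pseudonym -> identity
    escrow that lets it deanonymize on demand.
    """

    def __init__(self):
        self._key = secrets.token_bytes(32)   # authority's secret key
        self._escrow = {}                     # pseudonym -> real identity

    def issue(self, real_identity: str) -> tuple:
        """Issue a (pseudonym, credential) pair after checking compliance."""
        pseudonym = secrets.token_hex(16)
        self._escrow[pseudonym] = real_identity   # escrow enables later unmasking
        credential = hmac.new(self._key, pseudonym.encode(),
                              hashlib.sha256).hexdigest()
        return pseudonym, credential

    def verifier(self):
        """Return a verification function for businesses.

        Stand-in for ZK verification: checks the credential is genuine
        without ever touching the escrow, so the verifier never sees
        the real identity.
        """
        key = self._key
        def verify(pseudonym: str, credential: str) -> bool:
            expected = hmac.new(key, pseudonym.encode(),
                                hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, credential)
        return verify

    def unmask(self, pseudonym: str) -> str:
        """On asserted 'suspicious' activity, reveal the real identity."""
        return self._escrow[pseudonym]

authority = AttestationAuthority()
pseudonym, credential = authority.issue("alice@example.com")
verify = authority.verifier()

print(verify(pseudonym, credential))          # business: compliant, identity unseen
print(authority.unmask(pseudonym))            # authority: can still deanonymize
```

The asymmetry is the whole point of the objection above: verification is decentralized and privacy-preserving in appearance, but the deanonymization capability stays concentrated in one party.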