r2vcap 5 hours ago
Well, it’s a clever idea. Discord seems to have intentionally softened its age-verification steps so it can tell regulators, “we’re doing something to protect children,” while still leaving enough wiggle room that technically savvy users can work around it. But in practice, this only holds if regulators are either inattentive or satisfied with checkbox compliance. If a government is competent and motivated, this approach won’t hold up, and it may even antagonize regulators by looking like bad-faith compliance.

I’ve also heard that some governments are already pushing for much stricter age-verification protocols, precisely because people can bypass weaker checks, for example by using a webcam with the face partially covered to confuse ID/face matching. I can’t name specific vendors, but some providers are responding by deploying stronger liveness checks that are significantly harder to game. And many services are moving age verification into mobile apps, where simple JavaScript-based tricks are less likely to work.
tyre 2 hours ago
> Discord seems to have intentionally softened its age-verification steps so it can tell regulators, “we’re doing something to protect children,” while still leaving enough wiggle room that technically savvy users can work around it.

Source? I sincerely doubt that Discord’s lawyers advocated for age verification that was hackable by tech-savvy users. It seems more likely that they are trying to balance two things:

1. Age-verification requirements

2. Not storing or sending photos of people’s (children’s) faces

Both of these are very important, legally, to protect the company. It is highly unlikely that anyone in Discord’s leadership, let alone compliance, is advocating for backdoors (at least for us).