qoez 8 hours ago
Do regulators really care about a predicted age? I feel like they require hard proof of being above age before showing explicit content. The only ones that care about a predicted age are advertisers.
Retr0id 8 hours ago
In the UK, age verification just has to be "robust" (not further specified). Face scanning, for example, is explicitly designated as an allowed heuristic.
Imustaskforhelp 8 hours ago
It's not so much for the regulators as it is for the advertisers. At this point, just use Gemini if you need SOTA (yes, it's Google and has its issues), or try chat.z.ai, which I've been using more and more for simple text issues (like "hey, can you fix this Docker issue?"). I feel like chat.z.ai is pretty good, plus the models are open source (honestly, chat.z.ai feels pretty SOTA to me).
heliumtera 8 hours ago
Any proof that you are above a certain age will also expose your identity. That is the only reason regulators care about children's safety online: because they care about ID. LLMs are very good at profiling users on Hacker News and finding alt accounts, for example; profiling is the best use case for LLMs. So there you go, maybe it won't give exactly what regulators say they want, but it will give exactly what they truly want.