espadrine 4 hours ago

When Meta prevented the EU from using meta.ai or even downloading its vision models, I buried my head in the AI legislation.

Here, I am honestly not sure which part they rely on to argue that what they made might be unlawful.

The closest thing I found for Meta was that “emotion recognition systems” are classified as high-risk (paragraph 54), and high-risk systems must have their training data disclosed (Art 11(1))[0]. In theory, you could upload photos to meta.ai and ask it what emotions are displayed, but it is already a stretch. For GenChess, I’m at a loss; it doesn’t sound like you can do that. (Not that it prevented any vision chatbot from releasing.)

If someone has a better guess as to why they might have held it back here, I am curious.

[0]: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A...

sunbum 2 hours ago | parent [-]

It's simple: big tech doesn't like regulation. By pretending the regulation won't let them release "all these cool things," they can turn public opinion against regulation.

lesuorac an hour ago | parent [-]

See: Cookie Banners.

Nothing in the law requires a banner. It could even be handled in the browser by letting people choose what third-party cookies to accept (or none, hence the problem) and having that be negotiated during page load.
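That kind of browser-side negotiation already has a real precursor in the Global Privacy Control signal, which browsers can send as a `Sec-GPC: 1` request header. A minimal sketch of a server honoring it during page load, instead of showing a banner (the function names and cookie values here are made up for illustration):

```python
def tracking_cookies_allowed(headers: dict) -> bool:
    """Return True only if the browser did not send an opt-out signal."""
    # "Sec-GPC: 1" is the Global Privacy Control header: the user
    # has opted out of third-party tracking before the page loads.
    return headers.get("Sec-GPC") != "1"

def build_set_cookie_headers(headers: dict) -> list:
    """Decide which cookies to set based on the negotiated preference."""
    # First-party session cookie: needed to serve the page, always set.
    cookies = [("session", "abc123; HttpOnly")]
    # Hypothetical tracking cookie: only set when the browser allows it.
    if tracking_cookies_allowed(headers):
        cookies.append(("ad_id", "xyz789; SameSite=None; Secure"))
    return [f"Set-Cookie: {name}={value}" for name, value in cookies]
```

With this shape, the preference rides along with the request itself, so no per-site dialog is ever needed.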

It's nice that the law is being interpreted to require that rejecting all local storage of others' data be as easy as accepting it.