swiftcoder 4 hours ago

One of the bigger commercial niches for smart glasses is filming POV porn, so it is hardly surprising that sort of content ended up in the moderation queue. The project should have planned to account for that use case.

swiftcoder 3 hours ago | parent | next [-]

And I do appreciate how awkward it is for Meta to admit that use case exists. Even in the Oculus Go days there were a bunch of polite euphemisms internally to avoid mentioning "our device has to ship with a browser so people can watch porn on it"

hosteur 3 hours ago | parent | prev | next [-]

Why is there even a “moderation queue”? Aren’t these people’s private recordings?

dylan604 3 hours ago | parent | next [-]

This is my question too. I get moderating things that people are posting publicly. Not being familiar with the device and how it works, I'd assume that all footage is uploaded to the user's cloud account even if not publicly posted. Since this is cloud storage, Meta is "moderating" the footage to ensure CSAM or other restricted types of footage are not being stored on their (Meta's) platform. That's my very generous take on it, not that I believe it.

inerte 2 hours ago | parent | prev | next [-]

Yes, but we also don't want people live-streaming murder and suicide, so there's detection and moderation in place.

jdiff an hour ago | parent [-]

Private recordings aren't public live streams.

intended 3 hours ago | parent | prev [-]

I’m betting this is going to some ML / Data labelling pipeline.

swiftcoder 3 hours ago | parent [-]

Yeah, moderation may instead be labelling in this case. It's likely the same type of firm handles both sorts of work on behalf of FAANG.

intended 2 hours ago | parent [-]

Sounds plausible.

We could also toss a vibe-coded mess on top of this and probably get closer to the truth.

swiftcoder 2 hours ago | parent [-]

The article itself is ambiguous on this point: "At the time of the publication, Meta admitted subcontracted workers might sometimes review content filmed on its smart glasses when people shared it with Meta AI."

That could be moderation, or it could be labelling new examples for training/validation.

intended an hour ago | parent [-]

This feels like an instance of weasel words. One can scarcely imagine any legitimate reason to do content moderation on people's own private, personally consumed data.

ozozozd 2 hours ago | parent | prev [-]

How do you even moderate what people do? Do you send someone to stop them from having sex because it was streamed to your servers?