u12 | 4 hours ago
> Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as “a very, very, very high strike threshold.”

I don’t get it. Is sex-trafficking-driven user growth really so significant for Meta that they would have such a policy?
aprilthird2021 | 3 hours ago
The "catching" is probably some kind of automated detection scanner with an algo they don't fully trust to be accurate, so they have some number of "strikes" that will lead to a takedown. | ||
SpicyLemonZest | 3 hours ago
Of course it's not. We could speculate about how to square this with reason and Meta's denial; perhaps some flag associated with sex trafficking had to be hit 17 times, and some people thought the flag was associated with too many other things to lower the threshold. But the bottom line is that hostile characterizations of undisclosed documents aren't presumptively true.
delis-thumbs-7e | 2 hours ago
We don’t know. But as you can read in the article, Meta’s own employees were concerned about it (and many other things). For Zuck it was not a priority, as he said himself.

We can speculate. I think they just did not give a fuck. Limiting grooming and abuse of minors usually requires limiting those minors’ access to various activities on the platform, which means those kids go somewhere else. Meta specifically wanted to promote its use among children under 13 to stimulate growth; that this often made the platform dangerous for minors was not seen as their problem. If your company is driven by growth über alles à la venture capitalism, growth comes before everything else. Including child safety.