| ▲ | nebezb 5 hours ago |
| I’ve spent just a teeny bit of time helping international ICE investigators (not that one; internet child exploitation) postpone PTSD with technology. It seems like after two years on the job, they’re going to have a mental break, so postponing is all you can really do. It’s disheartening how underfunded these agencies are relative to the severity of the crimes they’re up against, or at least how severe those crimes feel. These folks are heroes. This is one place AI has a lot of potential (but very little commercial value). |
|
| ▲ | Gigachad 5 hours ago | parent | next [-] |
| Moderation feels like one of the most ethical uses of AI: being able to prevent a lot of the worst content from being posted and to keep people from being exposed to it. |
| |
| ▲ | throwaway290 2 hours ago | parent [-] | | Sure... so now we end up with people watching abuse 9 to 5 to train AI. https://www.theguardian.com/global-development/2026/feb/05/i... | | |
| ▲ | Gigachad 31 minutes ago | parent | next [-] | | I don't think the issue here is related to AI. Without AI, moderators would still have to look at these same videos; the difference is that the videos would hit the public before being flagged and sent to moderators. With AI, they can be prevented from ever going public. The fact that we still need to traumatize workers to confirm the automated decisions is sad. The only other ways I can see to resolve this would be either to blindly trust the AI result without any human oversight, or to require all Facebook users to link a government ID to their accounts and only allow posting by users in countries where the authorities arrest the people posting these things. | |
| ▲ | emsign 41 minutes ago | parent | prev [-] | | Outsourcing everything, even the PTSD from training AI, to India so privileged law enforcement officers and social media moderators don't have to bear it. This system is so hypocritical and broken. |
|
|
|
| ▲ | itishappy 4 hours ago | parent | prev | next [-] |
| Another comment mentioned ICE as well, so I've been looking into it, and imagine my surprise to learn that ICE (yes, that one) has been working in this space since the Obama admin. Huh. https://www.ice.gov/careers/hero https://en.wikipedia.org/wiki/Justice_for_Victims_of_Traffic... |
| |
| ▲ | palmotea 2 hours ago | parent | next [-] | | Yeah, I looked into it, and ICE actually has two distinct components: Homeland Security Investigations (HSI) and Enforcement and Removal Operations (ERO). Pretty much everyone thinks ICE == ERO, so you've got stuff like Canadians agitating to close the HSI collaboration offices in Canada. | |
| ▲ | leoqa 3 hours ago | parent | prev | next [-] | | Prior to this administration, HSI was the main investigative body responsible for human trafficking and crimes against children. It's the second largest federal investigative agency behind the FBI (6k agents). Now it's doing immigration enforcement. | | |
| ▲ | zo1 7 minutes ago | parent [-] | | It's unfortunate that they are being repurposed to address a problem generated purposefully for political gain. Those individuals should never have been allowed to flood the system and pull effort away from the victims of truly egregious crimes. |
| |
| ▲ | zdragnar 3 hours ago | parent | prev [-] | | Coyotes are frequently part of criminal organizations. They take advantage of people in any and every way that they can. Slavery, sexual and otherwise, is not at all an uncommon result of being brought into the country under the radar, so to speak. |
|
|
| ▲ | meowface an hour ago | parent | prev [-] |
| AI for helping mitigate PTSD, or AI for helping with the investigations? Because the latter basically entails helping create a surveillance state. In theory that could be an acceptable trade-off, but it seems disingenuous to say "AI companies have no financial incentives here" when the big issue is that AI companies would actually be helping to establish powerful dragnet surveillance capabilities. There would need to be a strong democratic process around this. |