satellite2 14 hours ago

Interesting. When it's the state, the overwhelming opinion is that predictive policing is dangerous, but when it's a private company we actually want it enforced?

itsdesmond 14 hours ago

They couldn't have been held accountable for failing to warn her if they had never done the analysis. But they did. Their organizational conclusion was that the trip was potentially unsafe. Shit, they could have just cancelled the ride dynamically and re-assigned her. Why wouldn't they do that? It'd probably be more expensive. Maybe they'd get more cancelled rides. Maybe this woman wouldn't have been raped by an agent of Uber, selected and sent to her by them.
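
To be clear about what "cancel dynamically and re-assign" could even look like, here's a rough Python sketch. The risk model, the stand-in feature, and the threshold are all made up, since Uber publishes none of this:

    # Hypothetical sketch: if the model flags a pairing as risky, pick
    # another driver, and warn the rider if nobody clears the bar.
    from dataclasses import dataclass

    RISK_THRESHOLD = 0.8  # assumed cutoff chosen by the operator

    @dataclass
    class Driver:
        driver_id: str
        complaint_rate: float  # stand-in feature; real inputs unknown

    def risk_score(d: Driver) -> float:
        """Placeholder for whatever model produced 'potentially unsafe'."""
        return d.complaint_rate

    def assign_driver(candidates: list[Driver]) -> Driver | None:
        """Lowest-risk driver under the threshold, else None."""
        safe = [d for d in candidates if risk_score(d) < RISK_THRESHOLD]
        return min(safe, key=risk_score, default=None)

    match = assign_driver([Driver("a", 0.9), Driver("b", 0.3)])
    print(match.driver_id if match else "no safe match: warn the rider")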

satellite2 14 hours ago

Wouldn't they then expose themselves to discrimination and loss-of-revenue lawsuits from targeted drivers?

itsdesmond 14 hours ago

It depends. Are the inputs to the algorithm themselves discriminatory? If so, then yes, those suits would be appropriate. But that's a different conversation. Here, they determined the passenger might be unsafe and did nothing.

Mind you, these companies work very hard to keep us from knowing how they match A to B, usually so we don't notice things like their disregard for safety.
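
To make the input question concrete, here's a rough sketch of an input-side audit in Python; the features, data, and 0.5 cutoff are all invented:

    # A facially neutral feature can still proxy a protected attribute.
    # Measure how strongly each feature correlates with group membership.
    from statistics import correlation  # Python 3.10+

    group = [1, 1, 1, 0, 0, 0]          # protected-attribute membership
    features = {
        "zip_code_income": [30, 32, 31, 70, 68, 72],  # strong proxy
        "trips_per_week":  [10, 12, 11, 11, 10, 12],  # weak proxy
    }

    for name, values in features.items():
        r = correlation(group, values)
        flag = "possible proxy" if abs(r) > 0.5 else "ok"
        print(f"{name}: r={r:+.2f} ({flag})")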

kyleee 5 hours ago

The inputs wouldn't even matter; they could be above reproach, but if there were disparate impacts in the outcomes, a case for liability could still be made.
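
For the concrete version of that argument: the usual first-pass test in US law is the EEOC's four-fifths rule, which looks only at outcome rates by group, never at the inputs. A toy Python sketch (counts made up):

    # Disparate impact is flagged when a group's favorable-outcome rate
    # falls below 80% of the most-favored group's rate.
    from collections import Counter

    def impact_ratios(outcomes: list[tuple[str, bool]]) -> dict[str, float]:
        """outcomes: (group, got_favorable_outcome) pairs."""
        totals, favorable = Counter(), Counter()
        for group, ok in outcomes:
            totals[group] += 1
            favorable[group] += ok
        rates = {g: favorable[g] / totals[g] for g in totals}
        best = max(rates.values())
        return {g: r / best for g, r in rates.items()}  # < 0.8 flags impact

    data = ([("A", True)] * 80 + [("A", False)] * 20
            + [("B", True)] * 50 + [("B", False)] * 50)
    print(impact_ratios(data))  # {'A': 1.0, 'B': 0.625} -> B is flagged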