avalys 18 hours ago
That’s not how courts and laws work. If you cheat and use other illegal factors to compute a premium for each person, and then create an AI model that effectively looks up each person's illegally calculated premium by their location, they're going to reach the obvious conclusion: you are calculating a premium using illegal factors.
kyboren 16 hours ago | parent
Like I said: it's only useful to have location in your model's domain if the model function itself has side-channel information embedded in it about what those location data mean for the "correct" premium. What we're talking about here is just a way to embed far more of that information in a model function than a human reasonably could.

Given the magnificent ability of DNN models to serve as obfuscatory black boxes, and the general techno-ignorance of legislators and regulators, I suspect that "AI laundering" your violations is actually a very effective way to juke all sorts of laws and regulations.

But both of us are just speculating. If you have insider industry knowledge, or can point to regulatory guidance and/or enforcement actions in this area that corrects or confirms my understanding, I would love to read about it.
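To make the mechanism concrete, here is a minimal, entirely hypothetical sketch (synthetic data, made-up feature names and dollar amounts, a tiny scikit-learn MLP standing in for a production DNN). The training targets are computed from a protected attribute the model never receives as input; the deployed model only ever sees location, yet it reproduces the prohibited surcharge, functioning as a laundered lookup table:

    # Hypothetical sketch of "embedding the side-channel in the model function".
    # All data and magnitudes are invented for illustration.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 5_000

    # Protected attribute (never an input to the model) and a location feature
    # that correlates with it, as real geography often does.
    protected = rng.integers(0, 2, size=n)                    # prohibited rating factor
    zip_code = protected * 50 + rng.integers(0, 50, size=n)   # stand-in for ZIP: 0-49 vs 50-99

    # Premium computed the "illegal" way: base rate plus a surcharge on the
    # protected group (units: thousands of dollars).
    premium = 1.0 + 0.4 * protected + rng.normal(0, 0.02, size=n)

    # The deployed model only ever sees location...
    X = zip_code.reshape(-1, 1).astype(float)
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
    )
    model.fit(X, premium)

    # ...yet it recovers the protected-attribute surcharge almost exactly.
    print(model.predict([[10.0]]))  # ~1.0 ($1,000): ZIPs tied to protected == 0
    print(model.predict([[90.0]]))  # ~1.4 ($1,400): ZIPs tied to protected == 1

The sketch only illustrates the mechanism: an audit of the model's inputs finds nothing but ZIP code, while the disparate pricing lives in the fitted weights. Whether regulators would treat that as straightforward proxy discrimination or let it slide is exactly the open question above.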