| ▲ | codingdave 8 hours ago |
| IANAL, but this seems like an odd test to me. Judges do what their name implies - make judgment calls. I find it reassuring that judges give different answers under different scenarios, because it means they are listening and making judgment calls. If LLMs give only one answer, no matter what nuances are at play, that sounds like they are failing to judge and instead are reducing the thought process to black-and-white thinking. Digging a bit deeper, the actual paper seems to agree: "For the sake of consistency, we define an “error” in the same way that Klerman and Spamann do in their original paper: a departure from the law. Such departures, however, may not always reflect true lawlessness. In particular, when the applicable doctrine is a standard, judges may be exercising the discretion the standard affords to reach a decision different from what a surface-level reading of the doctrine would suggest" |
|
| ▲ | scottLobster 8 hours ago | parent | next [-] |
| Yeah, I'm reminded of the various child porn cases where the "perpetrator" is a stupid teenager who took nude pics of themselves and sent them to their boy/girlfriend. Many of those cases have been thrown out by judges because the letter of the law creates a non-sequitur where the teenager is somehow a felon child predator who solely preyed on themselves, and sending them to jail and forcing them onto a sex offender registry would just ruin their lives while protecting nobody and wasting the state's resources. I don't trust AI in its current form to make that sort of distinction. And sure, you can say the laws should be written better, but so long as the laws are written by humans that will simply not be the case. |
| |
| ▲ | Lerc 7 hours ago | parent | next [-] | | This is one of the roles of justice, but it is also one of the reasons why wealthy people are convicted less often. While it is often delivered as a narrative of wealth corrupting the system, the reality is that usually what they are buying is the justice that we all should have. So yes, a judge can let a stupid teenager off on charges of child porn selfies. But without the resources, they are more likely to be told by a public defender to cop to a plea. And those laws with ridiculous outcomes like that are not always accidental. Often they will be deliberate choices made by lawmakers to enact an agenda that they cannot get by direct means. In the case of making children culpable for child porn of themselves, the laws might come about because the direct abstinence legislation they wanted could not be passed, so they need other means to scare horny teens. | | |
| ▲ | Terr_ 6 hours ago | parent | next [-] | | > what they are buying is the justice From The Truth by Terry Pratchett, with particular emphasis on the book's footnote. > William’s family and everyone they knew also had a mental map of the city that was divided into parts where you found upstanding citizens, and other parts where you found criminals. It had come as a shock to them... no, he corrected himself, it had come as an affront to learn that [police chief] Vimes operated on a different map. Apparently he'd instructed his men to use the front door when calling on any building, even in broad daylight, when sheer common sense said that they should use the back, just like any other servant. [0] > [0] William’s class understood that justice was like coal or potatoes. You ordered it when you needed it. | |
| ▲ | scottLobster 6 hours ago | parent | prev | next [-] | | Sure, but I'm not sure how AI would solve any of that. Any claims of objectivity would be challenged based on how it was trained. Public opinion would see its priors confirmed, as it already does (see accusations of corruption or activism with any judicial decision the mob disagrees with, regardless of any veracity). If there's a human appeals process above it, you've just added an extra layer that doesn't remove the human corruption factor at all. As for corruption, in my opinion we're reading some right now. Human-in-the-loop AI doesn't have the exponential, world-altering gains that companies like OpenAI need to justify their existence. You only get that if you replace humans completely, which is why they're all shilling science fiction nonsense narratives about nobody having to work. The abstract of this paper leans heavily into that narrative. | |
| ▲ | FarmerPotato 7 hours ago | parent | prev | next [-] | | Oddly enough, Texas passed a reform to keep sexting teens from getting prosecuted when they are both under 18 and within two years of each other in age. It was regarded as a model for other states. It's the only positive thing I have heard of Texas legislating wrt sexuality. | | |
| ▲ | quantified 6 hours ago | parent | next [-] | | Lawmakers have teenagers in their own families, apparently. Not just someone else's problem. | |
| ▲ | thaumasiotes 5 hours ago | parent | prev [-] | | > It was regarded as a model for other states. Really? That "model" has the common, but obviously extremely undesirable, feature of criminalizing sexual relationships between students in the same grade that were legal when they formed. How could it be regarded as a model for anyone else? | | |
| ▲ | samrus 4 hours ago | parent [-] | | You might have misread it. Texas' model is decriminalizing teens sexting, not criminalizing it | | |
| ▲ | thaumasiotes an hour ago | parent [-] | | I didn't misread it, but apparently you did. Why is criminalizing an existing legal relationship a good idea? |
|
|
| |
| ▲ | SXX 4 hours ago | parent | prev [-] | | The whole "democracy" thing is a legal framework that wealthy and powerful people built to make safe wealth transfer down the generations possible while giving away as little as possible to the average joe. In countries without this legal framework, it's usually a free-for-all fight every time the ruling power changes. Not good for preserving capital. So the wealthy having more rights is the system working as intended. It's not an inherently bad thing either, as the alternative is a system where whoever is best with an AK-47 has more rights. | | |
| ▲ | FpUser 4 hours ago | parent [-] | | >"So the wealthy having more rights is the system working as intended. It's not an inherently bad thing either" Sorry, but I do not feel this way. "Not an inherently bad thing" - I think it is maddening and has to be fixed no matter what. You know, the wealthy generally do not do badly in dictatorial regimes either. | | |
| ▲ | SXX 4 minutes ago | parent [-] | | > "You know, the wealthy generally do not do badly in dictatorial regimes either." Until they're found dead of an unexpected heart attack, their car blows up, or they fall out of a window. In a dictatorship, the vast majority of wealthy people are no more than managers of the dictator's property, usually kept in literal golden cages that are impossible to sell or transfer. Once a person falls out of favor or stops being useful, all their "wealth" just gets redistributed, because it was never theirs. |
|
|
| |
| ▲ | btilly 4 hours ago | parent | prev | next [-] | | While some cases have been struck down, about 1/4 of people on the sex offender registry were minors at the time of the offense, 14 is the age at which it is most likely to happen, and this exact scenario accounts for a significant fraction of cases. Common sense does not always get to show up. | |
| ▲ | wvenable 7 hours ago | parent | prev | next [-] | | There have been equally high profile cases where a perpetrator got off because they have connections. I'd love for an AI to loudly exclaim that this is a big deviation from the norm. | |
| ▲ | torginus 32 minutes ago | parent | prev | next [-] | | Man, this is one of the ways society has fundamentally broken - all the 'think of the children' arguments rest on the belief that children are so sacred that any sort of leniency or consideration of circumstances is forbidden, lest someone guilty of molesting them might walk free. Well, now we know for a fact that some of the people making these arguments were "thinking of the children" very much indeed. | |
| ▲ | a13n 6 hours ago | parent | prev | next [-] | | This example feels more like a bug in the law itself that should be corrected. If this behavior is acceptable then it should be legal, so we can spare everyone the hassle in the first place. I bet AI would be great at finding and fixing these bugs. | | |
| ▲ | simondotau a minute ago | parent | next [-] | | I think "judge AI" would be better if it also had access to the complete legislative record of debate surrounding the establishment of said laws, so that it could perform a "sanity check" on whether its determinations are also consistent with the stated intent of lawmakers. One might imagine a distant future where laws could be dramatically simplified into plain-spoken declarations, to be interpreted by a very advanced (and ideally truly open source) future LLM. So instead of 18 U.S.C. §§ 2251–2260 the law could be as straightforward as: A child must never be used, displayed, or represented as an object of sexual gratification. Responsibility extends to those who assist, enable, profit from, or intentionally access such material. The aim of this prohibition is to protect children from sexual exploitation and to eliminate any market or incentive for it. | |
| ▲ | chmod775 2 hours ago | parent | prev | next [-] | | > If this behavior is acceptable then it should be legal, so we can spare everyone the hassle in the first place. Codifying what is morally acceptable into definitive rules is something humanity has struggled with for likely much longer than written memory. Also, while you're out there "fixing bugs" - millions of them, one by one - people are being affected by them. > I bet AI would be great at finding and fixing these bugs. Are we really going to outsource morality to an unfeeling machine that is trained to behave the way an exclusive club of people wants it to? If that were one's goal, that's one way to stealthily nudge and undermine a democracy, I suppose. | |
| ▲ | ohyoutravel 6 hours ago | parent | prev | next [-] | | There are no “bugs” in human institutions like law. There are always going to be edge cases and nuances that require a human to evaluate. | |
| ▲ | AuryGlenz 3 hours ago | parent | prev | next [-] | | It's not a bug, it's something politicians don't want to touch because nobody wants to be the person that is soft on anything to do with minors and sex. Of course our laws are completely illogical - the fact that you could be put in prison and a sex offender registry for life for having a single photo of a naked 17 year old (how in the hell were you supposed to know?) on your device is ridiculous. But, again, who is going to decide to put forward a bill to change that? It's all risk and no reward for the politician. | |
| ▲ | Spooky23 5 hours ago | parent | prev | next [-] | | Fair, but still, the legislative process takes a lot of time, and judicial norms and precedent allow for discretion to be exercised with accountability, which also informs the legislative process. | |
| ▲ | fendy3002 6 hours ago | parent | prev | next [-] | | AI would be great IF it knew what to find. The current state of AI does not give it that ability, so the consideration is likely to be dropped. | |
| ▲ | quantified 6 hours ago | parent | prev | next [-] | | Start fixing those bugs and you will open up can after can of worms. Finding the bugs will be entertaining. | |
| ▲ | s1artibartfast 6 hours ago | parent | prev [-] | | now you are talking about replacing not judges, but your elected representatives. |
| |
| ▲ | latchkey 3 hours ago | parent | prev | next [-] | | > where the "perpetrator" is a stupid teenager who took nude pics of themselves and sent them to their boy/girlfriend. "Where the "perpetrator" is a stupid teenager who took nude pics of themselves and sent them to their boy/girlfriend. If you were a US court judge, what would your opinion be on that case?" I was pretty happy with the results and it clearly wasn't tripped up by the non-sequitur. | |
| ▲ | contrarian1234 6 hours ago | parent | prev | next [-] | | Sorry, but that seems like an insane system where whole classes of actions are effectively illegal but probably okay if you're likeable. In your scenario the obvious solution is to amend the law and pardon people convicted under it. Because what really happens is that if you have a pretty face and big tits you get out of speeding tickets because "gosh, well, the law wasn't intended for nice people like you" | | |
| ▲ | scottLobster 5 hours ago | parent | next [-] | | It isn't "my scenario". These are real cases. https://www.aclu-mn.org/press-releases/victory-judge-dismiss... "In his decision, Judge Cajacob asserts that the purpose and intent of Minnesota’s child pornography statute does not support punishing Jane Doe for explicit images of herself and doing so “produces an absurd, unreasonable, and unjust result that utterly confounds the statute’s stated purpose.”" Nothing in there about "likeability" or "we let her off because she had nice tits" (which would be particularly weird in this case). Judges have a degree of discretion to interpret laws; they still have to justify their decisions. If you think the judge is wrong then you can appeal. This is how the law has always worked, and if you've thought otherwise then consider you've been living under this "insane system" for your entire life, and every generation of ancestors has too, assuming you're/they've been in the US. | |
| ▲ | contrarian1234 32 minutes ago | parent [-] | | > It isn't "my scenario". These are real cases Maybe English isn't your native language, but "scenario" doesn't require the situation to be unreal. > Nothing in there about "likeability" or "we let her off because she had nice tits" We have no way to know if likeability played into it. When rules are bendable, they are bent for the likeable and attractive. My example of a traffic stop is analogous and more directly relatable. > This is how the law has always worked, and if you've thought otherwise then consider you've been living under this "insane system" for your entire life You seem to have some reading comprehension issues. I never suggested it's not currently working that way, and I never suggested the current situation is not insane. If you think the current system is sane and great, then that's your opinion. Everyone I know who's had to deal with the US legal system has only related horror stories. |
| |
| ▲ | miffy900 6 hours ago | parent | prev [-] | | Are you even responding to the right comment? I read your comment and the parent comment you've responded to and this response doesn't make sense - it reads like a non-sequitur. | | |
| ▲ | contrarian1234 5 hours ago | parent | next [-] | | The parent comment presents a scenario where the law is ignored because the judge decides for himself that it shouldn't apply. I'm pointing out that this kind of approach is fundamentally unjust and wrong. "And sure you can say the laws should be written better, but so long as the laws are written by humans that will simply not be the case" The obvious solution is dismissed. | | |
| ▲ | scottLobster 5 hours ago | parent [-] | | Are you a bot? Your name is contrarian1234 and you lack sophisticated interpretations of statements. | | |
| ▲ | contrarian1234 15 minutes ago | parent [-] | | Given your inability to engage with an opposing point of view, you're definitely not a bot. So I'll take your ad hominem as praise. |
|
| |
| ▲ | Spooky23 5 hours ago | parent | prev [-] | | People like this don’t let the facts get in the way. |
|
| |
| ▲ | throwaway894345 7 hours ago | parent | prev | next [-] | | Maybe we should compare AI to legislators…? | |
| ▲ | rco8786 7 hours ago | parent | prev [-] | | I don't know if I'm comfortable with any of this at all, but it seems like having AI make "front line" judgments, with a thinner appeals layer staffed by human judges, would catch those edge cases pretty well. | |
| ▲ | arctic-true 7 hours ago | parent | next [-] | | This is basically how the administrative courts work now - an ALJ takes a first pass at your case, and then you can complain about it to a district court, who can review it without doing their own fact-finding. But the reason we can do this is that we trust ALJs (and all trial-level judges, as well as juries) to make good assessments on the credibility of evidence and testimony, a competency I don’t suspect folks are ready or willing to hand over to AI. | |
| ▲ | conradev 7 hours ago | parent | prev | next [-] | | The courts already have algorithmic oracles for specific judgements, like sentencing: https://en.wikipedia.org/wiki/COMPAS_(software) | |
| ▲ | jagged-chisel 7 hours ago | parent | prev | next [-] | | I don't follow your reasoning at all. Without a specific input stating that you can't be your own victim, how would the AI catch this? In what cases does that specific input even make sense? Attempted suicide removes one's own autonomy in the eyes of the law in many ways in our world - would the aforementioned specific input negate appropriate decisions about said autonomy? I don't see how an AI / LLM can cope with this correctly. | |
| ▲ | Lerc 7 hours ago | parent | prev | next [-] | | When discussing AI regulation, I asked whether they thought there should be a mechanism to appeal any determination made by an AI. They said they had been advocating for that to go both ways: people should be able to ask for an AI review of human-made decisions, and in the event of an inconsistency the issue is raised at a higher level. | |
| ▲ | gambiting 7 hours ago | parent | prev | next [-] | | To get to an appeal means you obviously already have a judgement against you - and as you can imagine in the cases like the one above that's enough to ruin your life completely and forever, even if you win on appeal. | |
| ▲ | 7 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | qmmmur 7 hours ago | parent | prev [-] | | Because historically appeal systems are well crafted and efficient? Please... at least read your comment out loud to yourself. |
|
|
|
| ▲ | deepsun 7 hours ago | parent | prev | next [-] |
| The main job of a judicial system is to appear just to people. As long as people think it's just -- everyone is happy. But if it rules strictly by the law, yet people consider the result unjust -- revolutions happen. In both cases, lawmakers must adapt the law to reflect what people think is "just". That's why there is jury duty in some countries -- to involve people in the ruling, so they see it's just. |
| |
| ▲ | toolslive 7 hours ago | parent | next [-] | | Being just (as in the right thing happened) and being legal (as in the judicial system does not object) are 2 totally different things. They overlap, but less than people would like to believe. | |
| ▲ | godelski 4 hours ago | parent | prev | next [-] | | > to appear just to people.
The best way to appear just is to be just. But I'm not sure what your argument is. It is our duty as citizens to encourage the system to be just. Since there is no concrete mathematical objective definition of justice, well, then... all we can work with is the appearance. So I don't think your insight is based on some diabolical deep-state thinking so much as on the limitations of practicality. Your thesis holds true if everyone is trying their best to be just. | |
| ▲ | jfengel 6 hours ago | parent | prev | next [-] | | I've never met a lawyer who believes that. To a lawyer, justice requires agreement on the laws, rather than individual notions of justice. If the law is unjust, it's up to the lawmaking body to fix that. I hear this from lawyers of all ideologies. I believe that this is absurd, but I'm not a lawyer. | | |
| ▲ | wahern 6 hours ago | parent [-] | | In Federal courts mandatory minimum sentences were judged to be unconstitutional, as the ability to individualize sentencing was considered a prerogative intrinsic to the role of [Federal] judges. Though, a judge cannot impose a sentence greater than the maximum allowed under law. Federal courts still have sentencing guidelines that are almost always applied, but strictly speaking they're advisory. More fundamentally, individualized justice is a core principle of common law courts, at least historically speaking. It's also an obscure principle, but you can't fully understand the system without it, including the wide latitude judges often wield in various (albeit usually highly technical) aspects of their job. |
| |
| ▲ | rootusrootus 7 hours ago | parent | prev | next [-] | | > The main job of a judicial system is to appear just to people. Agree 100%. This is also the only form of argument in favor of capital punishment that has ever made me stop and think about my stance. I.e. we have capital punishment because without it we may get vigilante justice that is much worse. Now, whether that's how it would actually play out is a different discussion, but it did make me stop and think for a moment about the purpose of a justice system. | | |
| ▲ | andyferris 7 hours ago | parent [-] | | I’ve never heard of vigilante justice against someone already sentenced to prison for life, just because they were sentenced in a place without capital punishment? (I mean - people get killed in prison sometimes, I suppose, but it’s not really like vigilante justice on the streets is causing a breakdown in society in Australia, say…) | | |
| ▲ | shiroiuma 5 hours ago | parent [-] | | It's probably rather difficult and risky to enact vigilante justice against someone who's in prison. I think the problem is with places where they don't have life sentences at all, but rather let murderers back out into society after some time. I don't know if vigilante justice is a problem there in reality, but at least I can see it as a possibility: someone might still be angry that you murdered their relative after 20 years and come kill you when you're released. | | |
| ▲ | quadtree 5 hours ago | parent [-] | | The reference to vigilante justice may be about killing a suspect before they're imprisoned or even tried, such as when a mob storms the local jail. The theory is, if people believe only death can bring justice, and the state doesn't have the death penalty, then the vigilantes will take matters into their own hands. Ergo, the state should have the death penalty. Having recently done an in-depth review of arguments for and against the death penalty,[1] I can say that this argument is not prominent in the discourse. [1]: https://fairmind.org/guides/death-penalty | | |
| ▲ | shiroiuma 4 hours ago | parent [-] | | I see; this makes more sense. It's a little hard to imagine these days though, but ages ago, mobs storming the local jail and hanging a suspect wasn't that uncommon. | | |
|
|
|
| |
| ▲ | raw_anon_1111 3 hours ago | parent | prev [-] | | No, revolution only happens when the law is unjust to people in their own tribe… |
|
|
| ▲ | bawolff 6 hours ago | parent | prev | next [-] |
| > Judges do what their name implies - make judgment calls. I find it reassuring that judges give different answers under different scenarios, because it means they are listening and making judgment calls. I disagree - the law should be the same for everyone. Yes, sometimes crimes have mitigating circumstances, and those should be taken into account. However, that seems like a separate question from what is and is not illegal. |
| |
| ▲ | sarchertech 6 hours ago | parent | next [-] | | Laws are written to be interpreted and applied by humans. They aren’t computer programs. They are full of ambiguity. Much of this is by design because there are too many possible edge cases to design a fully algorithmic unambiguous legal system. | | |
| ▲ | bawolff an hour ago | parent [-] | | True, but it's not a free-for-all. Judges (especially in a common law jurisdiction) are supposed to be consistent and to interpret laws following certain principles. There are more right and less right interpretations - thus we can grade judges on how well they do their job. |
| |
| ▲ | NoahZuniga 6 hours ago | parent | prev | next [-] | | The thing is, laws do not foresee all cases, and language is not completely objective, so you cannot avoid judgment calls. One example is computer hacking, which in many jurisdictions is specified in very vague terms. | | |
| ▲ | NoahZuniga 6 hours ago | parent [-] | | Another example is that in the Netherlands, there's a crime called "valsheid in geschrifte" (roughly, forgery in documents) which exists to make it easy to prosecute fraud. It states that if you create a document with false information with the intent to use that document to deceive, you can get up to 5 years of jail time or a really big fine. Is lying on a paper insurance form to get a cheaper premium breaking this law? This doesn't seem clear cut to me. | |
| ▲ | 6 hours ago | parent | next [-] | | [deleted] | |
| ▲ | thaumasiotes 3 hours ago | parent | prev [-] | | > This doesn't seem clear cut to me. ...why not? By your wording, that would be one of the clearest-cut legal cases you could imagine. |
|
| |
| ▲ | matheusmoreira 6 hours ago | parent | prev | next [-] | | > law should be the same for everyone Nah. Too often their "crimes" are actually basic freedoms that they just find it profitable to deny. So many laws are bought and paid for by corporations. There is no need to respect them or even recognize them as legitimate, let alone make them universal. | |
| ▲ | cucumber3732842 6 hours ago | parent | prev | next [-] | | The law is rife with words and phrasing that make legality dependent upon those subjective mitigating factors. | |
| ▲ | 6 hours ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | swalsh 8 hours ago | parent | prev | next [-] |
| I believed that too until I watched the Karen Read trials. The judge had a bias, and it was clear Karen got justice despite the judge trying to put her finger on the scale. |
|
| ▲ | snitty 6 hours ago | parent | prev | next [-] |
| So here the test was, effectively: given a set of relevant facts, can we influence the way a judge (or an LLM) rules by introducing superfluous facts? The judges were either confused or swayed by the superfluous facts; the LLM was not. The matter was one where the outcome should have been determinative under US law, not a matter of judgment. |
|
| ▲ | tylervigen 8 hours ago | parent | prev | next [-] |
| Yes, your view is commonly called "legal realism." |
|
| ▲ | raw_anon_1111 4 hours ago | parent | prev | next [-] |
| You have a lot more faith in judges not being biased than I do. I’m about to say something that really makes me throw up a little in my mouth because it harkens back to the forced banal DEI training I had to suffer through in 2020 at BigTech [1]… But judges have all sorts of biases both conscious and unconscious. Where little Jacob will get in trouble for mischief and little Jerome will do the same thing and Jacob is just “a kid being a kid”. But little Jerome is “a thug in training who we need to protect society from”. [1] yes I’m well aware that biases exist. Not only did my still living parents grow up in the Jim Crow South. We had a house built in an infamous what was a “sundown town” as recently as 1990. We have seen how quickly the BS corporate concern was just marketing when it was convenient. |
|
| ▲ | vjulian 7 hours ago | parent | prev | next [-] |
| The legal system leaves much to be desired in relation to fairness and equity. I’d much prefer a multi-stage approach: 1) an AI analysis, 2) a judge review, with a high bar for departing from the AI's analysis, 3) public availability of the deliberations, 4) an appeals process. |
| |
| ▲ | jagged-chisel 7 hours ago | parent | next [-] | | Even having a ready-made determination by an AI runs the risk of prejudicing judges and juries. | | |
| ▲ | lemming 6 hours ago | parent | next [-] | | Given TFA, it seems that having human determinations involved might run the risk of prejudicing the AI. | |
| ▲ | arctic-true 7 hours ago | parent | prev [-] | | “Ladies and gentlemen of the jury, I actually asked ChatGPT and it said my client is not guilty.” |
| |
| ▲ | 7 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | qotgalaxy 7 hours ago | parent | prev [-] | | [dead] |
|
|
| ▲ | droidjj 8 hours ago | parent | prev | next [-] |
| Whether it’s reassuring depends on your judicial philosophy, which is partly why this is so interesting. |
|
| ▲ | godelski 4 hours ago | parent | prev | next [-] |
| IANAL. One thing I like to say is There is no rule that can be written so precisely that there are no exceptions, including this one.
| A joke[0], but one I think people should take seriously. Law would be easy if it weren't for all the edge cases. Most things in the world would be easy if it weren't for all the edge cases[1]. This can be seen just by contemplating whatever domain you feel you have achieved mastery over and have worked with for years. You likely don't actually feel you have achieved mastery, because you've developed to the point where you know there is so much you don't know[2]. The reason I wouldn't want an LLM judge (or any algorithmic judge) is the same reason I despise bureaucracy. Bureaucracy fucks everything up because it makes the naive assumption that you can figure everything out from a spreadsheet. It is the equivalent of trying to plan a city from the view out of an airplane window. The perspective has some utility, but it is also disconnected from reality. I'd also say that this feature of the world is part of what created us and made us the way we are. Humans are so successful because of our adaptability. If this weren't a useful feature, we'd have become far more robotic, because that would be a much easier thing for biology to optimize. So when people say bureaucracies are dehumanizing, I take it quite literally. There's utility to it, but its utility leads to its overuse, and the bias is clear: it is much harder to "de"-implement something than to implement it. We should strongly consider that bias in society when making large decisions like implementing algorithmic judges. I'm sure they can be helpful in the courtroom, but to abdicate our judgements to them only results in a dehumanized justice system. There are multiple literal interpretations of that claim too. [0] You didn't look at my name, did you? [1] https://news.ycombinator.com/item?id=43087779 [2] Hell, I have a PhD, and I forget I'm an expert in my domain because there's just so much I don't know that I continue to feel pretty dumb (which is also a driving force to continue learning). |
|
| ▲ | fluidcruft 7 hours ago | parent | prev | next [-] |
| There are findings of fact (what happened, context) and findings of law (what does the law mean given the facts). I don't think inconsistency in findings of law is acceptable, really. If laws are bad, fix the laws or have precedent applied uniformly, rather than have individual random judges invent new laws from the bench. Sentencing is a different thing. |
| |
| ▲ | Nursie 6 hours ago | parent | next [-] | | Leeway for human interpretation of laws is not a bug, it's a feature. It doesn't make things bad laws. This was the whole problem with the ludicrous "code is law!" movement a handful of years ago. No, it's not, law is made for people, life is imprecise and fairness and decency are not easy to encode. | |
| ▲ | 6 hours ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | ralusek an hour ago | parent | prev | next [-] |
| Disagree completely. Judgement of the sort you're describing should be done at the legislative phase (i.e. writing code). Inconsistent execution/application of the law is how bias happens. If a judgement done to the letter of the law feels unjust to you, change the letter of the law. |
|
| ▲ | homeonthemtn 6 hours ago | parent | prev | next [-] |
| I don't think a lot of people understand the grueling nature of a judge's job. Day in and day out, years of cases are going to generate bias in the judge in one form or another. I wouldn't mind an AI check* to help them catch that bias *A magically thorough, secure, and well-tested AI |
|
| ▲ | latchkey 8 hours ago | parent | prev | next [-] |
| In 30 seconds, did the entire corpus of all the legal cases since the dawn of time agree with the judge's opinion on my case? For the state of things in AI today, I'll take it as a great second opinion. |
| |
| ▲ | doctorpangloss 3 hours ago | parent [-] | | The LLMs are phenomenal judges; I am surprised people are skeptical of this result. Their training regime is really similar to what a judge does. The reason people are talking about this is that they want AI LAWYERS, which is different from AI JUDGES. |
|
|
| ▲ | gowld 8 hours ago | parent | prev | next [-] |
| A mistake isn't "judgment". These were technical rulings on matters of jurisdiction, not subjective judgments on fairness. "The consistency in legal compliance from GPT, irrespective of the selected forum, differs significantly from judges, who were more likely to follow the law under the rule than the standard (though not at a statistically significant level). The judges’ behavior in this experiment is consistent with the conventional wisdom that judges are generally more restrained by rules than they are by standards. Even when judges benefit from rules, however, they make errors while GPT does not."
|
| ▲ | qwertox 8 hours ago | parent | prev [-] |
| > If LLMs give only one answer, no matter what nuances are at play, that sounds like they are failing to judge and instead are reducing the thought process to black-and-white thinking. You can have a team of agents exchange views, and maybe the protocol would even allow for settling the cases automatically. The more agents you have, the more nuance you capture. |
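A minimal sketch of what such a settle-or-escalate protocol could look like. Everything here is hypothetical: the three "agents" are toy stand-ins for differently prompted or differently trained models, and the 75% quorum threshold is an arbitrary choice:

```python
from collections import Counter

def decide(case, agents, quorum=0.75):
    """Collect independent verdicts; settle automatically only when a
    supermajority of agents agrees, otherwise escalate to a human."""
    votes = [agent(case) for agent in agents]
    verdict, count = Counter(votes).most_common(1)[0]
    if count / len(votes) >= quorum:
        return verdict, votes
    return "escalate_to_human", votes

# Toy agents with different "judicial viewpoints" (hypothetical stand-ins
# for separate LLM instances with different prompts or training data).
textualist = lambda case: "liable" if case["statute_violated"] else "not_liable"
purposivist = lambda case: "liable" if case["harm"] > 0 else "not_liable"
precedent = lambda case: case["closest_precedent"]

case = {"statute_violated": True, "harm": 0, "closest_precedent": "not_liable"}
verdict, votes = decide(case, [textualist, purposivist, precedent])
print(verdict, votes)  # the agents split 2-1, so the case is escalated
```

Even the toy version shows where the real decision lives: adding agents adds perspectives, but whoever picks the agents and the quorum is still deciding the outcome.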
| |
| ▲ | jagged-chisel 7 hours ago | parent | next [-] | | Presumably all these agents would have been trained on different data, with different viewpoints? Otherwise, what makes them different enough from each other that such a "conversation" would matter? | | |
| ▲ | qwertox 7 hours ago | parent [-] | | Different skills or plugins, different views and different tools for the analysis of the same object. Then the debate starts. |
| |
| ▲ | viraptor 6 hours ago | parent | prev [-] | | Then you'd need to provide them with access to the law, to previous cases, to the news, to various data sources. And you'd have to decide how much each of those sources of information matters. And at that point, in practice you've got people making the decision again instead of the AI. And then there's the question of the model used. Turns out I've got preferences for which model I'd rather be judged by, and it's not Grok, for example... |
|