krisoft (2 hours ago):

> To reach mass adoption, self-driving car need to kill one every, say, billion miles.

Important correction: "kill one *or fewer* per billion miles", before someone reluctantly engineers an intentional sacrifice to meet their quota.
JumpCrisscross (3 hours ago):

> to reach mass adoption, self-driving car need to kill one every, say, billion miles

They need to be around parity, so a death every 100M miles or so. The number of folks who want radically more safety is about balanced by those who want a product in market sooner.
ncallaway (2 hours ago):

> They need to be around parity.

I don't think so. The deaths from self-driving accidents will look _strange_ and _inhuman_ to most people. The negative PR from each fatal self-driving collision will be much worse than from a human-driven fatality.

I think these things genuinely need to be significantly safer for society to be willing to tolerate the accidents that do happen. Maybe not a full order of magnitude safer, but clearly safer than human drivers, not just at parity.
JumpCrisscross (2 hours ago):

> negative PR from self-driving accidents will be much worse for every single fatal collision than a human driven fatality

We're speaking in hypotheticals about stuff that has already happened.

> I think these things genuinely need to be significantly safer for society to be willing to tolerate the accidents that do happen

I used to as well. And no doubt, some populations will take this view. They won't have a stake in how self-driving cars are built and regulated. There is too much competition between U.S. states and China. Waymo was born in Arizona and is now growing up in California and Florida. Tesla is being shaped by Texas. The moment Tesla or BYD get their act together, we'll probably see federal preëmption.

(Contrast this with AI, where local concerns around e.g. power and water demand attention. Highways, on the other hand, are federally funded, and D.C. exerting local pressure with one hand while holding highway funds in the other is long precedented.)
WarmWash (an hour ago):

I know this sounds bad, but I wonder: if you put an LLM in the vehicle that can control basic stuff (the radio, climate controls, windows, changing the destination, maybe friendly chatter) but no actual vehicle control, would people humanize the car and be much more forgiving of mistakes? I feel pretty certain that they would.
Terr_ (2 hours ago):

> The deaths from self-driving accidents will look _strange_ and _inhuman_ to most people.

I like to quip that error-rate is not the same as error-shape. A lower rate isn't actually better if it means problems that "escape" our usual guardrails, backup plans, and remedies.

You're right that some of it may just be a perception issue, but IMO any "alien" pattern of failures indicates that there's a meta-problem we need to fix, either in the weird system or in the matrix of other systems around it. Predictability is a feature in and of itself.
michaelt (2 hours ago):

About half of road deaths involve drivers who are drunk or high, but only a very small fraction of drivers drive drunk or high: 50% of deaths are caused by 2% of drivers.

A self-driving car that merely achieves parity with the overall average would be worse than 98% of the population. You've gotta do roughly twice the fatality-free mileage to achieve parity with the sober 98%.
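A rough back-of-the-envelope check of that "twice the mileage" claim. All figures here are the illustrative numbers from the comment above, not measured data, and the proportionality assumption is a simplification:

```python
# Illustrative sketch, assuming miles driven are roughly proportional
# to driver count (a simplification, not a real dataset).
overall_rate = 1 / 100e6        # assumed: ~1 fatality per 100M vehicle-miles overall
impaired_driver_share = 0.02    # assumed: 2% of drivers are impaired
impaired_death_share = 0.50     # assumed: they account for 50% of road deaths

# Fatality rate of the sober 98%: their share of deaths divided by
# their share of miles, times the overall rate.
sober_rate = (1 - impaired_death_share) / (1 - impaired_driver_share) * overall_rate

miles_per_fatality = 1 / sober_rate
print(f"Sober-driver baseline: ~{miles_per_fatality / 1e6:.0f}M miles per fatality")
# About 196M miles per fatality, i.e. roughly double the 100M-mile overall average.
```

So under these assumptions, matching the sober majority means roughly twice the overall-average miles per fatality, as the comment says.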
rootusrootus (2 hours ago):

I disagree. The 1-in-100M statistic is too broad, and includes many extremely unsafe drivers. If we restrict the data to people who drive sober, in normal weather, with no street racing or other deliberately unsafe choices, what is the expected number of miles per fatality? 1 in a billion might be a conservative target.

I can appreciate that, statistically, reaching parity should be a net improvement over the status quo, but that only works if we somehow force 100% adoption. In the meantime, my choice to use a self-driving car has to weigh its risk against my driving, not the drunk's.
ufmace (an hour ago):

This gets near something I was thinking about. Most of the numbers seem to assume that injuries, injury severity, and deaths are all some fixed proportion of each other. But is that really true in the context of self-driving cars of all types?

It seems reasonable that deaths and major injuries come disproportionately from excessively high speed, slow reaction times at such speeds, or going much too fast for conditions even at lower absolute speeds. What if even the not-very-good self-driving cars are much better at avoiding the base conditions that lead to fatal accidents, even if they aren't so good at avoiding lower-speed fender-benders?

If that were true, what would it mean for our adoption of them? Maybe even the less-great ones are better overall, especially if the cars are owned by the company, so the costs of any minor fender-benders fall on them. If that's the case, maybe Tesla's camera-only system is fairly good after all, especially if it saves enough money to make such cars more widespread. Or maybe Waymo will get the cost of its more advanced sensors down faster and end up more economical overall first. Waymo certainly seems to be scaling faster in any case.
JumpCrisscross (2 hours ago):

> I disagree. The 1:100M statistic is too broad, and includes many extremely unsafe drivers

To be clear, I'm not arguing for what it should be. I'm arguing for what it is. I tend to drive the speed limit, and I think more people should. I also recognise there is no public support for ticketing folks going 5 over.

> my choice to use a self-driving car has to assess its risk compared to my driving, not the drunk's

All of these services are supply-constrained. That's why I've revised my hypothesis: there are enough folks who will take that car before you get comfortable to make it lucrative to fill streets with them. (And to be clear, I'll ride in a Waymo or a Cybercab. I won't book a ride with a friend or my pets in the latter.)
onlyrealcuzzo (2 hours ago):

Almost. Fatalities are obviously important, but they're not the only metric. You can prove Tesla's system is a joke with a multitude of metrics.
WarmWash (2 hours ago):

A death is the catastrophic case, but even a mild collision, with bumps and bruises to the people involved, would set Tesla back years. People expect self-driving cars to be magical in ability. Look at the flak Waymo has received despite its most egregious violations being fender-bender equivalents.