alkonaut 17 hours ago

And before the argument "Self-driving is acceptable so long as the accident/risk is lower than with human drivers" comes up, can I please get it out of the way: no, it's not. Self-driving needs to be orders of magnitude safer for us to accept it. If robotaxis are merely as safe as, or slightly safer than, humans, we will never accept them. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident yourself, or to have personal liability. We accept the risks with human drivers because those humans accept risk. Self-driving abstracts away the legal risk and removes the physical risk.

I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

jillesvangurp 16 hours ago | parent | next [-]

I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar systems gets a lot of press. This was a major incident with a happy ending. Those are quite rare, and the lethal ones even rarer.

As for more data, there is a chicken-and-egg problem. The phased rollout of Waymo over several years has revealed many potential issues, but it is also remarkable for its low number of fatal incidents. The benefit of a gradual approach is that it builds confidence over time.

Tesla has some way to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe there would be some numbers on that. Normal statistics in the US work out to ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.

Liability weighs heavier for companies than safety. It's fine by them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data, at the price of more deaths, by simply dragging out the testing phase.

Perfect is the enemy of good here. We can wait another few years (times ~40K deaths per year), or we can let the technology start lowering the number of traffic deaths now. Every year we wait means more deaths. Waiting here literally costs lives.

alkonaut 16 hours ago | parent [-]

> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.

I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general. Naturally, robotaxis will benefit from better infrastructure outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. a baseline with fewer drunk drivers.

jillesvangurp 9 hours ago | parent | next [-]

Also fun to calculate how this compounds over, say, 40 years. You get to about 1 in 150 drivers being involved in some kind of fatal accident. People are really bad at numbers and at assessing risk.
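The compounding is easy to check. A rough sketch, assuming the ~17-per-100K annual rate stays flat and each year is independent (both simplifications):

```python
# Annual fatality rate of ~17 per 100,000 drivers, compounded over a
# 40-year driving life. Assumes a constant rate and independent years.
p_year = 17 / 100_000
p_lifetime = 1 - (1 - p_year) ** 40
print(f"lifetime odds: about 1 in {1 / p_lifetime:.0f}")  # roughly 1 in 150
```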

trillic 15 hours ago | parent | prev [-]

It will also never get worse. This is the worst the algorithms will ever be from this point forward.

jerlam 14 hours ago | parent | next [-]

I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. This is not like a compression algorithm where things are easily testable and verifiable. If Waymos start behaving extra-oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, other pedestrians, or other vehicles.

I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors):

https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...

sowbug 14 hours ago | parent [-]

And we haven't reached the point where people start walking straight into the paths of cars, either obliviously or defiantly. https://www.youtube.com/shorts/nVEDebSuEUs

jerlam 12 hours ago | parent [-]

There are already anecdotes of people aggressively jaywalking in front of a Waymo because they know it will stop, and people driving more aggressively around Waymos because it will always defer to them.

trollbridge 14 hours ago | parent | prev [-]

Has this been true of other Google products? They never get worse?

jonas21 16 hours ago | parent | prev | next [-]

> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

Do you mean like this?

https://waymo.com/safety/impact/

alkonaut 16 hours ago | parent [-]

Yes but ideally from some objective source.

xnx 14 hours ago | parent [-]

Like this? https://waymo.com/blog/2024/12/new-swiss-re-study-waymo

trollbridge 14 hours ago | parent [-]

Maybe an objective source that isn't on the waymo.com domain?

xnx 13 hours ago | parent [-]

"We find that when benchmarked against zip code-calibrated human baselines, the Waymo Driver significantly improves safety towards other road users."

https://pmc.ncbi.nlm.nih.gov/articles/PMC11305169/

WarmWash 16 hours ago | parent | prev | next [-]

If Waymo is to be believed, the car hit the kid at 6 mph, and they estimated that a human driver at full attention would have hit the kid at 14 mph. The Waymo was traveling at 17 mph. The situation of "kid running out between cars" will likely never be fully solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.

I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
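The physics point can be made concrete with an illustrative calculation of impact speed under constant braking. The 17 mph starting speed is from the reported incident; the deceleration, sight distance, and reaction times below are my assumptions, not Waymo's figures:

```python
import math

MPH = 0.44704  # one mph in metres per second

def impact_speed(v0, d, t_react, decel):
    """Speed (m/s) at which the car hits an obstacle d metres ahead,
    given a reaction time and constant braking deceleration.
    Returns 0.0 if the car stops short of the obstacle."""
    d_react = v0 * t_react              # distance covered before braking starts
    if d_react >= d:
        return v0                       # obstacle reached before braking begins
    v_sq = v0 ** 2 - 2 * decel * (d - d_react)
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

v0 = 17 * MPH                # initial speed from the comment above
d, decel = 5.0, 8.0          # assumed: child appears 5 m ahead, ~0.8 g braking
robot = impact_speed(v0, d, t_react=0.2, decel=decel)
human = impact_speed(v0, d, t_react=1.2, decel=decel)
print(f"robot impact: {robot / MPH:.1f} mph, human impact: {human / MPH:.1f} mph")
```

With these (assumed) numbers, the human's reaction distance alone exceeds the sight distance, so most of the gap between the two impact speeds comes from reaction time, not braking capability.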

recursive 15 hours ago | parent | next [-]

That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
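"Don't drive faster than you can see" can be made quantitative: pick a speed whose full stopping distance, reaction plus braking, fits inside the clear distance ahead. A sketch with assumed numbers (~0.8 g braking, a pedestrian able to appear 2 m ahead):

```python
import math

MPH = 0.44704  # one mph in metres per second

def max_safe_speed(d_clear, t_react, decel):
    """Largest speed v (m/s) satisfying v*t_react + v**2/(2*decel) <= d_clear,
    i.e. the car can fully stop within the clear distance ahead.
    This is the positive root of v**2/(2a) + t*v - d = 0."""
    return decel * (-t_react + math.sqrt(t_react ** 2 + 2 * d_clear / decel))

# Assumed: a pedestrian could appear 2 m ahead; ~0.8 g braking.
for t in (0.25, 1.0):  # fast automated reaction vs. typical human reaction
    v = max_safe_speed(2.0, t, 8.0)
    print(f"reaction {t:.2f}s -> max safe speed {v / MPH:.1f} mph")
```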

13 hours ago | parent | next [-]
[deleted]
crazygringo 15 hours ago | parent | prev [-]

If we adopted that level of risk, we'd have 5mph speed limits on every street with parking. As a society, we've decided that's overly cautious.

mhast 14 hours ago | parent [-]

But with Waymos it would be possible. Mark those streets as "extremely slow" and never go there unless you are dropping someone off. (The computer has more patience than human drivers.)

If that's too annoying, then ban parking near school areas so the situation doesn't happen in the first place.

crazygringo 14 hours ago | parent | next [-]

I don't know if you've been to some cities or neighborhoods, but in many of them almost every street has on-street parking.

And why would you make Waymos go slower than human drivers, when it's the human drivers with worse reaction times? I had interpreted the suggestion as applying to all drivers.

12 hours ago | parent | prev [-]
[deleted]
alkonaut 16 hours ago | parent | prev | next [-]

Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.

maerF0x0 15 hours ago | parent | prev | next [-]

> The situation of "kid running out between cars" will likley never be solved

Nuanced disagreement (I agree with your physics), in that an element of the issue is design. Kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.

One simple change could be adding a chain-link fence or boundary between the parked cars and the driving cars, increasing visibility and reaction time.

toast0 15 hours ago | parent [-]

How do you add a chain link fence between the parked and driving cars for on-street parking?

maerF0x0 15 hours ago | parent [-]

There's still an inlet and outlet (kind of like hotel pickup/drop-off loops). It's not absolutely perfect, but it constrains the space kids can dart out from, from every parked car down to 2 places.

Also, the point isn't the specifics; the point is that the current design is not optimal, it's just the incumbent.

toast0 14 hours ago | parent [-]

Ok, that's not really a simple change anymore, because you need more space for it. Unless it's really just a drop-off queue, but then it's not parked cars, since a parked car would block the queue.

We would really need to see the site to get an idea of the constraints. Santa Monica has some places where additional roadway can be accommodated and some places where that's not really an option.

xnx 14 hours ago | parent | prev [-]

Second-order benefit: More Waymos = fewer parked cars

recursive 14 hours ago | parent [-]

In high-parking-contention areas, I think there's enough latent demand for parking that you wouldn't observe fewer parked cars until you reduce demand by a much greater amount.

Archio 14 hours ago | parent | prev | next [-]

>We accept the risks with humans because those humans accept risk.

It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?

trollbridge 14 hours ago | parent | next [-]

I think a very good reason to want to know who's liable is that Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.

In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.

fragmede 9 hours ago | parent [-]

> Wherever I'm going, I'll be there to apply the formula. I'll keep the secret intact. It's simple arithmetic. It's a story problem. If a new car built by my company leaves Chicago traveling west at 60 miles per hour, and the rear differential locks up, and the car crashes and burns with everyone trapped inside, does my company initiate a recall?

> You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C). A times B times C equals X. This is what it will cost if we don't initiate a recall. If X is greater than the cost of a recall, we recall the cars and no one gets hurt. If X is less than the cost of a recall, then we don't recall.

-Chuck Palahniuk, Fight Club
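The narrator's formula is just an expected-value comparison. A minimal sketch, with all numbers hypothetical:

```python
def should_recall(fleet_size, failure_rate, avg_settlement, recall_cost):
    """X = A * B * C: the expected settlement payout if nothing is done.
    The narrator's rule: recall only when X exceeds the cost of the recall."""
    x = fleet_size * failure_rate * avg_settlement
    return x > recall_cost

# Hypothetical: 500K cars in the field, 1-in-100K failure rate,
# $1M average settlement, against a $20M recall. X = $5M < $20M.
print(should_recall(500_000, 1e-5, 1_000_000, 20_000_000))  # False: no recall
```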

sowbug 14 hours ago | parent | prev | next [-]

Even in terms of plain results, I'd say the consequences-based system isn't working so well if it's producing 40,000 US deaths annually.

alkonaut 10 hours ago | parent [-]

That’s the fault of poor infrastructure and laws more than anything else. AVs must drive on the same infrastructure (and can only somewhat compensate).

alkonaut 10 hours ago | parent | prev [-]

Yes

criddell 16 hours ago | parent | prev | next [-]

Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn’t be good enough?

alkonaut 16 hours ago | parent [-]

"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if anyone will take responsibility? Then the latter doesn't immediately stand out as an improvement just because fewer people died. We can do better, I think. The problem is simply one of responsibility.

criddell 15 hours ago | parent | next [-]

If the current situation was every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people are going to die but someone will be held responsible for those deaths?

alkonaut 10 hours ago | parent [-]

Yes

crazygringo 15 hours ago | parent | prev | next [-]

People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.

zamadatix 14 hours ago | parent | prev | next [-]

Apart from a minority of car related deaths resulting in jail time, what kind of person wants many more people to die just so they can point at someone to blame for it? At what point are such people the ones to blame for so many deaths themselves?

simianwords 12 hours ago | parent | prev | next [-]

In such situations it’s useful to put yourself in a hypothetical. Rules: you can’t pick who you will be, one of the dead or one of the living; it will be assigned randomly.

So would you pick situation 1 or 2?

I would personally pick 1.

renewiltord 15 hours ago | parent | prev [-]

Do they go to jail?

That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...

The driver cut in front of a person on an e-bike so fast that the rider couldn’t react and hit them. Then, after the collision, the driver stepped on the accelerator and went over the sidewalk on the other side of the road, killing a 4-year-old. No charges filed.

This driver will be back on the street right away.

xnx 14 hours ago | parent [-]

Ugh. That is despicable, both on the driver's part and on ours as a society for accepting it. Ubiquitous Waymo can't come soon enough.

jtrueb 17 hours ago | parent | prev | next [-]

Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.

17 hours ago | parent | prev | next [-]
[deleted]
lokar 17 hours ago | parent | prev | next [-]

I generally agree the bar is high.

But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.

There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.

cameldrv 17 hours ago | parent | prev | next [-]

That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?

JumpCrisscross 14 hours ago | parent | prev | next [-]

> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it

It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.

xnx 14 hours ago | parent | prev [-]

> Self driving needs to be orders of magnitude safer for us to acknowledge it

All data indicates that Waymo is ~10x safer so far.

"90% Fewer serious injury or worse crashes"

https://waymo.com/safety/impact/