| ▲ | dmix 5 hours ago |
| What was the better solution here, then? Assuming there are hundreds or thousands of self-driving cars suddenly driving in an environment without any working traffic lights. In the pictures you can see six Waymo cars at a single intersection. Assuming some of them had passengers, should they all have tried to turn through the intersection anyway, when their LIDAR said the lane was likely free, and then pull over to the side? Is that the safest option? Should there be human police to direct the self-driving cars through intersections? Or should they wait out the temporary electricity failure? I believe the answer is far more complicated than it seems, and in practice having the cars stay still might have been the safest option all of the parties could agree on (Waymo's office, the city traffic people, state regulators, etc.). There are people thinking this stuff out, and those cars can 100% pull over automatically but an explicit choice was made not to do so for safety. |
|
| ▲ | MBCook 5 hours ago | parent | next [-] |
| I think part of the problem is they’ve made it our problem. Look, I like Waymo. I think they’re neat and I trust them far more than any of the other companies. But in my mind, being able to handle stuff like this is just a requirement for being on the roads in any non-trivial number. Like, if they had two vehicles and this happened, then OK, that’s a problem, but it was two vehicles in an entire city. When you have enough on the road that you can randomly have six at one intersection, you should absolutely be able to handle this by then. I want them to do well. I want them to succeed. But just like airliners, this is the kind of thing where people’s safety comes first. What we saw happen looks like the safety of the Waymo and its passengers came above everyone else’s, despite having no need to do that. There are certainly some situations where just staying put is the best decision. “The power went out and there are no other hazards on the road” is not one of them. They made things worse for everyone else on average, in a foreseeable situation, where it was totally unnecessary. And that’s not OK with me. This feels like the kind of thing that absolutely should’ve been tested extremely well by now, before they were allowed to drive in large volumes. |
| |
| ▲ | macintux 4 hours ago | parent [-] | | Effectively they’ve turned any edge case into a potential city-wide problem and PR nightmare. One driver doesn’t know how to handle a power outage? It’s not news. Hundreds of automated vehicles all experience the same failure? National news. | | |
| ▲ | scoofy 2 hours ago | parent | next [-] | | I live in the affected neighborhood. There were hundreds of drivers that did not know how to handle a power outage... it was a minority of drivers, but a nontrivial number in absolute terms. I even saw a Muni bus blow through a blacked-out intersection. The difference is the Waymos failed in a way that prevented potential injury, whereas the humans who failed all failed in a way that would create potential injury. I wish the Waymos had handled it better, yes, but I think the failure state they took is preferable to the alternative. | | |
| ▲ | Dylan16807 33 minutes ago | parent [-] | | Locking down the roads creates a lot of potential injuries too. And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive them for all that much misbehavior. | | |
| ▲ | scoofy 16 minutes ago | parent [-] | | > is super easy to program What?!? We’re talking about autonomous vehicles here. |
|
| |
| ▲ | MBCook 4 hours ago | parent | prev | next [-] | | Right. You know there are humans somewhere in the city who got confused or scared and messed up too. Maybe a young driver on a temporary permit who is barely confident in the first place, or just someone who doesn’t remember what to do and was already over-stressed. Whatever, it happens. This was a (totally unintentional) coordinated screw-up causing problems all over, as opposed to one small spot. The scale makes all the difference. | |
| ▲ | scottbez1 2 hours ago | parent | prev [-] | | Yeah, the correlated risk with AVs is a pretty serious concern. And not just in emergencies, where they can easily DDoS the roads; even things like widespread weaknesses or edge cases in their perception models can cause really weird and disturbing outcomes. Imagine a model that works really well for detecting cars and adults but routinely misses children; you could end up with cars that are 1/10th as deadly to adults but 2x as deadly to children. Yes, in this hypothetical it saves lives overall, but is it actually a societal good? In some ways yes; in some ways it should never be allowed on any roads at all. It’s one of the reasons aggregated metrics on safety are so important to scrutinize. |
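To make that hypothetical concrete, here's a back-of-the-envelope sketch; every number below is invented purely for illustration:

```python
# Invented baseline figures, purely to illustrate the tradeoff above.
human_adult_deaths = 100   # hypothetical adult deaths/year, human drivers
human_child_deaths = 20    # hypothetical child deaths/year, human drivers

av_adult_deaths = human_adult_deaths / 10  # "1/10th as deadly to adults"
av_child_deaths = human_child_deaths * 2   # "2x as deadly to children"

print(human_adult_deaths + human_child_deaths)  # 120 total with humans
print(av_adult_deaths + av_child_deaths)        # 50.0 total with AVs
# The aggregate death toll drops by more than half, yet child deaths
# double -- exactly the shift a single headline safety metric would hide.
```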
|
|
|
| ▲ | wiml 5 hours ago | parent | prev | next [-] |
| We already have a solution; it's written down in the traffic laws. If the signals fail, treat the intersection roughly like a four-way stop. Everybody learns this in driver's ed. It's not obscure. If the cars can't follow traffic rules, maybe they're not ready to be on the streets unsupervised. |
| |
| ▲ | bsder 4 hours ago | parent | next [-] | | The problem seems to be that the Waymo cars did exactly as you requested and treated the intersections like 4-way stops, but kept getting displaced by more aggressive drivers who simply slowed and rolled. How many non-Waymo accidents happened at intersections during this time? I suspect more than zero, given my experiences with other drivers when traffic lights go out. Apparently, Waymo's number is zero, so humans are gonna lose this one. The problem here is that safety and throughput are at odds. Waymo chose safety while most drivers chose throughput. Had Waymo been more aggressive and gotten into an accident because it wouldn't give way, we'd have headlines about that, too. The biggest obstacle to self-driving is the fact that a lot of driving consists of knowing when to break the law. | | |
| ▲ | MBCook 4 hours ago | parent [-] | | > The problem here is that safety and throughput are at odds. Waymo chose safety while most drivers chose throughput. Did they? They chose their safety. I suspect the net effect of their behavior made the safety of everyone worse. They did such a bad job of handling it that people had to go around them, making things less safe. We know what people are like. Not everyone is OK doing 2-3 mph for an extended time waiting for a Waymo to feel “safe”. Operating in a way that causes large numbers of other drivers to feel the need to bypass you is fundamentally worse. | | |
| ▲ | bsder 9 minutes ago | parent [-] | | > Did they? They chose their safety. I suspect the net effect of their behavior made the safety of everyone worse. There is no viable choice other than prioritizing the safety of your rider. Anything less would be grounds for both lawsuits and reputational death. The fact that everybody else chose throughput over safety is not the fault of Waymo. Will you also complain when enough Waymo cars start running on the freeways that a couple of them in a row can effectively enforce following distances and speed limits, for example? |
|
| |
| ▲ | dmix 5 hours ago | parent | prev [-] | | That may be the rule for humans, particularly people who are always in a rush and won't stay still anyway. With a major intersection turned into a four-way stop, you have lots of humans making very complex decisions and taking a lot of personal risk. If multiple self-driving cars make the choice at the wrong time, you could jam up an intersection and create a worse traffic issue, or kill a passenger. It's all a careful risk calculation: those self-driving cars need to determine if it's safe to continue through an intersection without the traffic lights their computers spent millions of hours training on (likewise with humans). That's a tough choice for a highly regulated/insured company running thousands of cars. If anything, their programming should only take such a risk to move out of the way for a fire truck/ambulance. | | |
| ▲ | markdown an hour ago | parent [-] | | > If multiple self-driving cars make the choice at the wrong time Why would they do that? It's a hive, isn't it? |
|
|
|
| ▲ | dragonwriter an hour ago | parent | prev | next [-] |
| > Assuming there are hundreds or thousands of self-driving cars suddenly driving in an environment without any working traffic lights. Self-driving cars should (1) know how to handle stops, and (2) know that the rules for a failed traffic light (or one flashing red) are those for an all-way stop. |
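That rule is a simple mapping. A minimal sketch in Python, assuming a toy planner interface (the `LightState` and behavior names here are invented, not anyone's real stack):

```python
from enum import Enum, auto

class LightState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    FLASHING_RED = auto()
    DARK = auto()  # signal unpowered or unreadable

def required_behavior(light: LightState) -> str:
    """Map an observed signal state to the legally required behavior.

    Per the standard traffic rule, a dark or flashing-red signal is
    treated as an all-way stop: stop fully, then proceed in arrival order.
    """
    if light in (LightState.DARK, LightState.FLASHING_RED):
        return "all_way_stop"
    if light is LightState.GREEN:
        return "proceed"
    if light is LightState.YELLOW:
        return "stop_if_safe"
    return "stop"  # solid red

# e.g. required_behavior(LightState.DARK) -> "all_way_stop"
```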
|
| ▲ | autoexec 42 minutes ago | parent | prev | next [-] |
| > What was the better solution here, then? Just pulling over and getting out of the way really would help. There's no reason a human couldn't do the same safely. Not beta testing your cars on public roads would really be ideal. Especially without human drivers ready to take over. |
|
| ▲ | wrsh07 an hour ago | parent | prev | next [-] |
| Tbh I'm surprised Waymo didn't have remote monitors who could handle cars at intersections or safely pull them to the side |
|
| ▲ | pinnochio 5 hours ago | parent | prev | next [-] |
| Uh, how about having their remote driver staff take over? > but an explicit choice was made not to do so for safety. You know this how? |
| |
| ▲ | MBCook 5 hours ago | parent | next [-] | | That’s what they usually do. The assumption here is that, due to the blackout or some other related issue, the human drivers were unavailable. However, even if that’s not true, if they have more cars than human drivers there’s gonna be a problem until they work through the queue. And the bigger that ratio, the longer it will take. | |
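As a back-of-the-envelope illustration of that queue (all numbers invented):

```python
# Invented figures: how long to clear a backlog of stuck cars when
# each remote operator handles one intervention at a time.
stuck_cars = 600          # hypothetical fleet caught in the outage
remote_operators = 50     # hypothetical on-shift operators
minutes_per_assist = 2.0  # hypothetical time per intervention

backlog_rounds = stuck_cars / remote_operators       # 12 rounds
time_to_clear = backlog_rounds * minutes_per_assist  # 24.0 minutes
print(f"~{time_to_clear:.0f} minutes to work through the queue")
# Double the cars-per-operator ratio and the clear time doubles with it.
```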
| ▲ | autoexec 36 minutes ago | parent [-] | | I guess that in a blackout they should just have the cars park somewhere safe. Maybe it'd be best to never have more cars on the road than assisting/available human drivers. As soon as no human drivers are available to take over, for outage/staffing/whatever reason, all cars should just pull over and stop. |
| |
| ▲ | bink 5 hours ago | parent | prev [-] | | This only works if they have cell service and enough human drivers to handle all of their cars. |
|
|
| ▲ | ethanwillis 3 hours ago | parent | prev [-] |
| The better solution? To not fetishize technology. |