JumpCrisscross 8 hours ago
> Does it matter

Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.") If Waymo literally didn't foresee a blackout, that's a systemic problem. If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
andsoitis 6 hours ago
> > Does it matter
>
> Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")

No, I'm not inferring internal processes. I'm guessing at the level of critical thinking. When you are creating autonomous vehicles, one of the things you want to risk-assess and have mitigations for is what you want the vehicles to do when the systems they depend on fail (e.g., electricity, comms).

Now, it could be that the team anticipated those things but some other failure in their systems caused vehicles to stop in the middle of intersections, blocking traffic (as per the article). I'm super curious to learn more about what Waymo encountered and how they plan to up their game.
lubujackson 4 hours ago
The "coinciding problems" should be an assumption, not a edge case we reason away. Because black swan events are always going to have cascading issues - a big earthquake means lights out AND cell towers overloaded or out, not to mention debris in streets, etc. What they need is a "shit is fucked fallback" that cedes control. Maybe there is a special bluetooth command any police or ambulance can send if nearby, like clear the intersection/road. Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance. To techies and lawyers it may sound impossible, but for normal humans, that certainly sounds better. Like that Mitch Hedberg joke, when an escalator is out of order it becomes stairs. When a Waymo breaks it should become a car. | |||||||||||||||||
MBCook 7 hours ago
> If Waymo literally didn't foresee a blackout, that's a systemic problem.

I agree with this bit.

> If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.

This is what I have a problem with. That’s not an edge case. There will always be a weird thing no one programmed for. Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.

It doesn’t matter. There has to be an absolute minimum fail-safe that always works if the car is capable of moving safely. The fact that a human driver couldn’t be reached to press a button and authorize it is not acceptable. Not having the human available is a totally foreseeable problem. It’s Google. They know networks fail.
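To make the "can't depend on reaching a human" point concrete, a minimal sketch of an on-board watchdog — the 30-second threshold and the vehicle methods are assumptions for illustration only:

    import time

    COMMS_TIMEOUT_S = 30  # arbitrary: how long to wait for the remote operator


    def fail_safe_loop(vehicle) -> None:
        """Runs on the car itself; never blocks on the network to act."""
        last_contact = time.monotonic()

        while True:
            if vehicle.heard_from_operator():
                last_contact = time.monotonic()

            # The key property: this decision needs no remote confirmation.
            if time.monotonic() - last_contact > COMMS_TIMEOUT_S:
                if vehicle.is_blocking_roadway() and vehicle.can_move_safely():
                    vehicle.pull_to_nearest_safe_stop()  # pre-planned, on-board maneuver
                vehicle.hazards_on()

            time.sleep(1.0)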