jerlam 14 hours ago
I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. This is not like a compression algorithm where things are easily testable and verifiable. If Waymos start behaving extra-oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, pedestrians, or other vehicles. I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors): https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...
sowbug 14 hours ago | parent
And we haven't even reached the point where people start walking straight into the paths of cars, either obliviously or defiantly: https://www.youtube.com/shorts/nVEDebSuEUs