| ▲ | LeifCarrotson 9 hours ago | |
Waymo imitates humans insofar as its neural net, trained on millions of miles of video footage and LIDAR data from roads shared with humans, learned to avoid collisions by imitating them. It's likely manually programmed not to (incorrectly) turn the wheel to the left while stopped and waiting for an opportunity to turn: with the wheels already angled, getting rear-ended would push you into the lane of oncoming traffic. It's certainly programmed to use its turn signals to indicate when it is going to turn. But after driving around thousands of cars without turn signals on but with their wheels pointed left, it "knows" to predict that they're about to turn, and might imitate humans by anticipating that action and passing the stopped car on the right.
| ▲ | Ferret7446 5 hours ago | parent [-] | |
> It's likely manually programmed not to (incorrectly) turn the wheel to the left while stopped and waiting for an opportunity to turn.

I'm both surprised and not surprised that people do this. If you get rear-ended with your wheels turned, you'll be pushed into the divider.