ben_w 5 days ago
Many accidents are caused by low-angle light dazzle. It's part of why high beams aren't meant to be used off a dual carriageway. When was the last time you saw a paper bag blown across the street and mistook it for a cat or a fox? (Did you even notice your mistake, or do you still think it was an animal?) Do you naturally drive faster on wide streets and slower on narrow ones, because the distance to the side of the road changes your subconscious sense of how fast you're going? Do you even know, or are you limited to your memories rather than a dashcam whose footage can be reviewed later? And so on.

Now don't get me wrong: AI today is, I think, worse than humans at safe driving. But I'm not sure how much of that is AI being more hallucinate-y than us, and how much is that human vision-system failures are something we compensate for (or even actively exploit) in the design of our roads, while the AI just makes different mistakes.
davidhs 5 days ago | parent
If Tesla Autopilot's internal representation is similar to what the UI displays, i.e. the location of the car w.r.t. everything else, and we had a human with a similar internal representation, everything jumping around in consciousness, we'd be insane to allow him to drive.

Self-driving is probably "AI-hard": you'd need extensive world knowledge, the ability to reason about your environment, and tolerance for faulty sensors (the human eye is super crappy, with all kinds of things obscuring it, such as veins and floaters). Also, if the Waymo UI accurately represents what it thinks is going on "out there", it is surprisingly crappy. If your conscious experience while driving were like that, you'd think you had been drugged.
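To make the "jumping around" concrete: a raw per-frame detector re-estimates every object's position independently each frame, so measurement noise shows up as jitter, and tracking stacks typically layer a filter on top to smooth it. Below is a minimal, hypothetical sketch in Python, not Tesla's or Waymo's actual pipeline, of a constant-velocity Kalman filter turning jittery 1D position detections into a stable track; every parameter and number in it is made up for illustration.

    # Hypothetical sketch, not Tesla's or Waymo's actual pipeline: smooth a
    # jittery stream of 1D object-position detections with a constant-velocity
    # Kalman filter. All parameters and numbers below are made up.
    import random

    def kalman_track(measurements, dt=0.1, meas_var=4.0, accel_var=1.0):
        """Return smoothed positions for noisy per-frame position measurements."""
        x, v = measurements[0], 0.0            # state: position, velocity
        p_xx, p_xv, p_vv = meas_var, 0.0, 1.0  # state covariance entries
        smoothed = [x]
        for z in measurements[1:]:
            # Predict: move the state forward under constant velocity and
            # inflate the covariance with white-acceleration process noise.
            x += v * dt
            p_xx += dt * (2.0 * p_xv + dt * p_vv) + accel_var * dt**4 / 4.0
            p_xv += dt * p_vv + accel_var * dt**3 / 2.0
            p_vv += accel_var * dt**2
            # Update: blend the prediction with the new measurement; the
            # Kalman gain weighs prediction vs. measurement uncertainty.
            s = p_xx + meas_var
            k_x, k_v = p_xx / s, p_xv / s
            resid = z - x
            x += k_x * resid
            v += k_v * resid
            p_vv -= k_v * p_xv   # covariance shrinks after each update
            p_xv -= k_v * p_xx
            p_xx -= k_x * p_xx
            smoothed.append(x)
        return smoothed

    random.seed(0)
    truth = [5.0 * 0.1 * t for t in range(50)]         # car moving at 5 m/s
    raw = [p + random.gauss(0.0, 2.0) for p in truth]  # jittery detections
    smooth = kalman_track(raw)
    for t in (10, 25, 49):
        print(f"t={t:2d}  truth={truth[t]:5.1f}  raw={raw[t]:5.1f}  filtered={smooth[t]:5.1f}")

The constant-velocity model is the interesting design choice: it trades a little lag for a lot of stability, which is roughly the trade-off any visualization of a perception stack is making when it decides what to draw.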