Mentlo | 4 days ago
I came here to write the same comment you did. What I'd suspect (I don't work in self-driving, but I do work in AI) is that this mode of operation would come up more often than not: the sensors disagree in critical ways more frequently than you'd think, so going "safety first" every time likely critically diminishes the UX.

The real issue is not recognising that optimising for UX at the expense of safety here is the wrong call, likely motivated by optimism and a desire for autonomous cars more than by reasonable system design. I.e. if the sensors disagree so often that honouring every disagreement makes the system unusable, maybe the answer is "we're not ready for this kind of technology and we should slow down" rather than "let's figure out non-UX-breaking edge-case heuristics to maintain the illusion that autonomous driving is just around the corner".

Part of this problem isn't even technological: human drivers trade off safety for UX all the time, so the expectation placed on self-driving is unrealistic, and your system has to adopt the ethically unacceptable configuration in order to have any chance of competing. Which is why, in my mind, it's a fool's endeavour in the personal-car space, but not in the public-transport space. So go Waymo, boo Tesla.
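
To make the trade-off concrete, here's a toy sketch of the kind of "safety-first" arbitration I mean. Everything here is hypothetical (the sensor names, the threshold, the actions); it's not anyone's real stack, just an illustration of why frequent disagreement plus a conservative fallback wrecks the UX:

```python
# Hypothetical sketch: a safety-first arbiter that falls back to cautious
# behaviour whenever two independent sensor estimates of the same obstacle
# distance disagree. Names, threshold, and actions are made up for illustration.
from dataclasses import dataclass

@dataclass
class Estimate:
    distance_m: float   # estimated distance to the nearest obstacle
    confidence: float   # self-reported confidence, 0..1

DISAGREEMENT_THRESHOLD_M = 2.0  # assumed tolerance before sensors "disagree"

def arbitrate(camera: Estimate, radar: Estimate) -> str:
    """Return a driving action given two independent estimates."""
    if abs(camera.distance_m - radar.distance_m) > DISAGREEMENT_THRESHOLD_M:
        # Safety-first: any critical disagreement takes the cautious path.
        # If this branch fires on a large fraction of frames, the car is
        # constantly slowing or handing back control -- exactly the UX cost
        # that tempts vendors to optimise it away.
        return "slow_and_request_takeover"
    # Sensors agree closely enough: act on the more conservative estimate.
    nearest = min(camera.distance_m, radar.distance_m)
    return "proceed" if nearest > 30.0 else "reduce_speed"

if __name__ == "__main__":
    print(arbitrate(Estimate(25.0, 0.9), Estimate(40.0, 0.8)))  # disagreement -> cautious
    print(arbitrate(Estimate(45.0, 0.9), Estimate(44.0, 0.8)))  # agreement -> proceed
```

The point of the sketch: the safety of the system is set by how often the cautious branch fires, and the UX is set by the same number. You can only make the car feel smooth by raising the threshold or adding heuristics that skip the fallback, which is precisely the ethically questionable knob.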