terminalshort 5 hours ago
If there were actually a rate of one life-threatening accident per 10,000 miles with FSD, it would be so obvious it would be impossible to hide. So I have to conclude the cars are actually much safer than that.
buran77 4 hours ago | parent
FSD never drives alone. It's always supervised by a driver who is legally responsible for correcting it. More importantly, we have no independently verified data about self-driving incidents. Quite the opposite: Tesla has repeatedly obscured data or impeded investigations. I've made this comparison before, but student drivers under instructor supervision (with secondary controls) also rarely crash. Are they the best drivers? I'm not a pilot, but I've flown a plane many times while supervised by one. Never took off, never landed, but also never crashed. Am I better than a real pilot, or even competent in any way?
tippytippytango 5 hours ago | parent
Above I was talking more generally about full autonomy. I agree the combined human + FSD system can be at least as safe as a human driver alone, perhaps safer, if you have a good driver. As a frequent user of FSD, I find its unreliability can be a feature: it constantly reminds me it can't be fully trusted, so I shadow-drive and pay full attention. It's like having a second pair of eyes on the road. I worry that when it reaches one incident per 10,000 miles, it's going to be hard to remind myself that I still need to pay attention. At that point it becomes a de facto unsupervised system, and its reliability falls from that of human + autonomy to that of the autonomous system alone, an enormous gap. Of course, I could be wrong. Which is why we need trusted third-party validation of these ideas.