| ▲ | Veserv 3 hours ago |
| It is important to note that this is with safety drivers. A professional driver plus their most advanced "Robotaxi" FSD version, under test with careful scrutiny, averages 57,000 miles per minor collision, roughly 4x worse than the average non-professional driver alone. Yet it is quite odd how Tesla also reports that untrained customers using old versions of FSD on outdated hardware average 1,500,000 miles per minor collision [1], a literal 26x difference, when there are no penalties for incorrect reporting. [1] https://www.tesla.com/fsd/safety |
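A quick sanity check of the ratios implied by the figures quoted above (the 57,000 and 1,500,000 miles-per-collision numbers are taken as given from the comment; this is only an illustration of the arithmetic, not independent data):

```python
# Figures as quoted in the comment above (miles per minor collision).
robotaxi_fsd_with_safety_driver = 57_000    # professional driver + most advanced "Robotaxi" FSD
consumer_fsd_reported_by_tesla = 1_500_000  # Tesla's published figure for untrained customers

ratio = consumer_fsd_reported_by_tesla / robotaxi_fsd_with_safety_driver
print(f"Reported consumer FSD goes ~{ratio:.0f}x farther per minor collision")  # ~26x
print(f"...i.e. roughly a {(ratio - 1) * 100:.0f}% difference")                  # ~2,500%
```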
|
| ▲ | WarmWash 3 hours ago | parent | next [-] |
| Robotaxi supervision is just an emergency brake switch. Consumer supervision is having all the controls of the car right there in front of you. And if you are doing it right, you have your hands on the wheel and your foot on the pedals, ready to jump in. |
| |
| ▲ | estearum 3 hours ago | parent | next [-] | | Nah, the relevant factor, which has been obvious to anyone who cared to think about this stuff honestly for years, is that Tesla's safety claims on FSD are meaningless. Accident rates under traditional cruise control are also far below average. Why? Because people use cruise control (and FSD) under specific conditions. Namely: good ones! Ones where accidents already happen at a way below-average rate. Tesla has always been able to publish the data required to really understand performance, normalized by vehicle age and driving conditions. But they have not, for reasons that have always been obvious and are now absolutely undeniable. | | |
| ▲ | abtinf 2 hours ago | parent | next [-] | | Yup, after I got a Tesla with a free FSD trial period, it was obviously a death trap if used in any kind of slightly complex situation (like the highway on-ramp that was under construction for a year). At least once every few days, it would do something extremely dangerous, like try to drive straight into a concrete median at 40mph. The way I describe it: yeah, it’s self-driving and doesn’t quite require the full attention of normal driving, but it still requires the same amount of attention as supervising a teenager in the first week of their learning permit. If Tesla were serious about its FSD safety claims, they would release data on driver interventions per mile. Also, the language when turning on FSD in the vehicle is just insulting: the whole bit about how it would ship as-is if it were an iPhone app, but shucks, the lawyers are just so silly and conservative that we have to call it a beta. | | |
| ▲ | drob518 an hour ago | parent [-] | | > the same amount of attention as supervising a teenager in the first week of their learning permit. Yikes! I’d be a nervous wreck after just a couple of days. | | |
| ▲ | abtinf an hour ago | parent [-] | | You learn when it’s good and when it’s bad. It definitely has a “personality”. It is awesome in certain situations, like bumper-to-bumper traffic. I kept it for a couple of months after the trial, but canceled because the situations it’s good at aren’t the situations I usually face when driving. |
|
| |
| ▲ | ToucanLoucan 2 hours ago | parent | prev [-] | | Also, if it actually worked, Tesla's marketing would literally never shut up about it, because they would have a working, fully self-driving car. That would be the first, second, and third bullet point in all their marketing, and they would be right to do that. It would be an incredible differentiator from all their competition. The only problem is, it doesn't work. | | |
| ▲ | bluGill 2 hours ago | parent | next [-] | | More importantly, we would have independent researchers looking at the data and commenting. I know this data exists, but I've never seen anyone who has the data and ability to understand it who doesn't also have a conflict of interest. | |
| ▲ | abtinf an hour ago | parent | prev [-] | | If it actually worked, Tesla would include an indemnity clause for all accidents while it’s active. |
|
| |
| ▲ | tzs 3 hours ago | parent | prev | next [-] | | > Robotaxi supervision is just an emergency brake switch That was the case when they first started the trial in Austin. The employee in the car was a safety monitor sitting in the front passenger seat with an emergency brake button. Later, when they started expanding the service area to include highways, they moved the employee to the driver's seat on those trips so that they could completely take over if something unsafe was happening. | | |
| ▲ | ssl-3 2 hours ago | parent [-] | | Interesting. I wonder if these newly-reported crashes happened with the employee positioned in e-brake or in co-pilot mode. | | |
| ▲ | tialaramex 32 minutes ago | parent [-] | | Humans are extremely bad at vigilance when nothing interesting is happening. Lookout is a life-critical role on the railways that you might be assigned as a track worker: your whole job is to watch for trains and alert your co-workers when one is coming, so they can retreat to a safe position while it passes. That seems easy, and these are typically close friends, people you work with every day while rotating roles, whom you'd certainly not want injured or killed. But it turns out it's basically impossible to stay vigilant for more than an hour or two, tops. Having insisted that you aren't tired, since you're just standing somewhere watching while your mates work hard on the track, you nevertheless lose focus and, oops, a train passes without your conscious awareness and your colleague dies or suffers a life-changing injury. This is awkward for any technology where we've made the task boring but not safe, so humans must still supervise, yet we've made their job harder. Waymo understood that this is not a place worth getting to. |
|
| |
| ▲ | everdrive an hour ago | parent | prev | next [-] | | > And if you are doing it right, you have your hands on the wheel and your foot on the pedals, ready to jump in. Seems like there's zero benefit to this, then. Being required to pay attention, but having nothing (i.e., driving) to keep me engaged, seems like the worst of both worlds. Your attention would constantly be drifting. | |
| ▲ | strangattractor an hour ago | parent | prev | next [-] | | Similarly, Tesla using teleoperators for its Optimus robots is a safety fake for robots that are not autonomous either. They are constantly trying to cover their inability to make anything autonomous. Cheap lidar or radar would likely have prevented those "hitting stationary objects" accidents. Just because the Führer says it does not make it so. |
| ▲ | cma 2 hours ago | parent | prev | next [-] | | They had supervisors in the passenger seat for a while but moved them back to the driver's seat, then moved some out to chase cars. In the ones where they're in the driver's seat, they were able to take over the wheel, weren't they? | |
| ▲ | Veserv 2 hours ago | parent | prev | next [-] | | So the trillion-dollar company deployed one-ton robots in unconstrained public spaces with inadequate safety data, and chose objectively dangerous and unsafe testing protocols that heightened risk to the public in order to meet marketing goals? That is worse, and would generally be considered utterly depraved self-enrichment. | |
| ▲ | Loughla 2 hours ago | parent [-] | | We also dump chemicals into the water, air, and soil that aren't great for us. Externalized risks and costs are essential for many businesses to operate. It isn't great, but it's true. Our lives are possible because of externalized costs. | | |
| ▲ | yndoendo an hour ago | parent | next [-] | | The EU has one good regulation: if safety can be engineered in, it must be. OSHA also has regulations to mitigate risk, such as lockout/tagout. Both mitigate externalized risks. Good regulation mitigates known risk factors; unknown ones take time to learn about. The Apollo program learned this when a bolted-on hatch and a pure-oxygen environment meant the crew burned alive inside. Safety first became the basis of decision making. | |
| ▲ | Veserv an hour ago | parent | prev [-] | | Yes, those are bad as well. Are you seriously taking as your moral foundation that we need to poison the water supply to ensure executives get their bonuses? Is that somehow not utterly depraved self-enrichment? |
|
| |
| ▲ | UltraSane 3 hours ago | parent | prev [-] | | That just makes the Robotaxi even more irresponsible. | | |
| ▲ | foxyv 3 hours ago | parent [-] | | I think they were so used to defending Autopilot that they got confused. |
|
|
|
| ▲ | helsinkiandrew 3 hours ago | parent | prev | next [-] |
| To be fair to Tesla and other self-driving taxis, urban and shorter journeys usually have worse collision rates than the average journey, and consumer FSD miles are more likely to be owners driving themselves to work, etc. |
| |
| ▲ | Veserv 3 hours ago | parent | next [-] | | Great, we can use Tesla's own numbers once again by selecting non-highway miles. The average human does 178,000 non-highway miles per minor collision, which puts "professional driver + most advanced 'Robotaxi' FSD version under test with careful scrutiny" at roughly 3x worse than the average non-professional driver alone. Tesla advertises and markets a safety claim of 986,000 non-highway miles per minor collision. They are claiming, while risking the lives of their customers and the public, that their objectively inferior product with objectively worse deployment controls is roughly 1,700% better than their most advanced product under careful controls and scrutiny, when there are no penalties for incorrect reporting. | | |
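A similarly hedged sketch of the non-highway ratios described here; the 57,000, 178,000, and 986,000 figures are taken from the comments above, and this only re-derives the "3x worse" and "~1,700% better" multipliers:

```python
# Non-highway miles per minor collision, all figures as quoted in this thread.
robotaxi_under_supervision = 57_000     # professional safety driver + most advanced FSD
average_human_non_highway = 178_000     # average human baseline cited above
tesla_marketed_consumer_fsd = 986_000   # Tesla's advertised non-highway figure

print(f"Robotaxi vs. average human: {average_human_non_highway / robotaxi_under_supervision:.1f}x worse")      # ~3.1x
print(f"Marketed consumer FSD vs. robotaxi: {tesla_marketed_consumer_fsd / robotaxi_under_supervision:.0f}x")  # ~17x, i.e. ~1,700%
```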
| ▲ | jmcgough an hour ago | parent [-] | | Would be nice if we had a functioning legislative body that did more than pass a single "give billionaires more tax breaks" bill each term. |
| |
| ▲ | foxyv 3 hours ago | parent | prev | next [-] | | It is kind of comparing apples to oranges. The more appropriate comparison would be with other taxis. https://www.rubensteinandrynecki.com/brooklyn/taxi-accident-... Generally about 1 accident per 217k miles, which still means that Tesla is having accidents at roughly 4x the rate. However, there may be underreporting, and that could be the source of the difference. Also, the safety drivers may have prevented a lot of accidents too. | | |
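For what it's worth, a one-line check of the "4x rate" claim against the taxi figure quoted above (both numbers as stated in the thread, not independently verified):

```python
# Both figures as stated in the thread.
taxi_miles_per_accident = 217_000      # Brooklyn taxi figure cited above
robotaxi_miles_per_accident = 57_000   # Tesla Robotaxi figure from earlier in the thread

print(f"Robotaxi accident rate is ~{taxi_miles_per_accident / robotaxi_miles_per_accident:.1f}x the taxi rate")  # ~3.8x, i.e. roughly 4x
```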
| ▲ | philistine 2 hours ago | parent [-] | | I'm sure insurers will love your arguments and simply insure Tesla at the exact same rate they insure everyone else. I think Tesla's egg is cooked. They need a full suite of sensors ASAP. Get rid of Elon and you'll see an announcement in weeks. | | |
| ▲ | bragr an hour ago | parent | next [-] | | Large fleet operators tend to self insure rather than having traditional auto insurance for what it's worth. If you have a large fleet, say getting in 5-10 accidents a year, you can't buy a policy that's going to consistently pay out more than the premium, at least not one that the insurance company will be willing to renew. So economically it makes sense to set that money aside and pay out directly, perhaps covering disastrous losses with some kind of policy. | |
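A rough, purely illustrative sketch of the self-insurance logic above; every dollar figure and the accidents-per-year rate below are made-up assumptions, not data from the thread:

```python
# Hypothetical fleet; all numbers are illustrative assumptions only.
accidents_per_year = 8            # within the 5-10 range mentioned above
avg_cost_per_accident = 30_000    # assumed average claim cost in dollars
insurer_loading = 1.3             # assumed overhead + profit margin on expected losses

expected_annual_loss = accidents_per_year * avg_cost_per_accident
likely_premium = expected_annual_loss * insurer_loading

print(f"Expected payouts per year: ${expected_annual_loss:,}")    # $240,000
print(f"Premium an insurer would want: ${likely_premium:,.0f}")   # ~$312,000
# With losses this predictable, setting the money aside (self-insuring) beats paying the loading.
```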
| ▲ | harmmonica 2 hours ago | parent | prev [-] | | Always comes up, but I think it's worth repeating: if he's not there, the stock will take a massive haircut, and no Tesla investor wants that, regardless of whether it would improve Tesla's car sales or its self-driving. Elon is the stock price, for the most part. And just to muse on the current reason for that price: it's not Optimus or self-driving, but an eventual merger with SpaceX. My very-not-hot take is that they'll merge within months of the SpaceX IPO. A lot of folks say it ain't happening, but I think that's entirely dependent on how well Elon and Trump are getting along at the moment the merger is proposed (i.e., whether Trump gives his blessing in advance of any announcement). |
|
| |
| ▲ | flutas 2 hours ago | parent | prev [-] | | Yup, as context: over the same period, Waymo had 101 collisions according to the same NHTSA dataset. | |
| ▲ | ra7 2 hours ago | parent | next [-] | | Waymo drives 4 million miles every week (500k+ miles each day). The vast majority of those collisions happened while the Waymo was stationary (they don’t redact the narrative in crash reports the way Tesla does, so you know what happened). That is an incredible safety record. | |
| ▲ | harmmonica 2 hours ago | parent | prev [-] | | Is this the same time or the same miles driven? I think the former, and of course I get that's what you wrote, but I'm trying to understand what to take away from your comment. |
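To make the question above concrete, here is a sketch of why the reporting window matters; the window lengths below are purely hypothetical, and only the 101 collisions and the ~4 million miles/week figure come from the comments above:

```python
waymo_collisions = 101             # from the NHTSA dataset, per the comment above
waymo_miles_per_week = 4_000_000   # weekly mileage cited above

# Hypothetical reporting windows; the actual period is not stated in the thread.
for weeks in (4, 13, 52):
    miles = waymo_miles_per_week * weeks
    print(f"{weeks:>2} weeks -> ~{miles / waymo_collisions:,.0f} miles per collision")
```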
|
|
|
| ▲ | thedougd 2 hours ago | parent | prev | next [-] |
| I would guess the FSD numbers get help from drivers taking over in difficult situations and from usage weighted towards highway miles? |
| |
|
| ▲ | an hour ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | cyberax an hour ago | parent | prev | next [-] |
| The old FSD was mostly used on freeways, which naturally have a much lower incident rate per mile. And a lot of the incidents that do happen are caused by inattention or fatigue. So this number is plausible. |
|
| ▲ | sampton an hour ago | parent | prev [-] |
| I only flip on FSD when on the highway. It has come a long way, but there are still too many problems on local roads. |