Tesla is trying to hide 3 Robotaxi accidents(electrek.co)
85 points by coloneltcb 16 hours ago | 45 comments
1024core 16 hours ago | parent | next [-]

I got a 2026 Model Y recently and tried out FSD. It made enough errors in the first few trips that I am surprised it's being touted as a "robotaxi".

For example: traveling west on 15th Street in SF, at Guerrero the leftmost lane turns into left-turn only, and the Tesla happily continued straight through.

That jolted me out of complacency, and the next time it was in the wrong lane I quickly took over and corrected it. It's happened a few times, and I don't use FSD that much.

tonfreed 14 hours ago | parent | next [-]

I'm unsurprised by that. I'm really hoping it quickly improves now that more people are using it

Zigurd 39 minutes ago | parent [-]

I've come to the conclusion that a big part of the difference between Waymo and other AVs is that Google has so much more geospatial data than anyone else that they know where that left turn lane begins and what it means in the context of that part of that road.
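
To make that concrete, here's a rough sketch of what a lane-level map prior might look like (a hypothetical Python schema I made up for illustration; it has nothing to do with Waymo's actual map format):

    # Hypothetical lane-level map entries -- illustrative only, not any vendor's real schema.
    from dataclasses import dataclass

    @dataclass
    class Lane:
        road: str
        index: int        # 0 = leftmost lane
        start_m: float    # where this lane designation begins, meters along the road
        allowed: tuple    # maneuvers permitted from this lane past start_m

    LANES = [
        Lane("15th St westbound", 0, 180.0, ("left",)),            # leftmost becomes left-turn only
        Lane("15th St westbound", 1, 0.0,   ("straight", "right")),
    ]

    def maneuvers(road, lane_index, pos_m):
        """Return what the map says you may do from this lane at this point."""
        for lane in LANES:
            if lane.road == road and lane.index == lane_index and pos_m >= lane.start_m:
                return lane.allowed
        return ("straight",)  # fallback when the map has no opinion

    print(maneuvers("15th St westbound", 0, 200.0))   # ('left',)

With a prior like that loaded, a planner knows to merge out of the leftmost lane well before the intersection instead of having to infer the restriction from paint and signage at the last second.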

imoverclocked 16 hours ago | parent | prev | next [-]

You have to let it crash a few times so it can trigger an internal review of the route. /s

Having zero control of the software update process will stop me from ever owning a Tesla.

flowerthoughts 8 hours ago | parent | next [-]

My Mercedes has the opposite problem. It will notify me there's an OTA update and ask if I want to do it now, or not. If I just turned the car on, I probably don't want to do it now, so the question is silly. It's unclear if the "Later" option actually applies the update after I've turned the car off, or if it just means it'll nag me again later and the cycle repeats. At least it does map updates automatically.

johnasmith 15 hours ago | parent | prev [-]

I'm curious, what does having control over the update process give you? Isn't it replacing one unauditable black-box system with another? Are you concerned about a regression and don't want to be in the vanguard cohort?

korse 19 minutes ago | parent | next [-]

My car/motorcycle/skateboard doesn't need over-the-air updates. It used to be, and still is in some cases, that a vehicle (electronic control modules and all) was sold as a finished product. Your engine control module or speed controller didn't need random firmware updates because it was a finished product that worked as intended upon delivery. Now that vehicles have gone electric, people are clamoring to drive software licenses, and I want no part of it. This isn't about auditing the code; this is about complexity creep and about having ownership.

imoverclocked 14 hours ago | parent | prev [-]

Well, there is the "my car has been disabled at an inconvenient time/location" problem for one. It would be nice to have more auditability, but I use iOS/macOS/etc., so it would be disingenuous to claim that as a show-stopper.

If by "vanguard cohort" you mean "in the first wave to test the new software," then yes; I don't want to be in that group.

Incipient 10 hours ago | parent [-]

I feel like being concerned about your car being disabled at an inconvenient time, but not being as concerned about your phone/laptop, isn't disingenuous.

They're entirely different products, costs, use cases, risk profiles.

imoverclocked 6 hours ago | parent [-]

For the most part, yes. I do fly with ForeFlight though. Losing it mid-flight would not be a disaster in its own right but the tech has saved my life a few times.

The ForeFlight team will send out a message giving an "all clear" or a "wait for us to update the app before updating to the next iOS/iPadOS release."

nrds 16 hours ago | parent | prev [-]

As you observed, lane selection is basically the one thing that FSD is completely incapable of; other things it does well. It's important to note that this is completely incompatible with the narrative spun by Tesla haters that it all comes down to LiDAR. LiDAR cannot help with lane selection.

tim333 3 hours ago | parent | next [-]

I haven't really seen the narrative that it all comes down to lidar. I mean, it's one sensor type among vision, lidar, GPS, ultrasonics, sound, and radar. For whatever reason, Tesla has chosen to go a bit minimalist there.
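
As a toy illustration of why extra modalities matter, here's a minimal inverse-variance fusion of independent range estimates (made-up numbers and a deliberately simplistic model, nothing like a real AV stack):

    # Toy sensor fusion: combine independent range estimates by inverse-variance weighting.
    # All numbers are invented for illustration.
    def fuse(estimates):
        """estimates: list of (range_m, std_dev_m) pairs from different sensors."""
        weights = [1.0 / (s ** 2) for _, s in estimates]
        fused = sum(w * r for w, (r, _) in zip(weights, estimates)) / sum(weights)
        fused_std = (1.0 / sum(weights)) ** 0.5
        return fused, fused_std

    camera = (42.0, 4.0)   # vision-only depth is noisy at range
    radar  = (38.5, 1.0)
    lidar  = (38.2, 0.3)

    print(fuse([camera]))                 # ~(42.0, 4.0): one modality, large uncertainty
    print(fuse([camera, radar, lidar]))   # tighter estimate, pulled toward the better sensors

The point is just that each additional independent sensor tightens the estimate and gives you something to fall back on when one modality is fooled.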

Zigurd 37 minutes ago | parent [-]

TBF the Ford CEO, in an interview, said lidar is the difference. But I can't blame him for going with the sound bite in that context. No doubt he knows there are lots of differences. My favorite underappreciated difference is that Google has crazy amounts of geospatial data.

esseph 16 hours ago | parent | prev | next [-]

Why does Waymo not have a problem with it? It did really well on dense streets with people barely pulling over to stop and run into a storefront or pick someone up from a restaurant. It would pause for a second, put on its turn signal, and then pull around the stopped car. It did this several times, in fact in spots where I would have waited, because its estimation of distance and obstacles 360 degrees around the vehicle is flat-out better than mine as a human. I was really impressed.

tim333 3 hours ago | parent | next [-]

Waymo seems to do a lot of detailed mapping in the areas where they operate. They probably have the lanes all marked out in the car's memory.

nrds 10 hours ago | parent | prev [-]

Waymos stop in the middle of the street several times a day, behavior I've never seen or even heard of from FSD. And I'm not sure what it has to do with lane selection.

FSD goes around stopped vehicles without any problem too.

esseph 5 hours ago | parent [-]

You're not sure what moving into a different lane has to do with lane selection?

barbazoo 16 hours ago | parent | prev [-]

How does Waymo compare in these situations?

guywithahat 16 hours ago | parent | prev | next [-]

How were the accidents hidden? It sounds like they were reported to the NHTSA properly, which is how the article knows about them. I wouldn't expect them to email a journalist every time there's an accident

ModernMech 16 hours ago | parent | next [-]

According to the article, at first the accidents were "hidden" from the reporting system just because Tesla's systems were not autonomous enough to qualify under the law.

But now that Tesla is trying to run a more autonomous robotaxi service, they're required to report more details about their accidents.

According to the article, Tesla's competitors (like Waymo) are very forthcoming about the incidents. They are probably following the long tradition in engineering of learning from your mistakes by investigating them thoroughly and doing root-cause analysis.

Tesla cannot do this, because if they do a thorough root-cause analysis of why their system fails more than others, they will inevitably arrive at the conclusion it's due to the sensor stack being camera-only. And Tesla cannot admit that because Musk can't admit he was wrong.

So instead they're going down the path of being cagey about the details of their accidents. I don't know how long these reports take to generate, but there are 2.5 months' worth of reports that have not yet been released.

Meanwhile, Musk has committed to ditching the safety monitors by the end of the year, and he's not going to be able to do that if Tesla's robotaxi service is unreliable. But he's also not willing to do what it takes to make the service more reliable, which is add LiDAR to the system. So... it will be interesting to see what happens at the end of the year.

pitpatagain 14 hours ago | parent | next [-]

It's already clear that there is no possible timeline in which they actually remove safety drivers by the end of the year; it's such a joke.

The weird thing is that between the extremely underwhelming tiny supervised test they run in Austin and the nonsensical permitting games they want to play in California, they don't really seem like a company that actually wants to launch a robotaxi.

Zigurd 29 minutes ago | parent | next [-]

> they don't really seem like a company that actually wants to launch a robotaxi.

Here is a prediction: when they don't actually get to remove the safety drivers, Elon will blame regulators and rage-quit the robotaxi game.

apothegm 13 hours ago | parent | prev | next [-]

The past 5 years or so they’ve looked more like a pump-and-dump scheme masquerading as a car manufacturer, so that seems on brand.

ModernMech an hour ago | parent | prev [-]

> they don't really seem like a company that actually wants to launch a robotaxi.

Because they can't. They don't have the technology to do so, despite promising for years that it's right around the corner. Musk backed Tesla into a corner by promising dates and missing them several times, and this is just another instance of that. They're playing a shell game, and they've been able to hide the ball so far by calling things "beta" or a "rollout" or "supervised", but when it comes to robotaxis they have to actually be autonomous, and Tesla's tech cannot deliver that.

So all I'm wondering is where they're going to hide the ball next. I don't think they can push robotaxis any longer, which is why you see Musk preemptively suggesting robots and AI are the future of Tesla. Actually, I think he's more likely to claim victory on self-driving, ditch the entire car company saying it's so last century, and pivot Tesla into robotics than to actually release failing robotaxis. It's the only way he can keep the grift going; the self-driving grift is done.

Animats 15 hours ago | parent | prev [-]

> And Tesla cannot admit that because Musk can't admit he was wrong.

Führerprinzip [1]

[1] https://en.wikipedia.org/wiki/F%C3%BChrerprinzip

pitpatagain 16 hours ago | parent | prev | next [-]

"As it does with its ADAS crash reporting, Tesla is hiding most details about the crashes. Unlike its competitors, which openly release narrative information about the incidents, Tesla is redacting all the narrative for all its crash reporting to NHTSA"

josefritzishere 16 hours ago | parent | prev | next [-]

TLDR: the article clearly states that Tesla misclassifies the severity of the accidents and redacts the narratives and disengagement data. Key sentence: "Tesla has never released any significant data to prove that its system is reliable."

guywithahat 16 hours ago | parent [-]

I read the article; it's just that the data was never, and is not, hidden. Data for vehicles that aren't fully autonomous isn't released, and they are releasing their fully autonomous data through the proper channels.

At no point was Tesla ever trying to "hide 3 robotaxi accidents", as the title claims (unless I'm missing something, but I don't think I am).

pitpatagain 14 hours ago | parent [-]

Read the table of examples in the article. Other companies report crashes with significant detail visible to the public that Tesla is redacting.

Compare Waymo report:

"On [XXX] at 10:31 PM PT a Waymo Autonomous Vehicle ("Waymo AV") operating in San Francisco, California was in a collision involving a scooterist on [XXX] at [XXX].

The Waymo AV was stopped at the curb facing north on [XXX] for a passenger drop-off when the passenger in the Waymo AV opened the rear right door. As the rear right door was being opened by the passenger, a scooter ....

Waymo is reporting this crash under Request No. 2 of Standing General Order 2021-01. Waymo may supplement or correct its reporting with additional information as it may become available."

Tesla's report is:

"[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]"

Tesla has consistently tried to have it both ways, saying they are "not autonomous" and therefore don't have to report, but then also claiming in other contexts that they are driving huge numbers of "autonomous" miles.

So now they finally do a handful of reports and it's all REDACTED? They are finally doing the bare minimum of what's required, while still not being forthcoming at all.
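
For anyone who wants to check, NHTSA publishes the Standing General Order incident data as CSV, so tallying redactions per company is a short script. This is only a sketch; the filename and column names below are my best guess from memory, so verify them against the actual file headers before trusting the output:

    # Rough sketch: tally how many SGO crash narratives each company redacts or leaves empty.
    # Filename and column names are guesses -- check the real NHTSA CSV headers first.
    import csv
    from collections import Counter

    redacted = Counter()
    total = Counter()

    with open("SGO-2021-01_Incident_Reports_ADS.csv", newline="") as f:
        for row in csv.DictReader(f):
            entity = row.get("Reporting Entity", "unknown")
            narrative = row.get("Narrative", "")
            total[entity] += 1
            if "REDACTED" in narrative.upper() or not narrative.strip():
                redacted[entity] += 1

    for entity in sorted(total):
        print(f"{entity}: {redacted[entity]}/{total[entity]} narratives redacted or empty")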

barbazoo 16 hours ago | parent | prev | next [-]

> Unlike competitors, such as Waymo, Tesla’s Robotaxi still uses a “safety monitor” who sits in the front seat with a finger on a kill switch ready to stop the vehicle. Despite this added level of safety, Tesla is evidently still experiencing crashes.

> CEO Elon Musk has claimed that Tesla would remove the safety monitor by the end of the year and deliver on its “full self-driving” promises to customers, but he has never shared any data proving that Tesla’s automated driving system is reliable enough to achieve that.

4d4m 13 hours ago | parent | prev | next [-]

Not great. I remember reading an article saying that the most successful self-driving company literally sued a DMV to keep its details under wraps! How's that for transparent, responsible behavior? It would be easy to say your regulators and politicians aren't just asleep at the wheel - they're helping run you over. Or is that hyperbolic?

tim333 3 hours ago | parent [-]

A bit hyperbolic. Over the longer term, self-driving tech will probably be a major benefit for road safety.

Excluding the Level 3, driver-is-supposed-to-be-in-charge stuff, I can recall one death in autonomous driving research - the one that ended Uber's program. For comparison, last year the US recorded 39,345 motor vehicle traffic fatalities.

its-kostya 16 hours ago | parent | prev | next [-]

Any "self driving" from Tesla carries large amount of risk because it uses _only_ cameras. Visual anomalies happen and without radar/lidar as a second source of truth, the vehicles will always sketch me out. Some say "separate the art from the artist" but at the end of the day, Elon's stubbornness to only use cameras is the reason many people are apprehensive to shell out money for the vehicle and especially the any autonomous driving capabilities.

Even if future vehicles DID have lidar, every vehicle up to now does not, and therefore will never be truly self-driving. Customers already paid for it on the promise that the vehicle hardware is capable. So either they will have to be refunded or retrofitted with new sensors - at Tesla's expense, I assume. Still no idea how they're valued so highly.
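
For what "second source of truth" means in practice, here's a toy cross-check (invented thresholds, purely illustrative) where a second ranging sensor lets the system notice when vision is probably wrong:

    # Toy redundancy check: if two independent sensors disagree badly, act conservatively.
    # Thresholds and behavior are invented for illustration, not any real system's logic.
    def cross_check(camera_range_m, radar_range_m, tolerance_m=5.0):
        if radar_range_m is None:
            return camera_range_m, "camera-only: no independent confirmation"
        if abs(camera_range_m - radar_range_m) > tolerance_m:
            return min(camera_range_m, radar_range_m), "disagreement: assume the nearer obstacle"
        return (camera_range_m + radar_range_m) / 2, "agreement"

    print(cross_check(60.0, 12.0))   # e.g. vision fooled by glare; radar still sees the obstacle
    print(cross_check(40.0, None))   # nothing to check against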

taylodl 16 hours ago | parent [-]

My benchmark for full self-driving is simple: if the manufacturer assumes full legal responsibility and liability for the vehicle’s actions, then it qualifies as autonomous. Otherwise, it’s just driver assistance and isn't capable of being an autonomous taxi.

jerlam 16 hours ago | parent [-]

This raises the question of whether the safety monitors in the Robotaxis are also there to take the blame for any of the vehicles' accidents.

taylodl 15 hours ago | parent [-]

I had never even thought of that! I had just assumed that Tesla would be responsible, but you know what they say about "assume"! Holy cow! Now you've got me thinking there's another reason not to use a Robotaxi: buried in the terms and conditions that nobody reads could be a statement where you take full responsibility for everything the Robotaxi does. Yikes!

TheAlchemist 16 hours ago | parent | prev | next [-]

I don't know when Tesla's valuation will crash and Musk will go bankrupt, but once it does, it will be one for the ages!

The company is still valued at >>1 trillion $, supposedly because they will soon roll out Robotaxis everywhere - to 50% of the US population before the end of the year, according to Musk!

Meanwhile, 3 months after the start of the operation, it's still open only to influencers, running with ~10 vehicles, and operating with a driver in the front seat...

This is so absurd that it could make us forget the 2 million Cybertruck orders, or the fact that all Teslas were supposed to become Robotaxis with an OTA update in 2020.

xnx 16 hours ago | parent | next [-]

It is ridiculous, but it is even odds that he'll just start promising something else to string people along: robots!, AGI!, Eloncoin!

lawn 7 hours ago | parent | prev [-]

If this was a fictional sci-fi story people would trash it as too unrealistic.

Simulacra 16 hours ago | parent | prev | next [-]

Well they're not hidden now?

Fricken 14 hours ago | parent | prev | next [-]

Magic didn't work; maybe having a Nazi as CEO will help get Tesla to SAE Level 4.
