Tesla 'Full Self-Driving' crashed through railroad gate seconds before train (electrek.co)
51 points by Bender 7 hours ago | 21 comments
LorenPechtel 5 hours ago | parent | next [-]

I suspect this comes down to the same problem we've seen in other forms--their system stinks at detecting that a stationary object is in the road.

AustinDev 5 hours ago | parent | next [-]

This is one thing LIDAR is pretty good at.

dzhiurgis 3 hours ago | parent [-]

Do people still believe this is the clutch?

I see way more crash compilations from Waymo than Tesla (despite Tesla having something like 300k FSD subscribers and over 1M outright purchasers).

Sure, LIDAR can fill maybe 5% of the gaps, but let's not pretend it's anything other than the underlying AI model doing the grunt work. Which raises the question of why Waymo hasn't scaled nationwide and why Cybercab hasn't ramped up yet. Neither is doing that amazingly.

jerlam an hour ago | parent | next [-]

Probably a selection effect. Tesla owners with FSD are often aware of its shortcomings and will not use it in situations where it wouldn't work, much less post clips of its mistakes online. People seem to agree it works fine on highways, where cars travel in consistent patterns.

Waymos are in the exact opposite situation. They only run in busy cities, so there are lots of bystanders to take a video of the situation, including the passenger, who has no incentive to hide the issue. A Waymo can't revert to a driver in the car when things get tough; it calls back to the monitoring center and comes to a halt, which draws further attention and mockery.

You cannot assume that online algorithms are giving you an unbiased, neutral view of the world. They are specifically tuned against that.

dzhiurgis 12 minutes ago | parent | next [-]

But the assertion is that FSD is more dangerous because people don't monitor the situation until it's too late.

Claiming there are no Teslas in busy cities is ridiculous.

Given all the scrutiny Tesla gets (good, it made them unstoppable), you'd expect all sorts of activists driving to Austin and literally crashing into robotaxis.

FireBeyond 22 minutes ago | parent | prev [-]

> will not use it in situations where it wouldn't work

Often "cannot". FSD will refuse to engage in those situations, often.

But Elon will trot out "so much safer", omitting "in some conditions, on some roads, in some weather", versus "all drivers, all conditions, all roads, all weather".

"You see, we win the vast majority of games when we just don't play the ones we thought we might lose!"

fooblaster 3 hours ago | parent | prev [-]

Tesla has not pulled the driver. It's just not comparable.

dzhiurgis 16 minutes ago | parent | next [-]

Says 11 vehicles unsupervised: https://robotaxitracker.com/?provider=tesla

dangus 2 hours ago | parent | prev [-]

Their website now prominently states “supervised” since they got into so much hot water overselling the capabilities.

Tesla FSD is really in a pointless middle ground where the steep $99/month they ask for it is just not worth it.

It does basically nothing for you on the highway to alleviate fatigue above and beyond a standard adaptive cruise control system you can find in a Volkswagen Jetta.

FSD on city streets is not autonomous enough to go unsupervised, so for the 10-20 minutes people typically spend driving in city traffic before reaching their destination, it doesn't save much effort over just…driving yourself.

I would think that if I owned a car that wasn't an old-ass beater like mine, I would mainly benefit from adaptive cruise control on long trips and perhaps some convenience stuff like automatic parking.

dzhiurgis 10 minutes ago | parent [-]

What is the point of such trolling?

FireBeyond 24 minutes ago | parent | prev [-]

Tesla has always had a weirdness with trains. A couple of years ago in Pennsylvania, I watched, bemused, as a train rolled by at a crossing (we were driving manually). On the car's display, it looked like an erratic convoy of trucks, depending on whether there was a container on each car or not.

Tesla stans will say "well, just because it doesn't visualize the train properly doesn't mean it doesn't know it's a train", but shit like this today just reinforces that that's garbage.

I still want to see how Tesla does in my town, where there's a fun intersection: four lanes heading west hit a T. Drivers can turn north or south, but there are only two lanes on the north-south road, so the signals are sequenced: the left two lanes can turn north or south, and then the right two lanes can (i.e. staggered, so drivers in the left two lanes turning north don't hit drivers in the right two turning south, and nobody has to merge four lanes into two while turning).

I guarantee FSD would absolutely shit the bed (sorry, I mean, "disengage" to preserve Elon's stats, I mean "your safety") on this intersection.

It's not ready for primetime. And it's still not close.

qwerpy 3 hours ago | parent | prev | next [-]

My gated community has a gate similar to a railroad gate. My FSD 12 HW3 Model Y cannot be trusted at it. My FSD 14 HW4 Cybertruck does fine, except when another car is in front of me: then it tries to tailgate the car through. Strangely, the Y has the ultrasonic distance sensors and the Cybertruck does not. The truck seems to handle the gate detection but doesn't understand the rule that only one car can go through at a time.

That being said, if I were first in line at a railroad crossing I think I’d disengage FSD to be safe. If I were in a Waymo I’d be very nervous. LiDAR or not, an error can be catastrophic.

strogonoff 3 hours ago | parent [-]

If one claims that an error at a railroad gate can be catastrophic and therefore FSD should be disabled in that situation, how does one ethically reconcile that with enabling FSD on any regular street with pedestrians?

The principal difference that comes to mind is that in the latter case it would be catastrophic to others as opposed to yourself: you are the train in that situation, except pedestrians have no airbags and without the railroad gate equivalent they are not made aware of taking this risk.

qwerpy 3 hours ago | parent [-]

That’s a very interesting way to look at it! But my reasoning for continuing to do what I do is that FSD is bad at thin gates and much better at avoiding pedestrians. So it’s not an all or nothing thing for me.

6 hours ago | parent | prev | next [-]
[deleted]
dzhiurgis 4 hours ago | parent | prev [-]

[flagged]

novemp 4 hours ago | parent | next [-]

Apr 15 2026

dzhiurgis 3 hours ago | parent [-]

That video was posted to r/teslaFSD weeks ago.

steve_adams_86 3 hours ago | parent [-]

I can't seem to find any evidence of that. The oldest reference I can find is less than a week ago: https://www.reddit.com/r/TeslaFSD/comments/1shqut4/hw3_vehic...

Maybe I'm not sleuthing hard enough. Most reporting I can find on it is from today.

dzhiurgis 10 minutes ago | parent [-]

So if you read the link that you posted, there's a link to the poster's profile:

https://www.facebook.com/wildduce22/posts/pfbid0reXge89aqZGa...

April 10 at 3:34 AM

thunderfork 4 hours ago | parent | prev [-]

Still on the road, just happened, still news. It wouldn't be news if there weren't alpha-test vehicles with real people in them on real roads, but there are, so...