Tesla changes meaning of 'Full Self-Driving', gives up on promise of autonomy(electrek.co)
514 points by MilnerRoute 6 days ago | 125 comments
keeda 5 days ago | parent | next [-]

I strongly believe LIDAR is the way to go and that Elon's vision-only move was extremely "short-sighted" (heheh). There are many reasons, but the one that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason.

This is because the vision system thinks there is something obstructing its view when in reality it is usually bright sunlight -- and sometimes, absolutely nothing that I can see.

The wipers are, of course, the most harmless way this goes wrong. The more dangerous type is when it phantom-brakes at highway speeds with no warning on a clear road and a clear day. I've had multiple other scary incidents of different types (swerving back and forth at exits is a fun one), but phantom braking is the one that happens quasi-regularly. Twice when another car was right behind me.

As an engineer, this speaks volumes to me about what's going on in the computer vision system, and it's pretty scary. Basically, the system detects patterns that it interprets as its vision being obstructed, and so it is programmed to brush away some (non-existent) debris. Like, it thinks there could be a physical object where there is none. If this were an LLM you would call it a hallucination.

But if it's hallucinating crud on a windshield, it can also hallucinate objects on the road. And it could be doing it every so often! So maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking. And those filters are pretty damn good -- I mean, the technology is impressive -- but they can probabilistically fail, resulting in things that we've already seen, such as phantom braking, or worse, driving through actual things.
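A toy sketch of what such a guardrail filter might look like. Everything here (names, thresholds) is hypothetical, not Tesla's actual pipeline; it just illustrates the trade-off being speculated about:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float  # 0..1 score from the vision model
    height_m: float    # estimated physical height of the object

def plausibility_filter(detections, min_conf=0.7, min_height_m=0.15):
    """Drop low-confidence or implausibly small detections.

    The failure modes fall straight out: a hallucination scored above the
    thresholds still triggers phantom braking, while a real object scored
    below them gets driven through.
    """
    return [d for d in detections
            if d.confidence >= min_conf and d.height_m >= min_height_m]

dets = [
    Detection("debris", 0.95, 0.40),      # kept -> brake
    Detection("glare-blob", 0.55, 2.00),  # filtered -> no phantom brake
    Detection("tire-shred", 0.72, 0.05),  # filtered despite high confidence
]
print([d.label for d in plausibility_filter(dets)])  # ['debris']
```

However the thresholds are tuned, some mass of real objects and hallucinations ends up on the wrong side of them.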

This raises so many questions: What other things is it hallucinating? And how many hardcoded guardrails are in place against these edge cases? And what else can it hallucinate against which there are no guardrails yet?

And why not just use LIDAR that can literally see around corners in 3D?

jqpabc123 5 days ago | parent | next [-]

Engineering reliability is primarily achieved through redundancy.

There is none with Musk's "vision only" approach. Vision can fail for a multitude of reasons --- sunlight, rain, darkness, bad road markers, even glare from a dirty windshield. And when it fails, there is no backup plan -- the car is effectively driving blind.

Driving is a dynamic activity that involves a lot more than just vision. Safe automated driving can use all the help it can get.

CjHuber 5 days ago | parent | prev | next [-]

Just imagine if Tesla had subsidized LiDAR on every car they ship, passively collecting data. Wow, that dataset would be crazy, and it would even improve their vision models by providing ground truth to train on. He's such a moron

amelius 5 days ago | parent | prev | next [-]

Your comparison to hallucination is spot on.

LLMs have shown the general public how AI can be plain wrong and shouldn't be trusted for everything. Maybe this influences how they, and regulators, will think about self driving cars.

danans 5 days ago | parent | prev | next [-]

> And why not just use LIDAR that can literally see around corners in 3D?

Based on what I've read over the years: it costs too much for a consumer vehicle, it creates unwanted "bumps" in the vehicle visual design, and the great man said it wasn't needed.

Yes, those reasons are not for technology or safety. They are based on cost, marketing, and personality (of the CEO and fans of the brand).

beng-nl 5 days ago | parent | prev | next [-]

I’ve always wondered about LiDAR - how can multiple units sweep a scene at the same time (as would be the case for multiple cars driving close together, all using lidar)? One unit can’t distinguish return signals between itself and other units, can it?

fossuser 5 days ago | parent | prev | next [-]

I use FSD in my Model S daily to commute from SF to Palo Alto, along with most of my other Bay Area driving. It currently does a better job than most people, and it drives me 95% of the time now. I haven't had the phantom braking.

I'm in a 2025 with HW4, but its dramatic improvement over the last couple of years (I previously had a 2018 Model 3) increased my confidence that Elon was right to focus on vision. It wasn't until late last year that I found myself using it more than not; now I use it almost every drive, point to point (Cupertino to SF), and it does it.

I think people are generally sleeping on how good it is, and the politicization means people are undervaluing it for stupid reasons. I wouldn't consider a non-Tesla because of this (unless it was a stick-shift sports car, but that's for different reasons).

Their lead is so crazy far ahead it's weird to see this reality and then see the comments on hn that are so wrong. Though I guess it's been that way for years.

The position against lidar was that it traps you in a local max, that humans use vision, that roads and signs are designed for vision so you're going to have to solve that problem and when you do lidar becomes a redundant waste. The investment in lidar wastes time from training vision and may make it harder to do so. That's still the case. I love Waymo, but it's doomed to be localized to populated areas with high-res mapping - that's a great business, but it doesn't solve the general problem.

If Tesla keeps jumping on the vision lever and solves it they'll win it all. There's nothing in physics that makes that impossible so I think they'll pull it off.

I'd really encourage people here with a bias to dismiss to ignore the comments and just go try it out for yourself in real life.

teleforce 4 days ago | parent | prev | next [-]

>why not just use LIDAR that can literally see around corners in 3D?

LIDAR requires line-of-sight (LoS), hence cannot see around corners, but RADAR probably can.

It's interesting to note that the all time 2nd most popular post on Tesla is 9 years ago on its full self driving hardware (just 2nd after the controversial Cybertruck) [1].

>Elon's vision-only move was extremely "short-sighted"

Elon's vision was misguided because some of the technologists at the time, including him, seem to have truly believed that AGI was just around the corner (pun intended). Now most of the tech people have given up on the AGI claim, blaming the blurry definition of AGI, but for me the true killer AGI application has always been fully autonomous level 5 driving with only human-level sensor perception, minus the LIDAR and RADAR. But the goal is so complex that I truly believe it will not be achieved in the foreseeable future.

[1] All Tesla Cars Being Produced Now Have Full Self-Driving Hardware (2016 - 1090 comments):

https://news.ycombinator.com/item?id=12748863

UltraSane 5 days ago | parent | prev | next [-]

Camera-only might work better if you used regular digital cameras along with more advanced cameras like event-based cameras, which send pixels as soon as they change brightness and have microsecond latency, and/or Single Photon Avalanche Diode (SPAD) sensors, which can detect single photons. Having the same footage from all 3 of these would enable some fascinating training options.

But Tesla didn't do this.

cuttothechase 5 days ago | parent | prev | next [-]

Cannot agree more on this phantom braking.

I rented a Tesla a while back and drove from the bay to Death Valley. On clear roads with no hazards whatsoever, the car hit the brakes at highway speeds. It scared the bejeesus out of me! It completely put me off the auto drive and derailed my plans to buy a Tesla.

duxup 5 days ago | parent | prev | next [-]

The around-corners thing: when I saw demos of it seeing vehicles the driver can't even see... I wanted it for my non-self-driving car. It's just too big of an advantage to skimp out on.

JumpCrisscross 5 days ago | parent | prev | next [-]

> maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking

The filters introduce the problem of incorrectly deleting something that really is there.

paradox460 5 days ago | parent | prev | next [-]

What's oddest about the wiper tech is that we've had automated wipers since at least the 70s. As a kid, my neighbor's Cadillac had them.

tl;dr: you can use optics to determine if there's rain on a surface, from below, without having to use any fancy cameras or anything, just a light source and light sensor.

If you're into this sort of thing, you can buy these sensors and use them as a rain sensor, either as a binary "yes, it's rained" or as a tipping-bucket replacement: https://rainsensors.com
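The decision logic is trivially simple once you have the optics. A toy sketch with made-up ADC numbers:

```python
# An IR LED fires into the windshield at an angle; dry glass totally
# internally reflects the beam back to a photodiode, while water on the
# outer surface lets light escape, so the return signal drops.
DRY_BASELINE = 1000    # photodiode ADC counts on dry glass (hypothetical)
RAIN_THRESHOLD = 0.85  # wipe when the return falls below 85% of baseline

def rain_detected(photodiode_counts: int) -> bool:
    return photodiode_counts < DRY_BASELINE * RAIN_THRESHOLD

print(rain_detected(990))  # False: dry glass, strong reflection
print(rain_detected(620))  # True: droplets frustrating the reflection
```

No cameras, no inference, no hallucinations -- just a threshold on a physical measurement.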

mellosouls 5 days ago | parent | prev | next [-]

> Elon's vision-only move was extremely "short-sighted" (heheh)

Careful. HN takes a dim view of puns.

fred_is_fred 5 days ago | parent | prev | next [-]

"There are many reasons but that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason."

Self-starting wipers use some kind of current/voltage measurement on the windshield, right -- unrelated to self-driving? It's been around longer than Tesla -- or are you just saying it's another random failure?

debo_ 5 days ago | parent | prev | next [-]

I upvoted this just for "short-sighted."

diebeforei485 4 days ago | parent | prev | next [-]

How does lidar see around corners?

maxlin 5 days ago | parent | prev | next [-]

This and that. FUD this, FUD that. Tesla has communicated clearly why "adding" LiDAR isn't an improvement for a system with goals as high as theirs. Remember, no vision system yet is as good as humans are with vision, so obviously there's a lot to do with vision still.

Check this for a reference of how well Tesla's vision-only fares against the competition, where many have LiDAR. Keep it simple wins the game. https://www.youtube.com/watch?v=0xumyEf-WRI

moralestapia 5 days ago | parent | prev | next [-]

>Elon's vision-only move was extremely "short-sighted"

It wasn't Elon's but Karpathy's.

qoez 5 days ago | parent | prev | next [-]

Not sure it was actually Elon's move though, I heard it was mainly a decision taken by Andrej Karpathy

weinzierl 5 days ago | parent | prev | next [-]

I think Elon's prediction was that LIDAR was too expensive and would stay too expensive. In a sense he was right: LIDAR prices did not drop, and I wonder why that is.

enslavedrobot 5 days ago | parent | prev | next [-]

Are you referring to Autopilot or FSD? Phantom braking has been a solved problem since the release of V12 FSD. As soon as a vision-based car is safer than a human, its flaws don't matter, because it will save lives.

Supervised FSD is already safer than a human.

jillesvangurp 5 days ago | parent | prev | next [-]

Lidar is great for object detection. But it's not great for interpreting the objects. It will stop you crashing into a traffic light. But it won't be able to tell the color of the light. It won't see the stripes on the road. It won't be able to tell signs apart. It won't enable AIs to make sense of the complex traffic situations.

And those complex traffic situations are the main challenge for autonomous driving. Getting the AIs to do the right things before they get themselves into trouble is key.

Lidar is not a silver bullet. It helps a little bit, but not a whole lot. It's great when the car has to respond quickly to get it out of a situation that it shouldn't have been in to begin with. Avoiding that requires seeing and understanding and planning accordingly.

alex1138 5 days ago | parent | prev | next [-]

I've defended some of Musk because I think what he did for Twitter was completely necessary (showing Jay Bhattacharya that the old regime had put him on a trends blacklist, and all the other people who got banned for no reason). But things like this are alarming (vision only, as opposed to multiple telemetry), and Tesla has already been accused of killing people through crashes. It's kind of amazing he's in charge of something like SpaceX (are we about to witness a fatal incident in space?).

gcanyon 5 days ago | parent | prev | next [-]

The wiper system has nothing to do with self-driving -- it's based on total internal reflection in the glass: https://www.youtube.com/watch?v=TLm7Q92xMjQ

torginus 5 days ago | parent | prev | next [-]

The mistakes you describe are the issues of the AI system controlling the car, not of the cameras themselves. If you were watching the camera feed and teleoperating the vehicle, no way you'd phantom brake at a sudden bit of glare.

chippiewill 5 days ago | parent | prev | next [-]

As someone who worked in this space, you are absolutely right, but also kind of wrong - at least in my opinion.

The cold hard truth is that LIDARs are a crutch; they're not strictly necessary. We know this because humans can drive without a LIDAR. However, they are a super useful crutch: they give you super high positional accuracy (something that's not always easy to estimate in a vision-only system). Radars are also a super useful crutch because they give really good radial velocity. (Little anecdote: when we finally got the radars working properly at work, it made a massive difference to our car's ability to follow other cars, ACC, in a comfortable way.)

Yes, machine learning vision systems hallucinate, but so do humans. The trick for Tesla would be to get it good enough that it hallucinates less than humans do (they're nowhere near yet -- humans don't hallucinate very often).

It's also worth adding that, last I checked, the state of the art for object detection is early fusion, where you chuck the LIDAR and radar point clouds into a neural net along with the camera input, so it's not like you'd necessarily have the classical-method guardrails with the LIDAR anyway.
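A toy illustration of what early fusion means in practice, with made-up shapes, intrinsics, and points, and a simplified pinhole projection:

```python
import numpy as np

# Lidar points are projected into the camera frame and stacked as a sparse
# depth channel, so a single network sees both modalities at once.
H, W = 4, 6
rgb = np.random.rand(H, W, 3).astype(np.float32)

# Fake lidar returns in camera coordinates (x right, y down, z forward, meters)
points = np.array([[0.5, 0.2, 10.0], [-0.3, 0.1, 25.0]])
fx = fy = 5.0          # toy focal lengths in pixels
cx, cy = W / 2, H / 2  # principal point

depth = np.zeros((H, W, 1), dtype=np.float32)
for x, y, z in points:
    u = int(fx * x / z + cx)  # pinhole projection to pixel column
    v = int(fy * y / z + cy)  # ...and pixel row
    if 0 <= u < W and 0 <= v < H:
        depth[v, u, 0] = z

fused = np.concatenate([rgb, depth], axis=-1)  # (H, W, 4) tensor fed to the net
print(fused.shape)
```

The point is that the filtering happens inside the learned model rather than in hand-written rules layered on top of a lidar track list.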

Anyway, I don't think Tesla were wrong to not use LIDAR - they had good reasons to not go down that route. They were excessively expensive and the old style spinning LIDARs were not robust. You could not have sold them on a production car in 2018. Vision systems were improving a lot back then so the idea you could have a FSD on vision alone was plausible.

zpeti 5 days ago | parent | prev [-]

If a human brain can tell the difference between sun glare and an object, machine learning certainly can.

It’s already better at X-rays and radiology in many cases.

Everything you are talking about is just a matter of sufficient learning data and training.

dlcarrier 5 days ago | parent | prev | next [-]

This looks to me like they are acknowledging that their claims were premature, possibly due to claims of false advertising, but are otherwise carrying forward as they were.

Maybe they'll reach level 4 or higher automation, and will be able to claim full self driving, but like fusion power and post-singularity AI, it seems to be one of those things where the closer we get to it, the further away it is.

sschueller 5 days ago | parent | next [-]

Premature? Is that what we call this now? It's straight up fraud!

Others are in prison for far less.

dreamcompiler 5 days ago | parent | prev | next [-]

Not gonna happen as long as Musk is CEO. He's hard over on a vision-only approach without lidar or radar, and it won't work. Companies like Waymo that use these sensors and understand sensor fusion are already eating Tesla's lunch. Tesla will never catch up with vision alone.

crooked-v 5 days ago | parent | prev | next [-]

So does anyone who previously bought it on claims that actual full self-driving would be "coming soon" get refunds?

epolanski 5 days ago | parent | prev | next [-]

This is fraud he went in front of investors and said multiple times it was around the corner.

He told consumers: just buy the car and it will come with an update. It didn't.

This is a scam, end of story.

7 years of it.

jojobas 5 days ago | parent | prev | next [-]

>false advertising

I think you mean "securities fraud", at gargantuan scale at that. Theranos and Nikola were nowhere near that scale.

gitaarik 5 days ago | parent | prev | next [-]

But they're changing the meaning of FSD to FSD (Supervised). So that means they no longer make any promises of unsupervised FSD in the future. They'll of course say that they keep working on it and that stuff is progressing, but they don't have to deliver anymore. Just like they tell people getting into accidents that they should keep their hands on the wheel, or else it's their own responsibility.

jeffbee 5 days ago | parent | prev | next [-]

> Maybe they'll reach level 4 or higher automation

There is little to suggest that Tesla is any closer to level 4 automation than Nabisco is. The Dojo supercomputer that was going to get them there? Never existed.

standardUser 5 days ago | parent | prev | next [-]

What does Waymo lack in your opinion to not be considered "full self driving"?

The persistent problem seems to be severe weather, but the gap between the weather a human shouldn't drive in and weather a robot can't drive in will only get smaller. In the end, the reason to own a self-driven vehicle may come down to how many severe weather days you have to endure in your locale.

matthewdgreen 4 days ago | parent | prev | next [-]

Do Waymos (without safety driver in the car) count as FSD?

bradhe 5 days ago | parent | prev [-]

> This looks to me like they are acknowledging that their claims were premature, possibly due to claims of false advertising, but are otherwise carrying forward as they were.

Delusionally generous take. Perhaps even zealotry.

an0malous 5 days ago | parent | prev | next [-]

How have they gotten away with such obvious false advertising for this long? It's undeniably misled customers and inflated their stock value

dreamcompiler 5 days ago | parent | next [-]

Normally the Board of Directors would fire any CEO who destroyed as much of the company's value as Musk has. But Tesla's board is full of Musk sycophants and family members who refuse to stand up to him.

Eddy_Viscosity2 5 days ago | parent | prev [-]

Who was going to stop them from lying?

AbrahamParangi 5 days ago | parent | prev | next [-]

I use self-driving every single day in Boston and I haven’t needed to intervene in about 8 months. Most interventions are due to me wanting to go a different route.

Based on the rate of progress alone I would expect functional vision-only self-driving to be very close. I expect people will continue to say LIDAR is required right up until the moment that Tesla is shipping level 4/5 self-driving.

rogerrogerr 5 days ago | parent | next [-]

Same experience in a mix of city/suburban/rural driving, on a HW3 car. Seeing my car drive itself through complex scenarios without intervention, and then reading smart people saying it can’t without hardware it doesn’t have, gives major mental whiplash.

rootusrootus 5 days ago | parent | prev | next [-]

I would like to get my experience more in line with yours. I can go a few miles without intervention, but that's about it, before it does something that will result in damage if I don't take over. I'm envious that some people can go months when I can't go a full day.

FollowingTheDao 5 days ago | parent | prev | next [-]

Self driving is not the same as "autonomy". Musk lied to everyone with the Tesla self driving, the Boring Company, DOGE...wake up people...

herbturbo 5 days ago | parent | prev | next [-]

> Based on the rate of progress alone I would expect functional vision-only self-driving to be very close.

So close yet so far, which is ironically the problem vision based self-driving has. No concrete information just a guess based on the simplest surface data.

potato3732842 5 days ago | parent | prev [-]

On a scale from "student driver" to "Safelite guy (or any other professional who drives around as part of their job) running late," how does it handle Storrow and similar roads?

Like does it get naively caught in stopped traffic for turns it could lane change out or does it fucking send it?

Nitsua007 5 days ago | parent | prev | next [-]

Small correction: LiDAR can’t literally see around corners — it’s still a line-of-sight sensor. What it can do is build an extremely precise 3D point cloud of what it can see, in all lighting conditions, and with far less susceptibility to “hallucinations” from things like glare, shadows, or visual artifacts that trip up purely vision-based systems.

The problem you’re describing — phantom braking, random wiper sweeps — is exactly what happens when the perception system’s “eyes” (cameras) feed imperfect data into a “brain” (compute + AI) that has no independent cross-check from another modality. Cameras are amazing at recognizing texture and color but they’re passive sensors, easily fooled by lighting, contrast, weather, or optical illusions. LiDAR adds active depth sensing, which directly measures distance and object geometry rather than inferring it.

But LiDAR alone isn’t the endgame either. The real magic happens in sensor fusion — combining LiDAR, radar, cameras, GNSS, and ultrasonic so each sensor covers the others’ blind spots, and then fusing data at the perception level. This reduces false positives, filters out improbable hazards before they trigger braking, and keeps the system robust in edge cases.

And there’s another piece that rarely gets mentioned in these debates: connected infrastructure. If the vehicle can also receive data from roadside units, traffic signals, and other connected objects (V2X), it doesn’t have to rely solely on its onboard sensors. You’re effectively extending the vehicle’s situational awareness beyond its physical line of sight.

Vision-only autonomy is like trying to navigate with one sense while ignoring the others. LiDAR + fusion + connectivity is like having multiple senses and a heads-up from the world around you.

moomoo11 4 days ago | parent [-]

Honestly at that point of complexity I hope automakers just quit chasing FSD and go back to making actually good cars again.

Let the automated trucks figure it out if it’s an actual problem worth solving or we can just use trains or let truck driving be a decent middle class job.

jesenpaul 5 days ago | parent | prev | next [-]

They made tons of money on the Scam of the Decade™ from Oct 2016 (see their "Driver is just there for legal reasons" video) to Apr 2024 (when they officially changed it to Supervised FSD), and now it's not even that.

mettamage 5 days ago | parent | prev | next [-]

I’m not surprised. As a former Elon fan, it never struck me that he thought about this from first principles, whereas for SpaceX he did.

For as long as we can’t understand AI systems as well as we understand normal code, first principles thinking is out of reach.

It may be possible to get FSD another way but Elon’s edge is gone here.

fsmv 5 days ago | parent [-]

SpaceX is a success despite Elon. Maybe setting an extremely lofty goal helped somewhat but Gwynne Shotwell and all the actual engineers at SpaceX deserve the credit for their success.

goloroden 5 days ago | parent | prev | next [-]

I think I’d call what Tesla did fraud. Or scam. Or both.

ciconia 5 days ago | parent | prev | next [-]

War Is Peace. Freedom Is Slavery. Ignorance Is Strength. FSD is... whatever Elon says it is.

smoovb 3 days ago | parent | prev | next [-]

Here's what Claude has to say about electrek.co:

Tesla Headlines Sentiment Analysis - Electrek.co

Bottom Line: Strongly Negative Sentiment. Based on analysis of Tesla headlines and articles from Electrek over the past few months, the sentiment is overwhelmingly negative (approximately 85% negative, 10% neutral, 5% positive). The coverage reveals a company in decline across multiple fronts.

IgorPartola 5 days ago | parent | prev | next [-]

I don't need self-driving cars that can navigate alleys in Florence, Italy and also parkways in New England. Here is what we really need: put transponders into the roadway on freeways and use those for navigation and lane positioning. Then you would be responsible for getting onto the freeway and getting off at your exit, but could take a nap in between. This would be something done by the DOT, supported by all carmakers, and benefiting everyone. LIDAR could be used for obstacle detection but not for navigation. And whoever figures out how to do the transponders, lands a government contract, and gets at least one major car manufacturer on board would make bank.

hedora 5 days ago | parent | next [-]

We live in an area with sort of challenging roads, and I strongly disagree.

There’s an increasing number of drivers that can barely drive on the freeways. When they hit our area they cannot even stay on their side of the road, slow down for blind curves (when they’re on the wrong side of the road!), maintain 50% the normal speed of other drivers, etc. I won’t order uber or lyft anymore because I inevitably get one of these people as my driver (and then watch them struggle on straight stretches of freeway).

Imagine how much worse this will get when they start exclusively using lane keeping on easy roads. It’ll go from “oh my god I have to work the round wheel thingy and the foot levers at the same time!” to “I’ve never steered this car at speeds above 11”.

I’d much rather self driving focused on driving safely on challenging roads so that these people don’t immediately flip their cars (not an exaggeration; this is a regular occurrence!) when the driver assistance disables itself on our residential street.

I don’t think addressing this use case is particularly hard (basically no pedestrians, there’s a double yellow line, the computer should be able to compute stopping distance and visibility distance around blind curves, typical speeds are 25mph, suicidal deer aren’t going to be the computer’s fault anyway), but there’s not much money in it. However, if you can’t drive our road, you certainly cannot handle unexpected stuff in the city.

heeton 5 days ago | parent | prev | next [-]

We already have transponders on freeways. They’re technically passive reflectors, but they reflect a high proportion of incident EM waves, in the visible spectrum, and exist between lanes on every major road in the US. Also known as white paint.

gilbetron 5 days ago | parent | prev | next [-]

Following roads and lane markers and signs and signals is the "easy" part of autonomous driving. You could do everything you say and it wouldn't result in something that is any better than the current state of the art. Dealing with others on the road is the main problem (weather comes in close second). Your solution solves nothing, I'm afraid.

randunel 5 days ago | parent | prev [-]

How would you know which signals to trust and which to ignore?

shadowgovt 5 days ago | parent | prev | next [-]

"Full Self Driving (Supervised)." In other words: you can take your mind off the road as long as you keep your mind on the road. Classic.

Tesla is kind of a joke in the FSD community these days. People working on this problem a lot longer than Musk's folk have been saying for years that their approach is fundamentally ignoring decades of research on the topic. Sounds like Tesla finally got the memo. I mostly feel sorry for their engineers (both the ones who bought the hype and thought they'd discover the secret sauce that a quarter-century-plus of full-time academic research couldn't find and the old salts who knew this was doomed but soldiered on anyway... but only so sorry, since I'm sure the checks kept clearing).

arijun 5 days ago | parent | next [-]

Until very recently I worked in the FSD community, and I wouldn't say I viewed it as a joke. I don't know if I believed they would get to level 5 without any lidar, but it's pretty good for what's available in the consumer market.

FireBeyond 4 days ago | parent | prev [-]

> In other words: you can take your mind off the road as long as you keep your mind on the road.

They literally did this with Summon. "Have your car come to you while dealing with a fussy child" - buried far further down the page in light grey, "pay full attention to the vehicle at all times" (you know, other than your "fussy child").

guluarte 5 days ago | parent | prev | next [-]

I thought we would have almost AGI by now? https://x.com/elonmusk/status/1858747684972048695

d_sem 5 days ago | parent | prev | next [-]

My experience working at an automotive supplier suggests that Tesla engineers must have always known this, and the real strategy was to provide the best ADAS experience with the cheapest sensor architecture. They certainly did achieve that goal.

There were aspirations that the bottom up approach would work with enough data, but as I learned about the kind of long tail cases that we solved with radar/camera fusion, camera-only seemed categorically less safe.

An easy edge case: a self-driving system cannot be inoperable due to sunlight or fog.

A more HackerNews-worthy consideration: calculate the angular pixel resolution required to accurately range and classify an object 100 meters away (roughly the distance needed to safely stop if you're traveling 80 mph). Now add a second camera for stereo and calculate the camera-to-camera extrinsic sensitivity you'd need to stay within to keep error sufficiently low in all temperature/road-condition scenarios.
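A rough back-of-envelope version of that calculation. All numbers here are illustrative assumptions, not any vendor's spec:

```python
import math

# How many horizontal pixels do you need to put ~20 px across a 0.5 m
# obstacle at 100 m with a wide forward camera?
obj_size_m = 0.5
distance_m = 100.0
pixels_on_target = 20  # rough minimum to classify, not just detect

angular_size_deg = math.degrees(math.atan(obj_size_m / distance_m))
deg_per_pixel = angular_size_deg / pixels_on_target
fov_deg = 120.0
horizontal_pixels = fov_deg / deg_per_pixel
print(f"~{horizontal_pixels:.0f} px across a {fov_deg:.0f}-degree FOV")

# Stereo depth error grows with the square of distance:
#   sigma_z ~= z^2 / (baseline * focal_px) * disparity_error_px
baseline_m = 0.2         # camera separation
focal_px = 2000.0
disparity_err_px = 0.5   # half-pixel matching/calibration error
sigma_z = distance_m**2 / (baseline_m * focal_px) * disparity_err_px
print(f"depth uncertainty at {distance_m:.0f} m: ~{sigma_z:.1f} m")
```

Thousands of pixels of horizontal resolution just for one camera, and meters of stereo depth uncertainty at range even with optimistic calibration.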

The answer is: screw that, I should just add a long range radar.

There are just so many considerations that show you need a multi-modality solution, and using human biology as a what-about-ism doesn't translate to currently available technology.

brandonagr2 4 days ago | parent [-]

Tesla does not use stereo/binocular vision. That's not how humans perceive relative motion at that distance either; we depend on perspective and parallax.

briandw 5 days ago | parent | prev | next [-]

Lidar is the first thing brought up in these discussions, but lidar isn't that great a sensor. It does one thing well, and that's measure distance. A visual sensor can be measured along the axes of spatial resolution (x, y, z), temporal resolution (fps), and dynamic range (bit depth); you could add things like light frequency and phase, etc. Lidar is quite poor in all of these except the spatial z dimension, measuring distance, as mentioned before. Compared to a cheap camera the fps is very low, and the spatial resolution in x and y is pathetic: 128 lines in the vertical, higher horizontally, but it's not megapixels. Finally, the dynamic range is 1 bit (something is there or not). Lidars use near infrared and are just as susceptible to problems with natural fog (not the theatrical fog like in that Rober video) and rain. Multiple cameras can do good-enough depth estimation with modern neural networks. But cameras are vastly better at making sense of the world. You can't read a sign with lidar.

smilekzs 4 days ago | parent [-]

Lidars have been reporting per-point intensity values for quite a while. The dynamic range is definitely not 1 bit.

Many Lidar visualization software will happily pseudocolor the intensity channel for you. Even with a mechanically scanning 64-line Lidar you can often read a typical US speed limit sign at ~50 meter in this view.

AndrewKemendo 5 days ago | parent | prev | next [-]

Karpathy should be held liable for this (maybe less than Musk) but he should at least be considered persona non grata for pushing it.

It was his idea, his decision to build the architecture and he led the entire vision team during this.

Yet he remains free from any of this fallout and is still widely considered an ML god.

https://youtu.be/3SypMvnQT_s?si=FDmyA6amWnDpMPEj

bmitc 5 days ago | parent [-]

Silicon Valley tech workers and companies are not known for their morals.

starchild3001 5 days ago | parent | prev | next [-]

Feels like Musk should step down from the CEO role. The company hasn’t really delivered on its big promises: no real self-driving, Cybertruck turned into a flop, the affordable Tesla never materialized. Model S was revolutionary, but Model 3 is basically a cheaper version of that design, and in the last decade there hasn’t been a comparable breakthrough. Innovation seems stalled.

At this point, Tesla looks less like a disruptive startup and more like a large-cap company struggling to find its next act. Musk still runs it like a scrappy startup, but you can’t operate a trillion-dollar business with the same playbook. He’d probably be better off going back to building something new from scratch and letting someone else run Tesla like the large company it already is.

xpe 5 days ago | parent | next [-]

This is not a heavily researched comment, but it seems to me that the Model 3 is relatively affordable, at least compared to the options available at the time. It depends on your point of comparison: there is a lot of competition, for sure. The Model 3 was successful to a good degree, don't you think? I mean, we should put a number on it so we're not just comparing feels. The Model Y sold well too, at least until the DOGE insanity.

DoesntMatter22 5 days ago | parent | prev | next [-]

They went from no revenue to the 9th most valuable company in the world under him. No vehicle sales to having the best selling vehicles in the world.

They are still profitable, have very little debt and a ton of money into the bank.

Every company has hits and misses. Bezos started before Musk and still hasn't gotten his rockets into orbit.

sidcool 5 days ago | parent | prev | next [-]

Tesla haters tend to just move the goal posts.

derefr 5 days ago | parent | prev [-]

Daily reminder that Tesla is not — nor was ever intended to be — a car company. Tesla is fundamentally an "energy generation and storage" (battery/supercapacitor) company. Given Tesla's fundamentals (the types of assets they own, the logistics they've built out), the Powerwall and Megapack are closer to Tesla's core product than the cars are. (And they also make a bunch of other battery-ish things that have no consumer names, just MIL-SPEC procurement codes.)

Yes, right now car sales make up 78% of Tesla's revenue. But cars have 17% margins. The energy-storage division, currently at 10% of revenue, has more like 30% margins. And the car sales are falling as the battery sales ramp up.
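Taking those percentages at face value, the gross-profit mix is easy to sanity-check (back-of-the-envelope arithmetic, not Tesla's actual financials):

```python
# Gross-profit contribution per dollar of revenue, using the figures quoted
# above (78% of revenue at 17% margin for cars; 10% of revenue at 30% margin
# for energy storage). Illustrative arithmetic only.
auto_share, auto_margin = 0.78, 0.17
energy_share, energy_margin = 0.10, 0.30

auto_profit = auto_share * auto_margin        # ~0.133 per revenue dollar
energy_profit = energy_share * energy_margin  # ~0.030 per revenue dollar

# Cars still supply ~4.4x the gross profit today; at these margins the energy
# division's revenue share would need to grow ~4.4x just to pull even.
print(auto_profit, energy_profit, auto_profit / energy_profit)
```

So the pivot is real but early: on these numbers, cars still pay most of the bills.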

The cars were always a B2C bootstrap play for Tesla, to build out the factories it needed to sell grid-scale batteries (and things like military UAV batteries) under large enterprise B2B contracts. Which is why Tesla is pushing the "car narrative" less and less over time, seeming to fade into B2C irrelevancy — all their marketing and sales is gradually pivoting to B2B outreach.

ratelimitsteve 3 days ago | parent | prev | next [-]

Bad angle shot: This thing where advertisers exploit the need to clarify ambiguity in order to smuggle in custom, private definitions of words that mean the opposite of the agreed-upon definitions of those same words is a problem. Calling something "full self-driving" when it doesn't drive by itself fully is lying, even if you put in the fine print that "full" means "not full" and "self-driving" means "not driving by itself".

yieldcrv 5 days ago | parent | prev | next [-]

The lesson here is to wait for a chill SEC and friendly DOJ before you recant your fraudulent claims, because then they won’t be found to be fraudulent

comice 5 days ago | parent [-]

Wait for them? or buy them?

RyanShook 5 days ago | parent | prev | next [-]

Looking forward to the class action on this one…

greyface- 5 days ago | parent [-]

Tesla has binding arbitration that prohibits class actions.

maxlin 5 days ago | parent | prev | next [-]

It needs to be known that Fred Lambert pushes out so much negative Tesla press that it's reasonable to say he's on a crusade. And not a very fact-based one.

Like with this. No, Tesla hasn't communicated any as such. Everyone knows FSD is late. But Robotaxi shows it is very meaningfully progressing towards true autonomy. And for example crushed the competition (not literally) in a recent very high-effort test in avoiding crashes on a highway with obstacles that were hard to read for almost all the other systems: https://www.youtube.com/watch?v=0xumyEf-WRI

FireBeyond 5 days ago | parent [-]

> But Robotaxi shows it is very meaningfully progressing towards true autonomy.

What? They literally just moved the in car supervisor from the passenger seat to the driver seat. That's not a vote of confidence.

And I don't think you can glean anything. There are fewer than 20 Robotaxis in Austin, and they spend their time giving rides to influencers so they can make YT videos where even they have scary moments.

iammjm 5 days ago | parent | prev | next [-]

This nazi-saluting manchild has been purposefully lying about self-driving for close to 10 years now, with self-driving coming "next year" every year. How is this legal and not false advertising?

paradox460 5 days ago | parent | prev | next [-]

One of the shower thoughts I've had is why don't we start equipping cars with UWB tech. UWB can identify itself, two UWB nodes can measure short range distances between each other (around 30m) with fairly decent accuracy and directionality.

Sure, it wouldn't replace any other sensing tech, but if my car has UWB and another car has UWB, they can telegraph where they are and what their intentions are a lot faster and in a "cleaner" manner than using a camera to watch the rear indicator for illumination
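The ranging part of that idea is standard two-way ranging, as used by existing UWB chips. A minimal sketch, assuming single-sided two-way ranging (SS-TWR); the function name and timing values here are illustrative, not a real chip API:

```python
# Sketch of single-sided two-way ranging (SS-TWR), the basic scheme UWB
# transceivers use to measure distance between two nodes. The initiator
# sends a pulse, the responder replies after a known turnaround delay,
# and the remaining round-trip time is pure time-of-flight.

C = 299_792_458.0  # speed of light, m/s

def ss_twr_distance(t_round_s: float, t_reply_s: float) -> float:
    """Distance from measured round-trip time minus the responder's
    known reply delay; time of flight is half the remainder."""
    tof = (t_round_s - t_reply_s) / 2.0
    return tof * C

# Illustrative check: 30 m separation is ~100 ns of one-way flight time.
tof = 30.0 / C          # one-way flight time, ~1e-7 s
t_reply = 200e-9        # responder's fixed turnaround delay (assumed)
t_round = 2 * tof + t_reply
print(round(ss_twr_distance(t_round, t_reply), 6))
```

Since the resolved distance scales with clock error times the reply delay, real deployments keep the turnaround short or use double-sided ranging; either way, sub-meter accuracy at ~30 m is plausible, which is what makes the car-to-car telegraphing idea interesting.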

AdmiralAsshat 5 days ago | parent | prev | next [-]

Kinda wish we as consumers had some way to fight back against this obvious bullshit, since lord knows the government won't do anything.

Like if a company comes out with a new transportation technology and calls it "teleportation", but in fact is just a glorified trebuchet, they shouldn't be allowed to use a generic term with a well-understood meaning fraudulently. But no, they'll just call it "Teleportation™" with a patented definition of their glorified trebuchet, and apparently that's fine and dandy.

I am still bitter about the hoverboard.

ChrisArchitect 5 days ago | parent | prev | next [-]

Earlier:

Tesla’s autonomous driving claims might be coming to an end [video]

https://news.ycombinator.com/item?id=45133607

dvh 5 days ago | parent | prev | next [-]

And stock is up $15

hedora 5 days ago | parent [-]

Gotta keep juicing it to get that $1T payout.

Am I the only one that noticed most of the targets are in nominal dollars, not inflation adjusted? Trump’s already prosecuting Fed leadership because they’re refusing to print money for him. Elon’s worked with him enough to understand where our monetary policy is headed.

asdff 5 days ago | parent | prev | next [-]

What I don't understand about this is that in my experience being driven around in friends' Teslas, it's already there. It really seems like legalese vs. technical capability. The damn thing can drive with no input and even find a parking spot and park itself. Where are we even moving the goalposts at this point? Because there have been some accidents, it's not valid? The question is how that compares to the accident rate of human drivers, not whether there should be an expectation of zero accidents ever.

AlotOfReading 5 days ago | parent | next [-]

The word "driving" has multiple, partially overlapping meanings. You're using it in a very informal sense to mean "I don't have to touch the controls much". Power to you for using whatever definitions you feel like.

Other people, most importantly your local driving laws, use driving as a technical term for the tasks done by the entity that's ultimately responsible for the safety of the entire system. The human remains the driver under this definition even with FSD engaged; in a Waymo, they are not. If you're interested in specific technical verbiage, look at SAE J3016 (the infamous "levels" standard), which many vehicle codes incorporate.

One of the critical differences between your informal definition and the technical one is whether you can stop paying attention to the road and remain safe. With your definition, it's possible to have a system where you're not "driving", but you still have a responsibility to react instantaneously to dangerous road events after hours of inaction. Very few humans can reliably do that. It's not a great way to communicate the responsibilities people have in a safety-critical task they perform every day.

breve 5 days ago | parent | prev [-]

Tesla set their own goal posts.

In 2016 Tesla claimed every Tesla car being produced had "the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver": https://web.archive.org/web/20161020091022/https://tesla.com...

It was a lie then and remains a lie now.

amanaplanacanal 5 days ago | parent | prev | next [-]

I wonder if this change came from the legal department after their loss in the lawsuit over that poor woman who was killed.

GUNHED_158 5 days ago | parent | prev | next [-]

The link contains malicious scripts.

diebeforei485 4 days ago | parent | prev | next [-]

This article makes no sense to me. They aren't changing the meaning of anything for consumers, it's only defining it for the purpose of the compensation milestone.

gcanyon 5 days ago | parent | prev | next [-]

Honest question: did Tesla in the past promise that FSD would be unsupervised? My based-on-nothing memory is that they weren't promising that you wouldn't have to sit in the driver's seat, or that your steering wheel would collect dust. Arguing against myself: they did talk about Teslas going off to park themselves and returning, but that's a fairly limited use case. Maybe in the robotaxi descriptions?

My memory was more that you'd be able to get into (the driver's seat of) your Tesla in downtown Los Angeles, tell it you want to go to the Paris hotel in Vegas, and expect generally not to have to do anything to get there. But not guaranteed nothing.

toast0 5 days ago | parent | next [-]

Is Full just a catch word for actually not full now?

Full Speed USB is 12 Mbps; nobody wants a Full Speed USB data transfer.

Full Self Driving requires supervision. Clearly, even Tesla understands the implication of their name, or they wouldn't have renamed it Full Self Driving Supervised... They should probably have been calling it Supervised Self Driving since the beginning.

abduhl 5 days ago | parent | prev | next [-]

The [2016 Tesla promotional] video carries a tagline saying: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

https://www.reuters.com/technology/tesla-video-promoting-sel...

scoopertrooper 5 days ago | parent | prev | next [-]

Musk promised unsupervised driving being right around the corner so many times it became a joke.

https://youtu.be/B4rdISpXigM

herbturbo 5 days ago | parent | prev [-]

In 2016 Musk said you’d be able to drive from LA to NYC without touching the steering wheel once “within 2 years”. He’s been making untrue statements about Tesla FSD for a decade.

jaggs 5 days ago | parent | prev | next [-]

One problem might be that American driving is not exactly... well great, is it? Roads are generally too straight and driving tests too soft. And for some weird reason, many US drivers seem to have a poor sense of situational awareness.

The result is it looks like many drivers are unaware of the benefits of defensive driving. Take that all into account and safe 'full self driving' may be tricky to achieve?

mlindner 5 days ago | parent | prev | next [-]

The title is rather misleading. They haven't given up on promise of autonomy...

rickdg 5 days ago | parent | prev | next [-]

I guess you can either go full waymo or full comma. The rest is just hype.

Ancalagon 4 days ago | parent | prev | next [-]

Guess living and working in the factory ain’t working out so well

Animats 5 days ago | parent | prev | next [-]

So what does it mean for Tesla's "Robotaxi"? Is that being shut down?

It's pathetic. The Austin Robotaxi demo had a "safety monitor" in the front passenger seat, with an emergency stop button. But there were failures where the safety monitor had to stop the vehicle, get out, walk around the car, get into the driver's seat, and drive manually. So now the "safety monitor" sits in the driver's seat.[1] It's just Uber now.

Do you have to tip the "safety monitor"?

And for this, Musk wants the biggest pay package in history?

[1] https://electrek.co/2025/09/03/tesla-moves-robotaxi-safety-m...

diebeforei485 4 days ago | parent [-]

The article is wrong. What happened is that they added some highway-capable ridehail vehicles, and only those vehicles have the safety person in the driver's seat. Frederic (the author) lives in Canada; he doesn't have access to any recent version of FSD.

mensetmanusman 4 days ago | parent | prev | next [-]

Our current infrastructure isn’t compatible with lidar. We were consulted to fix it, but governments have no idea how to approach this problem so it won’t happen for years.

jgalt212 5 days ago | parent | prev | next [-]

Given this move, like the rest of TSLA's inane investor base, I wholeheartedly support the potential $1 trillion pay package for Musk

MagicMoonlight 5 days ago | parent | prev | next [-]

There needs to be a class-action against Tesla. It’s blatant fraud.

luis_cho 5 days ago | parent | prev | next [-]

For a long time I've thought full self-driving doesn't make economic sense. Would this hurt car sales in the long term?

ares623 5 days ago | parent | prev | next [-]

Most Honest Company (Sarcasm)

jacquesm 5 days ago | parent [-]

Fish rots from the head.

aurizon 5 days ago | parent | prev | next [-]

It was a fool's game from the start, with only negative aspects = what could possibly go wrong?

utyop22 5 days ago | parent [-]

Tesla's share price is all based on the Greater Fool Theory in the short run.

In the long run some of those promises might materialise. But who cares! Portfolio managers and retail investors want some juicy returns - share price volatility is welcomed.

sidcool 4 days ago | parent | prev | next [-]

Electrek has been anti Tesla for a long time now.

yencabulator 5 days ago | parent | prev | next [-]

Full (Limited)

olyellybelly 5 days ago | parent | prev | next [-]

Shock! Horror!

aamargulies 5 days ago | parent | prev | next [-]

I knew that FSD was nonsense when I tried to use Tesla's autopark feature under optimal conditions and it failed to park the car satisfactorily.

tempodox 5 days ago | parent | prev | next [-]

If you can’t reach the goal, move the goal posts!

moomin 5 days ago | parent | prev | next [-]

My 1993 Nissan has FSD. I can fully drive myself anywhere.

freerobby 5 days ago | parent | prev | next [-]

This is clickbait from a publication that's had it out for Tesla for nearly a decade.

Tesla is pivoting messaging toward what the car can do today. You can believe that FSD will deliver L4 autonomy to owners or not -- I'm not wading into that -- but this updated web site copy does not change the promises they've made prior owners, and Tesla has not walked back those promises.

The most obvious tell of this is the unsupervised program in operation right now in Austin.

qwerpy 5 days ago | parent | next [-]

Marketing choice of words aside, it's already really good now to the point that it probably does 95% of my driving. Once in a while it chooses the wrong lane and very rarely I will have to intervene, but it's always getting better. If they just called it "Advanced Driver Assist" or something, and politics weren't such an emotional trigger, it would be hailed as a huge achievement.

an0malous 5 days ago | parent | prev | next [-]

Great spin job. They didn’t lie, they’re just “pivoting their messaging”

panarky 5 days ago | parent | prev [-]

Can you find any statement in the article that is false?

resfirestar 5 days ago | parent | prev [-]

I don't read the article (besides the clickbait headline and the author's "take") as Tesla "giving up". No marketing is changing, no plans for taxi services are changing. This is about the company's famously captured board giving their beloved CEO flexibility on how to meet their ambitious-sounding targets, by using vague language in the definitions. This way if Tesla fails to hit 10 million $100/month FSD subscriptions, they could conceivably come up with a cheaper more limited subscription and get Elon his pay.