AlotOfReading 3 hours ago

This is a common misconception. People tend to think driving is controlling the steering and pedals, so if FSD does those things it must be driving.

It's not. Driving is whatever has ultimate responsibility for the vehicle and its occupants. If a cop pulls you over while FSD is enabled, it's not Tesla who's paying the ticket. If FSD has an issue, you're the driver who has to respond.

Think of FSD as a very nice cruise control. You're still driving, even if you aren't touching the wheel.

tencentshill 10 minutes ago

The bottom line is, no one else is even remotely close to that experience for the driver, liable or not. Probably with good reason, as every other car company actually listens to its lawyers.

pdpi 2 hours ago

Sort of like how programming isn't the same as writing code — it also involves a bunch of other things, like all the design and planning work.

zadikian 2 hours ago

It's a common misconception because the thing is called "full self driving."

charcircuit an hour ago

So if the law says that a human in the car has to be responsible, then it is impossible for a self-driving car to exist. I don't think tying the definition to legal liability is right.

I don't see why self-driving couldn't just be steering and pedals. It would be pretty limiting, but the car would at least be able to drive itself in a circle.

Retric an hour ago

No. The law allows passengers in self-driving taxis not to be responsible, including taxis operated by Tesla.

Here, Tesla makes it clear to people who turn on "Full Self-Driving" that the driver must maintain supervision, and thus responsibility. As such, it's Tesla's choice that they aren't selling self-driving cars.

It wouldn't be such a big deal if some random engineer said they'd eventually do X, but when it's the CEO repeatedly saying the same thing across many public appearances, that's as binding as a Super Bowl advertisement.