loloquwowndueo 3 hours ago

By that logic it’s ok if the car slams itself against a concrete wall - just because it failed to stop in time doesn’t mean it wasn’t driving itself.

Self driving cars are supposed to obey the same rules as human drivers.

roenxi 2 hours ago | parent | next [-]

Well ... yes. By that logic it is the case. It applies to humans too - if a human slams their car into a concrete wall then the human was still driving the car. They did a bad job of it, but they were in fact driving.

A car being driven autonomously doesn't imply much about the quality of that driving. They're still going to make bad decisions and have accidents, just like humans do (a friend of mine died slamming their car into a tree). There is probably some minimum where we'd say that it isn't really driving because it can't do anything right, but modern self driving systems are past that.

RajT88 3 hours ago | parent | prev | next [-]

Tesla FSD is vulnerable to RoadRunner and Wile E. Coyote style tricks.

iknowstuff 18 minutes ago | parent | next [-]

It's not. That video was using Autopilot, not FSD, and subsequent videos using the actual current FSD were fine.

qingcharles 3 hours ago | parent | prev | next [-]

Fortunately the ACME products are flawed and subject to their own litigation, see e.g. Coyote vs. ACME (2026).

charcircuit an hour ago | parent | prev [-]

Both statements can be true. Whether a car is human-driven or self-driving is a separate classification from whether the driving is good or bad. Humans can slam into a wall too.