Workaccount2 9 hours ago

The problem is that if it drives well for 30,000 miles (unsupervised) on residential roads before steamrolling little Billy on his bike, you will get a deluge of people who swear the system is excellent.

But when you incorporate that tech into a fleet doing 100k residential miles a week with no supervisor, you're mowing down 12 kids a month.
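To spell out the back-of-envelope math (the one-incident-per-30,000-miles rate and the 100k miles a week are the hypothetical numbers above, not real data), a quick sketch in Python:

    # Scaling the hypothetical rate above to a fleet; assumes ~4.33 weeks per month
    miles_per_incident = 30_000      # one serious incident per 30k unsupervised miles
    fleet_miles_per_week = 100_000   # hypothetical fleet mileage
    incidents_per_month = fleet_miles_per_week / miles_per_incident * 4.33
    print(round(incidents_per_month, 1))  # ~14.4, i.e. roughly a dozen a month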

ajross 9 hours ago | parent [-]

How many kids has it steamrollered? Obviously not 12 a month! Seems like this is an argument to be had with real numbers, no?

Workaccount2 8 hours ago | parent [-]

Zero, because there isn't, and hasn't been, a single unsupervised Tesla on the road.

There are no real numbers, because there are no real self-driving Teslas.

ajross 8 hours ago | parent [-]

So... that was exactly the point upthread. You're making a semantic argument over the proper definition for the word "real" when applied to autonomous vehicle systems. Nothing in this argument is actionable in any way. You can't conjure real dead kids, so you need to describe hypothetical ones. That's... yeah.

Nonetheless, our cars drive us around anyway. Neither they nor we actually care about hypothetical steamrollered kids.

Workaccount2 8 hours ago | parent [-]

The argument is that Tesla needs to be doing hundreds of thousands of miles without intervention to be trusted for robotaxis.

Most people using FSD don't come close to the mileage needed to get a real picture of its safety level. If a Tesla robotaxi kills a kid, Tesla is done, and there will be no coming back.

So Tesla actually needs millions of miles without critical intervention before they can confidently let these things out on the streets en masse.

A whole Tesla fanboy meetup collectively will not have enough FSD miles to see something like that, but a robotaxi fleet will encounter it within a year.
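To illustrate why "millions of miles" is the right order of magnitude, here's a rough rule-of-three sketch (my own back-of-envelope statistics, not anything Tesla has published; it assumes critical interventions are rare, independent events):

    # Rule of three: if zero critical interventions are seen over n miles, the
    # ~95% upper confidence bound on the intervention rate is roughly 3/n.
    def miles_needed(target_miles_per_intervention: float) -> float:
        """Intervention-free miles needed to support that rate at ~95% confidence."""
        return 3 * target_miles_per_intervention

    print(miles_needed(100_000))    # 300,000 miles for the "hundreds of thousands" bar
    print(miles_needed(1_000_000))  # 3,000,000 miles for a one-per-million-miles claim

No individual owner, or even a meetup's worth of them, logs mileage on that scale; a fleet doing 100k miles a week gets there within a year.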

spankalee 7 hours ago | parent | next [-]

I don't think Tesla will be done when they kill a kid. FSD has already killed people: https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...

ajross 5 hours ago | parent | prev [-]

> Tesla actually needs millions of miles without critical intervention

So... agreed. I think that sounds like it's in the right ballpark.

Here's the thing though: your whole argument (sort of a para-freakout, really) hinges on this evidence not existing.

It's true that citing numbers from supervised cars isn't the same thing. It's not true to argue that it not being the same thing is (0) not at least a somewhat reasonable proxy for the evidence you want to see, (1) evidence that it doesn't or can't exist, (2) evidence for the opposite case (you seem to be claiming that the fact that it's supervised means that it must be), and in particular (3) evidence for suppression of contrary evidence, as some of your more conspiracy-leaning comments seem to imply.

Isn't the Occam's razor explanation here that, yeah, the car looks pretty damn safe, as shown by billions of miles of travel? Why go to the mattresses to argue against something that seems like common sense to me?

judahmeek an hour ago | parent [-]

Occam's Razor is a principle for comparing explanations, not for making predictions.

> It's not true to argue that it not being the same thing is (0) not at least a somewhat reasonable proxy for the evidence you want to see, (1) evidence that it doesn't or can't exist, (2) evidence for the opposite case (you seem to be claiming that the fact that it's supervised means that it must be), and in particular (3) evidence for suppression of contrary evidence, as some of your more conspiracy-leaning comments seem to imply.

Actually, (0) is true. Numbers from supervised FSD are not a reasonable proxy for unsupervised FSD, especially since accidents may be occurring immediately after FSD disengages.

(1) is also true, because if the evidence did exist, Tesla would already have publicized it.

(2) is also true, because if Tesla thought they could deploy unsupervised cars the way Waymo does, they would already have done so.

As for (3), if Tesla had a reputation for transparency and honesty, they could have provided additional accident data showing that accidents are not occurring at significant rates right after FSD disengages.