galangalalgol 3 hours ago

The reasoning is cynical but sound. If the system uses only the sensing modes people have, it will make the mistakes people do. If a jury thinks "well, I could have made that mistake too!" you win. It doesn't matter if your system has fewer accidents overall if some of the failure modes are different from human ones, because the jury will think "how could it not figure that out?"

bluGill 2 hours ago | parent | next [-]

Until a lawyer points out that other cars see that. My car already has various sensors, and in manual driving it sounds alarms if there is a danger I seem not to have noticed. (There are false alarms - but most of the time I did notice, and probably should have left more safety margin even though I wouldn't have hit anything.)

Also, regulators gather statistics, and if cars with some feature do better they will mandate it.

estearum 2 hours ago | parent | prev | next [-]

I don't think that's the reasoning.

The reasoning was simply that LIDAR was significantly more expensive than cameras (and was incorrectly predicted to always remain so), and that should hypothetically be fine because, well, humans drive with only two eyes.

Musk miscalculated on 1) cost reduction in LIDAR and 2) how incredible the human brain is compared to computers.

Having similar sensors certainly doesn't guarantee your accidents look the same, so I don't think your logic is even internally sound.

seanmcdirmid an hour ago | parent | next [-]

Sensor fusion is also hard to get right: since you still need cameras, you have to fuse the two information streams. That's mainly a software problem, and companies like Waymo have done it, but Tesla was having trouble with it earlier, and if you don't do it right, your self-driving system can be less reliable.
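To make the fusion point concrete, here's a minimal sketch (illustrative only - not Waymo's or Tesla's actual pipeline) of the simplest case: combining a noisy camera range estimate with a tighter lidar range estimate by inverse-variance weighting, which is the core of a Kalman-style measurement update. The numbers are made up for the example.

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Fuse two noisy estimates of the same quantity by
    inverse-variance weighting; the more certain sensor dominates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera depth is noisy (say +/- 2 m std dev), lidar is tight (+/- 0.1 m):
dist, var = fuse(31.0, 2.0 ** 2, 30.0, 0.1 ** 2)
print(round(dist, 2))  # fused estimate hugs the lidar reading
```

The hard part in practice isn't this arithmetic - it's deciding what to do when the sensors disagree about whether an object exists at all (camera sees nothing, lidar sees a return, or vice versa), which is where fusion bugs make the combined system less reliable than either sensor alone.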

klabb3 an hour ago | parent | prev | next [-]

> Musk miscalculated on 1) cost reduction in LIDAR and 2) how incredible the human brain is compared to computers.

And, less excusably, ignorant of how incredible human eyes are compared to small-sensor cameras - in particular, high dynamic range in low light with fast motion. Every photographer knows this.

venusenvy47 40 minutes ago | parent [-]

And also ignorant about how those two eyes have binocular vision, adjustable positions, and can look in multiple mirrors for full spatial awareness.

cobbzilla 7 minutes ago | parent [-]

There are good arguments, but this isn't one. Many humans (like me!) drive fine without binocular vision. And the cars have many cameras all around, with wide-angle lenses watching everything all the time, while a human can only focus in one direction at a time.

cyanydeez 2 hours ago | parent | prev | next [-]

There certainly is a pretty ongoing miscalculation regarding human intelligence and, consequently, empathy.

mytailorisrich 2 hours ago | parent | prev | next [-]

IMHO not using lidar sounds like a premature optimisation and an added complication, with a degree of hubris.

This is a difficult problem to solve, and perhaps the pragmatic approach was/is to make your life as simple as possible to help get to a fully working solution, even if it's more expensive; then you can reduce cost and optimise.

lazide an hour ago | parent | prev | next [-]

Eh, I think ‘miscalculation’ might be giving too much credit about good intentions.

He wanted (needed?) to get on the self-driving hype train to pump up the stock price. He knew that at the time there was zero chance they could sell it at the price point lidar required - or even other effective sensors (like radar) - so he sold it anyway at a price point people would buy, even though it was never plausibly going to work at the level being promised.

There is a word for that. But I’m sure there are many lawyers that will say it was ‘mere fluffery’ or the like. And I’m sure he’ll get away with it, because more than enough people are complicit in the mess.

Miscalculation assumes there was a mistake somewhere, but as near as I can tell, it is playing out as any reasonable person expected it to, given what was known at the time.

estearum an hour ago | parent [-]

I think Musk is really not as smart as he thinks he is and this specific thing was probably an earnest mistake. Lots of other fraudulent stuff going on though of course!


small_model 2 hours ago | parent | prev | next [-]

Very recent issue with Waymo: https://dmnews.co.uk/waymo-robotaxi-spotted-unable-to-cross-.... This is 17 years after they bet the farm on LIDAR, with no sign it's ever going to be cost-effective, or better than multiple cameras with millisecond reactions and 360-degree coverage that never get tired, drunk, or distracted - plus other cheaper sensors and NNs trained on billions of real-world data points.

jeltz an hour ago | parent | next [-]

Tesla does not handle rain well either. This is not a LIDAR problem, it is a problem with self driving cars in general.

RobotToaster an hour ago | parent | prev | next [-]

That's an example of it failing safe. I'd rather it did that than drive me into a sinkhole because it thought it was a puddle.

small_model an hour ago | parent [-]

Ok, so Waymo is useless in the rain then - kind of limiting. But at least the 0.000000000001% of the time it actually is a sinkhole, you won't damage the bumper.

criley2 an hour ago | parent [-]

I'd rather a Waymo be useless in the rain rather than a Tesla be actively dangerous and likely to kill me.

Tesla ""autopilot"" fatalities: 65

Waymo fatalities: 0

seanmcdirmid an hour ago | parent [-]

Autopilot isn’t full self driving (FSD); most cars these days ship with smart cruise control (which is basically what Autopilot is). Do you have fatality statistics for FSD?

If we are just talking about smart cruise control, most cars are using cameras and radar, not lidar yet. But Tesla is unusual in that it doesn’t even use radar for its smart cruise control implementation, which could make it less safe than other new cars with smart cruise control. Either way, Autopilot was never competing with Waymo.

WarmWash an hour ago | parent | prev | next [-]

There is also a report from the same flooding in LA of a Waymo driving into a flooded road and getting stuck.

They might have flipped a switch after that, causing this.

veltas an hour ago | parent | prev | next [-]

Dude, that's not a 'puddle' as the article claims; that's a body of water where it's not even visually obvious whether it's safe to drive through. Maybe I'm a bad driver, but I'd hesitate to drive through that in a small car, either.

idiotsecant an hour ago | parent | prev [-]

>A vehicle got stuck trying to figure out an obstacle so sensors with less information are better than sensors with more information.

xnx an hour ago | parent | prev | next [-]

This is a new and flawed rationale that I haven't heard before. Tesla cameras are worse (lower resolution, sensitivity, and dynamic range) than human eyes and don't have "ears" (microphones).

lazide an hour ago | parent | prev [-]

Pretty hard to do if your whole selling point is ‘better and safer than human’ however?