| ▲ | zelphirkalt 2 hours ago |
| Since lidar has distance information and cameras do not, it was always a ridiculous idea by a certain company to use cameras only. Lidar using cars are going to replace at least the ones that don't make use of this obvious answer to obstacle detection challenges. |
|
| ▲ | galangalalgol an hour ago | parent | next [-] |
| The reasoning is cynical but sound. If the system uses only the sensing modes people have, it will make the mistakes people do. If a jury thinks "well, I could have made that mistake too!" you win. It doesn't matter if your system has fewer accidents if some of the failure modes are different from human ones, because the jury will think "how could it not figure that out?" |
| |
| ▲ | bluGill 32 minutes ago | parent | next [-] | | Until a lawyer points out that other cars see that. My car already has various sensors and, in manual driving, sounds alarms if there is a danger I seem not to have noticed. (There are false alarms - but most of the time I did notice and probably should have left more safety margin even though I wouldn't have hit it.) Also, regulators gather statistics, and if cars with something do better they will mandate it. | |
| ▲ | small_model 23 minutes ago | parent | prev | next [-] | | Very recent issue with Waymo https://dmnews.co.uk/waymo-robotaxi-spotted-unable-to-cross-.... This is 17 years after they bet the farm on LIDAR, with no signs it's ever going to be cost effective, or that it's better than multiple cameras with millisecond reaction and 360-degree coverage, that never get tired, drunk, or distracted, plus other cheaper sensors and a NN trained on billions of miles of real-world data. | | |
| ▲ | RobotToaster 6 minutes ago | parent | next [-] | | That's an example of it failing safe. I'd rather it did that than drive me into a sinkhole because it thought it was a puddle. | |
| ▲ | idiotsecant 8 minutes ago | parent | prev [-] | | >A vehicle got stuck trying to figure out an obstacle so sensors with less information are better than sensors with more information. |
| |
| ▲ | estearum an hour ago | parent | prev [-] | | I don't think that's the reasoning. The reasoning was simply that LIDAR was (and was incorrectly predicted to always be) significantly more expensive than cameras, and hypothetically that should be fine because, well, humans drive with only two eyes. Musk miscalculated on 1) cost reduction in LIDAR and 2) how incredible the human brain is compared to computers. Having similar sensors certainly doesn't guarantee your accidents look the same, so I don't think your logic is even internally sound. | | |
| ▲ | klabb3 13 minutes ago | parent | next [-] | | > Musk miscalculated on 1) cost reduction in LIDAR and 2) how incredible the human brain is compared to computers. And, less excusably, he was ignorant of how incredible human eyes are compared to small-sensor cameras. In particular high dynamic range in low light, with fast motion. Every photographer knows this. | |
| ▲ | mytailorisrich 31 minutes ago | parent | prev | next [-] | | IMHO not using lidars sounds like a premature optimisation and a complication, with a level of hubris. This is a difficult problem to solve and perhaps a pragmatic approach was/is to make your life as simple as possible to help get to a fully working solution, even if more expensive, then you can improve cost and optimise. | |
| ▲ | cyanydeez an hour ago | parent | prev | next [-] | | There certainly is a pretty ongoing miscalculation regarding human intelligence, and consequently, empathy. | |
| ▲ | szundi an hour ago | parent | prev [-] | | [dead] |
|
|
|
| ▲ | nlitened 20 minutes ago | parent | prev | next [-] |
| As I understand it, lidar doesn't work well in rain/snow/fog. So in the real world, where you have limited resources (research and production investment, people's talent, AI training time and dataset breadth, power consumption) to distribute between two systems (vision and lidar), and one system would contradict the other in dangerous driving conditions, it's smarter to just max out vision and ignore lidar altogether. |
| |
| ▲ | zozbot234 8 minutes ago | parent | next [-] | | Why does this matter? You have to slow down in rain/snow/fog anyway, so only having cameras available doesn't hurt you all that much. But then in clear weather lidar can only help. | |
| ▲ | RobotToaster 12 minutes ago | parent | prev | next [-] | | > lidars don't work well in rain/snow/fog. Neither do cameras, or eyeballs. | |
| ▲ | zemvpferreira 16 minutes ago | parent | prev | next [-] | | Limited resources? Billions per year are being thrown at the base technology. We have the capital deployed to exhaust every path ten times over. | |
| ▲ | heisenbit 13 minutes ago | parent | prev | next [-] | | The Swiss cheese model would like to disagree. | |
| ▲ | idiotsecant 4 minutes ago | parent | prev [-] | | This is silly. Cameras are cheap. Have both. Sensors that behave differently in different conditions are not an exotic new problem. The Kalman filter has existed for about a billion years, and machine learning filters do an even better job. |
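The fusion idea the comment gestures at can be sketched very simply. This is a hedged, minimal illustration of inverse-variance weighting, the static core of a Kalman update, for combining one camera and one lidar distance estimate; the variances are made-up illustrative numbers, not real sensor specs.

```python
def fuse(z_cam: float, var_cam: float, z_lidar: float, var_lidar: float):
    """Fuse two noisy measurements of the same distance by
    inverse-variance weighting; return estimate and its variance."""
    w_cam = 1.0 / var_cam
    w_lidar = 1.0 / var_lidar
    fused = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    fused_var = 1.0 / (w_cam + w_lidar)
    return fused, fused_var

# Camera says 50 m with high uncertainty, lidar says 48 m with low uncertainty:
est, var = fuse(50.0, 4.0, 48.0, 0.04)
# The fused estimate lands close to the more trusted lidar reading,
# and the fused variance is smaller than either input's variance.
```

The point, in the comment's spirit, is that a sensor being unreliable in *some* conditions just means its variance grows there and its weight shrinks; it is not a reason to discard it entirely.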
|
|
| ▲ | mgoetzke 15 minutes ago | parent | prev | next [-] |
| Considering that cameras can produce reliable enough distance measurements AND also handle all the color perception needed for legally driving on roads, it was always a ridiculous idea by a certain set of people that lidar is necessary. |
| |
| ▲ | tsimionescu 6 minutes ago | parent | next [-] | | No, cameras cannot create reliable distance measurements in real-world conditions. Parallax is not a great way to measure distance for fast, unpredictably moving objects (such as cars on the road). And dirt or misalignment can significantly reduce accuracy compared to lab conditions. Note that humans do not rely strictly on our eyes as cameras to measure distances. There is a huge amount of inference about the world based on our internal world models that goes into vision. For example, if you put us in a false-perspective or otherwise highly artificial environment, our visual acuity goes down significantly; conversely, people with a single eye (so no parallax-based measurement ability) still have quite decent depth perception compared to what you'd naively expect. Not to mention, our eyes are kept very clean, and maintain their alignment to a very high degree of precision. | |
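The misalignment point can be made concrete with the standard stereo relation depth = focal_length x baseline / disparity. A minimal sketch with hypothetical numbers (a 1000 px focal length and a 30 cm baseline are assumptions, not any particular car's specs) shows how a sub-pixel calibration error blows up at range:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: depth = f * B / d."""
    return focal_px * baseline_m / disparity_px

f_px, baseline = 1000.0, 0.3      # hypothetical camera parameters
z_true = 60.0                      # a car 60 m ahead
disp = f_px * baseline / z_true    # only 5 px of disparity at that range

# A half-pixel disparity bias (tiny mechanical misalignment, heat, vibration):
z_biased = stereo_depth(f_px, baseline, disp - 0.5)
# z_biased is roughly 66.7 m: a 0.5 px error became an ~11% range error,
# and the error grows roughly quadratically with distance.
```

This is why stereo depth that looks fine on a bench can degrade badly on a vehicle, which is the comment's "dirt or misalignment" caveat in numbers.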
| ▲ | throwa356262 4 minutes ago | parent | prev [-] | | There is plenty of evidence showing that cameras alone are not safe enough, and even Tesla has realized that removing radar to save cost was a mistake. |
|
|
| ▲ | spyder 41 minutes ago | parent | prev | next [-] |
| Yea, even if they could match human-level stereo depth perception with AI, why would they say "no" to superhuman lidar capabilities? Cost could be a somewhat acceptable answer if there weren't problems with the camera-only approach, but there are still examples of silly failures of it.
And if I remember correctly they also removed their other superhuman sensor, the radar, in their newer models - the one which in certain conditions was capable of sensing multiple cars ahead by bouncing the signal underneath other cars. |
|
| ▲ | wasmainiac an hour ago | parent | prev | next [-] |
| Just say Tesla, why censor yourself. |
| |
| ▲ | zelphirkalt 16 minutes ago | parent [-] | | I have a suspicion here on HN: when criticizing big tech, especially Google and FB, at a certain time of day a specific cohort comes online and downvotes. Suspiciously, that is the time when one would expect people in the US to start working or come online. Either fanboys, employees, or an organized group of users trying to silence big-tech criticism. I have no proof of course, and it might be coincidence, or just a difference of mindset between US and European citizens. It has happened a few times already, and to me it looks sus. But if they actually read and don't just ctrl+f <company name>, then not writing the company name but hinting at it in an obvious way doesn't help either. |
|
|
| ▲ | Someone 32 minutes ago | parent | prev [-] |
| > Since lidar has distance information and cameras do not, it was always a ridiculous idea by a certain company to use cameras only Human eyes do not have distance information, either, but derive it well enough from spatial (by ‘comparing’ inputs from 2 eyes) or temporal parallax (by ‘comparing’ inputs from one eye at different points in time) to drive cars. One can also argue that detecting absolute distance isn’t necessary to drive a car. Time to-contact may be more useful. Even only detecting “change in bearing” can be sufficient to avoid collision (https://eoceanic.com/sailing/tips/27/179/how_to_tell_if_you_...) Having said that, LiDAR works better than vision in mild fog, and if it’s possible to add a decent absolute distance sensor for little extra cost, why wouldn’t you? |
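The "change in bearing" rule from the linked sailing tip is easy to sketch: if another craft's compass bearing from you stays constant while the range closes, you are on a collision course. This is a hedged illustration with made-up positions, not anything from the article:

```python
import math

def bearing_deg(own: tuple, other: tuple) -> float:
    """Compass-style bearing of `other` as seen from `own` (0 = north)."""
    dx, dy = other[0] - own[0], other[1] - own[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

# Two snapshots of two craft on straight, converging tracks (positions in m):
own_t0, other_t0 = (0.0, 0.0), (100.0, 100.0)
own_t1, other_t1 = (0.0, 10.0), (90.0, 100.0)

b0 = bearing_deg(own_t0, other_t0)
b1 = bearing_deg(own_t1, other_t1)
r0 = math.hypot(other_t0[0] - own_t0[0], other_t0[1] - own_t0[1])
r1 = math.hypot(other_t1[0] - own_t1[0], other_t1[1] - own_t1[1])

# Bearing unchanged (both 45 degrees) while range decreased:
# constant bearing + closing range = collision course.
```

Note this needs no absolute distance at all, only relative direction over time, which is the comment's point.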
| |
| ▲ | tsimionescu 2 minutes ago | parent | next [-] | | Human/animal vision uses way more than parallax to judge distances and bearings - it uses a world model that evolved over millions of years to model the environment. That's why we can get excellent 3D images from a 2D screen, and also why our depth perception can be easily tricked with objects of unexpected size. Put a human or animal in an abstract environment with no shadows and no familiar objects, and you'll see that depth perception based solely on parallax is actually very bad. | |
| ▲ | idiotsecant 3 minutes ago | parent | prev | next [-] | | Let me know when you have a camera package with human eye equivalency. | |
| ▲ | dumbfounder 20 minutes ago | parent | prev | next [-] | | I don’t like the comparison between cars and humans. Humans don’t travel around at 100mph in packs of other humans. Why not use every sensor type at our disposal if it gives us more info to make decisions? Yes, I understand it’s more complicated, but we figure stuff out. | |
| ▲ | larsnystrom 18 minutes ago | parent | prev [-] | | Human eyes are much better than cameras at dealing with dynamic range. They’re also attached to a super-computer which has been continuously trained for many years to determine distances and classify objects. |
|