| ▲ | chippiewill 5 days ago |
| As someone who worked in this space, you are absolutely right, but also kind of wrong - at least in my opinion. The cold hard truth is that LIDARs are a crutch: they're not strictly necessary. We know this because humans can drive without a LIDAR. However, they are a super useful crutch. They give you super high positional accuracy (something that's not always easy to estimate in a vision-only system). Radars are also a super useful crutch because they give really good radial velocity. (Little anecdote: when we finally got the radars working properly at work, it made a massive difference to our car's ability to follow other cars comfortably with adaptive cruise control, ACC.) Yes machine learning vision systems hallucinate, but so do humans. The trick for Tesla would be to get it good enough that it hallucinates less than humans do (they're nowhere near yet - humans don't hallucinate very often). It's also worth adding that, last I checked, the state of the art for object detection is early fusion, where you chuck the LIDAR and radar point clouds into a neural net along with the camera input, so it's not like you'd necessarily have classical-method guardrails from the LIDAR anyway. Anyway, I don't think Tesla were wrong to not use LIDAR - they had good reasons to not go down that route. They were excessively expensive and the old style spinning LIDARs were not robust. You could not have sold them on a production car in 2018. Vision systems were improving a lot back then, so the idea that you could have FSD on vision alone was plausible. |
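To make the "early fusion" point concrete, here is a minimal sketch of the idea: project the LIDAR point cloud into the camera frame as a sparse depth channel and hand everything to a single network. The intrinsics, image size, and toy CNN below are hypothetical and purely illustrative, not any vendor's pipeline.

```python
# Illustrative sketch of early fusion: project a LIDAR point cloud into the camera
# frame as a sparse depth channel, then feed RGB + depth into a single network.
# The intrinsics matrix K, image size, and toy CNN are hypothetical.
import numpy as np
import torch
import torch.nn as nn

def lidar_to_depth_image(points_xyz, K, height, width):
    """Project LIDAR points (N, 3), already in the camera frame, to a depth image."""
    depth = np.zeros((height, width), dtype=np.float32)
    pts = points_xyz[points_xyz[:, 2] > 0.1]             # keep points in front of the camera
    uv = (K @ pts.T).T                                    # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    depth[v[ok], u[ok]] = pts[ok, 2]                      # sparse depth in metres
    return depth

class EarlyFusionNet(nn.Module):
    """Toy detector backbone that consumes RGB + depth as one 4-channel input."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(4, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, rgb, depth):
        # Fusion happens before any learned processing: one tensor, one network,
        # so there is no separate classical LIDAR pipeline acting as a guardrail.
        x = torch.cat([rgb, depth.unsqueeze(1)], dim=1)   # (B, 3+1, H, W)
        return self.backbone(x)
```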
|
| ▲ | raincole 5 days ago | parent | next [-] |
| > The cold hard truth is that LIDARs are a crutch The hard truth is there is no reason to limit machines to only the tools humans are biologically born with. Cars always have crutches that humans don't possess. For example, wheels. |
| |
| ▲ | dcchambers 5 days ago | parent | next [-] | | Exactly. In a true self-driving utopia, all of the cars are using multiple methods to observe the road and drive (vision, lidar, GPS, etc) AND they are all communicating with each other silently, constantly, about their intentions and status. Why limit cars to what humans can do? | |
| ▲ | mensetmanusman 5 days ago | parent | prev | next [-] | | The hard truth is you are balancing cost benefit curves. | |
| ▲ | daveguy 5 days ago | parent | prev | next [-] | | The "lidar is a crutch" excuse is such a fraud. Musk is doing it so he can make more money, because it's cheaper. That's it. Just another sociopath billionaire cutting corners at the expense of safety. The reason this is clear is because, except for a brief period in late 2022, Teslas have included some combination of radar and ultrasonic sensors. [0] [0] https://en.m.wikipedia.org/wiki/Tesla_Autopilot_hardware | |
| ▲ | profunctor 5 days ago | parent | prev [-] | | The reason is cost, LIDAR is expensive. | | |
| ▲ | kibwen 5 days ago | parent | next [-] | | This information is out of date. LIDAR costs are 10x less than they were a decade ago, and still falling. Turns out, when there's demand for LIDAR in this form factor, people invest in R&D to drive costs down and set up manufacturing facilities to achieve economies of scale. Wow, who could have predicted this‽ | |
| ▲ | throwaway31131 5 days ago | parent | prev | next [-] | | Cost is relative. LIDAR may be expensive relative to a camera or two, but it’s very inexpensive compared to hiring a full-time driver. Crashes aren’t particularly cheap either. Neither are insurance premiums. | |
| ▲ | DennisP 5 days ago | parent | prev | next [-] | | Huawei has a self-driving system that uses three lidars, which cost $250 each (plus vision, radar, and ultrasound). It appears to work about as well as FSD. Here's the Out of Spec guys riding around on it in China for an hour: https://www.youtube.com/watch?v=VuDSz06BT2g | | |
| ▲ | mensetmanusman 5 days ago | parent [-] | | Huawei received over $1 billion in grants from the Chinese government in 2023. Western countries might not be smart enough to keep R&D because Wall Street sees it as a cost center. | | |
| |
| ▲ | ModernMech 5 days ago | parent | prev | next [-] | | You know what used to be expensive? Cameras. Then people started manufacturing them for the mass market and cost went down. You know what else used to be expensive? Structured light sensors. They cost $$$$ in 2009. Then Microsoft started manufacturing the Kinect for a mass market, and in 2010 the price went down to $150. You know what's happened to LIDAR in the past decade? You guessed it, costs have come massively down because car manufacturers started buying more, and costs will continue to come down as they reach mass market adoption. LIDAR's prohibitive cost coming down was always just a matter of time. A "visionary" like Musk should have been able to see that. Instead he thought he could outsmart everyone by using a technology that was not suited for the job, but he made the wrong bet. | | |
| ▲ | jqpabc123 5 days ago | parent [-] | | but he made the wrong bet. This should be expected when someone who is *not* an experienced engineer starts making engineering decisions. |
| |
| ▲ | zbrozek 5 days ago | parent | prev | next [-] | | It's not 2010 anymore. They will asymptotically reach approximately twice the price of a camera, since they need both a transmit and receive optical path. Right now the cheapest of the good LiDARs are around 3-4x that. So we're getting close, and we're already within the realm of large-scale commercial viability. | |
| ▲ | uoaei 5 days ago | parent | prev [-] | | That's ok, they're supposed to be. That's no excuse to rush a bad job. | | |
| ▲ | revnode 5 days ago | parent [-] | | The point of engineering is to make something that’s economically viable, not to slap together something that works. Making something that works is easy, making something that works and can be sold at scale is hard. | | |
| ▲ | uoaei 5 days ago | parent | next [-] | | That's not engineering, that's industry. It's important to distinguish the two. | | |
| ▲ | revnode 5 days ago | parent [-] | | Engineering only exists within industry. Everything else is a hobby. | | |
| ▲ | uoaei 5 days ago | parent [-] | | That's simply not true. Engineering can exist outside industry. "Stuff costs money" is not a governing aspect of these kinds of things. FOSS is the obvious counterexample to your absurdly firm stance, but so are many artistic pursuits that use engineering techniques and principles, etc. | | |
| ▲ | revnode 4 days ago | parent [-] | | Industry includes FOSS and artistic endeavors, anything that’s done professionally. My intent was to exclude research efforts, which are fundamentally different from engineering; engineering is a practical concern, not a “get it to just work” concern. | |
| ▲ | uoaei 4 days ago | parent [-] | | That's an interesting question, the question of whether engineering per se is strictly pragmatic. I personally think drawing a hard line between research and engineering is a misstep and relies too heavily on a bureaucratic kind of metaphysics. |
|
|
|
| |
| ▲ | waldarbeiter 5 days ago | parent | prev [-] | | If it were easy, there would already be a car costing a few million that few could afford but that had solved AD. But there isn't. | |
| ▲ | revnode 5 days ago | parent [-] | | There is no market for such a thing. At that price point, you get a personal chauffeur. That’s what rich people do, and a chauffeur can do stuff that a self-driving system never can. | |
| ▲ | tialaramex 5 days ago | parent [-] | | And the rich people who don't want a chauffeur like driving the car. They will buy a $10M car no problem, but they want driving that car to be fun because that's what they were paying for. They don't want you to make the driving more automatic and less interesting. |
|
|
|
|
|
|
|
| ▲ | hudon 5 days ago | parent | prev | next [-] |
> they're not strictly necessary. We know this because humans can drive without a LIDAR And propellers on a plane are not strictly necessary because birds can fly without them? The history of machines shows that while nature can sometimes inspire the _what_ of the machine, it is a very bad source of inspiration for the _how_. |
| |
| ▲ | ethbr1 5 days ago | parent [-] | | Turns out intelligent design is quicker than evolutionary algorithms. ;) |
|
|
| ▲ | goalieca 5 days ago | parent | prev | next [-] |
> The cold hard truth is that LIDARs are a crutch: they're not strictly necessary. We know this because humans can drive without a LIDAR. However, they are a super useful crutch. Crutch for what? AI does not have human intelligence yet, and let’s stop pretending it does. There is no shame in that, despite what the word "crutch" implies. |
| |
| ▲ | spot5010 5 days ago | parent | next [-] | | I've never understood the argument against lidars (except cost, but even that you can argue can come down). If a sensor provides additional data, why not use it? Sure, humans can drive without lidars, but why limit the AI to using human-like sensors? Why even call it a crutch? IMO it's an advantage over human sensors. | |
| ▲ | bayindirh 5 days ago | parent | next [-] | | > Sure, humans can drive without LIDARs... That's because our stereoscopic vision has infinitely more dynamic range, focusing speed and processing power compared to a computer vision system. Peripheral vision is very good at detecting movement, and central vision can process a tremendous amount of visual data without even trying. Even a state-of-the-art professional action camera system can't rival our eyes in any of these categories. LIDARs and RADARs are useful and should be present in any car. This is the top reason I'm not considering a Tesla: the brain-dead insistence on cameras with small sensors only. | |
| ▲ | iknowstuff 5 days ago | parent [-] | | Their cams have better dynamic range than your eyes, given they can just run multi-exposure while you have to squint in sunlight. Focus is effectively at infinity for driving anyway. You’re not considering them even though they have the best ADAS on the market, lmao. Suit yourself. https://m.youtube.com/watch?v=2V5Oqg15VpQ |
| |
| ▲ | IgorPartola 5 days ago | parent | prev [-] | | I don’t work in this field so take the grain of salt first. Quality of additional data matters. How often does a particular sensor give you false positives and false negatives? What do you do when sensor A contradicts sensor B? “3.6 roentgen, not great, not terrible.” | | |
| ▲ | giveita 5 days ago | parent [-] | | You can say that about human hearing and balance. What if they conflict with vision? We are good at figuring it out. | |
| ▲ | ben_w 5 days ago | parent | next [-] | | We throw up, an evolved response because that conflict is a symptom of poisonous plants messing with us. | |
| ▲ | IgorPartola 5 days ago | parent | prev [-] | | Humans can be confused in a number of ways. So can AI. The difference is that we know pretty well how humans get confused. AI gets confused in novel and interesting ways. | | |
| ▲ | giveita 5 days ago | parent [-] | | Does removing a sense help in that regard (for car driving)? It probably comes down to lidar (and AI) failure modes. | |
| ▲ | IgorPartola 5 days ago | parent [-] | | I suspect it simplifies engineering the system. If you have 30 different sensors, how do you design a system that accounts for seemingly random combinations of them disagreeing with an observation in real time, if a priori you don’t know the weight of each observation in that particular situation? For humans, for example, you know that in most cases seeing something in a car is more important than smelling something. But what if one of your eyes sees a pedestrian and another sees the shadow of a bird? Also don’t forget that as a human you can move your head any which way, and also draw on your past experiences driving in that area. “There is always an old man crossing the road at this intersection. There is a school nearby so there might be kids here at 3pm.” That stuff is not as accessible to a LIDAR. |
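One textbook answer to "what do you do when sensors disagree" is to weight each estimate by how much you trust it, e.g. inverse-variance fusion; a minimal sketch, with made-up sensor noise figures (not any particular vehicle's logic):

```python
# Minimal sketch of inverse-variance sensor fusion: weight each observation by how
# much you trust it rather than picking a single winner. Noise figures are made up.

def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs from different sensors."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * val for w, (val, _) in zip(weights, estimates)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance

# Distance to the lead car, in metres: camera depth is noisy, radar and lidar are tighter.
readings = [
    (23.9, 4.0),   # camera, high variance
    (25.1, 0.25),  # radar, low variance
    (24.8, 0.1),   # lidar, lowest variance
]
print(fuse_estimates(readings))  # result is dominated by the sensors you trust most
```

Real stacks are far more elaborate (Kalman filters, learned fusion), but the principle is the same: no single sensor is trusted outright.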
|
|
|
|
| |
| ▲ | lazide 5 days ago | parent | prev [-] | | I think they meant crutch for the AI so they could pretend for investors that AGI is right around the corner haha |
|
|
| ▲ | jfim 5 days ago | parent | prev | next [-] |
LIDARs have the advantage that they can detect solid objects that a vision-only system has missed. For example, some time ago, a Tesla crashed into an overturned truck, likely because it didn't detect it as an obstacle. A system based only on cameras is only as good as its ability to recognize all road hazards, with no fallback if that fails. With LIDAR, the vehicle might not know from the cameras what the solid object in front of it is, but it knows that something is there and that it should avoid running into it. |
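A toy sketch of the geometric fallback being described here - brake on unclassified LIDAR returns inside the driving corridor even when the vision classifier sees nothing; the thresholds and corridor shape are made up for illustration:

```python
# Toy sketch of the "unknown but solid" fallback: even if the vision classifier
# recognises nothing, LIDAR returns inside the driving corridor trigger braking.
# Thresholds and the corridor geometry are made up for illustration.

def obstacle_in_corridor(lidar_points, corridor_halfwidth_m=1.5, max_range_m=60.0):
    """lidar_points: iterable of (x_forward, y_left, z_up) in metres, vehicle frame."""
    hits = [
        (x, y, z) for (x, y, z) in lidar_points
        if 0.0 < x < max_range_m            # ahead of the car
        and abs(y) < corridor_halfwidth_m   # roughly within our lane
        and 0.3 < z < 3.0                   # above the road surface, below bridge height
    ]
    return len(hits) > 20                   # enough returns to be an object, not noise

def plan_speed(vision_says_clear, lidar_points, cruise_mps=25.0):
    # Vision may simply fail to recognise an overturned truck; LIDAR geometry doesn't care.
    if obstacle_in_corridor(lidar_points):
        return 0.0                          # brake for the unclassified object
    return cruise_mps if vision_says_clear else 0.0
```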
| |
| ▲ | sandworm101 5 days ago | parent [-] | | Solid objects that aren't too dark or too shiny. Lidar is very bad at detecting mirrored surfaces or non-reflecting structures that absorb the particular frequency in use. The back ends of trucks hauling liquid are particularly bad. Block out the bumper/wheels, say by a slight hill, and that polished cone is invisible to lidar. | |
| ▲ | bayindirh 5 days ago | parent | next [-] | | Add one or a couple of RADAR(s), too. European cars use this one weird trick to enable tons of features without harming people or cars. | |
| ▲ | UltraSane 5 days ago | parent | prev [-] | | LIDAR works by measuring the time it takes for light to return, so I don't understand how an object can be too reflective. Objects that absorb the specific wavelength the LIDAR uses are an obvious problem. | |
| ▲ | sandworm101 5 days ago | parent [-] | | Too reflective, like a flat mirror, will send the light off in a random direction rather than back at the detector. Worse yet, things like double reflections can result in timing errors as some of the signal follows a longer path. You want a target that is nicely reflective but not so shiny that you get any double reflections. The ideal is a matte surface painted the same color as the laser. | |
| ▲ | UltraSane 5 days ago | parent [-] | | Ah, it relies on diffuse reflections to guarantee some light returns to the sensor, but specular reflections mean none is returned. This is a good example of why sensor fusion is valuable. |
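For reference, the ranging itself is just distance = c × round-trip time / 2; the failure mode discussed above is about whether enough light comes straight back to be detected at all. A toy illustration with made-up numbers:

```python
# Toy illustration of LIDAR ranging: distance comes from the round-trip time of light,
# but only if enough photons return to the detector. All numbers are made up.
C = 299_792_458.0  # speed of light, m/s

def range_from_return(round_trip_s):
    return C * round_trip_s / 2.0

def detected(emitted_photons, surface):
    # Diffuse surfaces scatter some light back toward the sensor; a flat mirror or a
    # light-absorbing surface sends almost nothing back, so there is no valid return.
    return_fraction = {"diffuse": 1e-6, "mirror": 1e-12, "absorbing": 1e-12}[surface]
    return emitted_photons * return_fraction > 100   # hypothetical detector threshold

print(range_from_return(200e-9))   # ~30 m for a 200 ns round trip
print(detected(1e12, "diffuse"))   # True  -> range measured
print(detected(1e12, "mirror"))    # False -> object effectively invisible
```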
|
|
|
|
|
| ▲ | lazide 5 days ago | parent | prev | next [-] |
| The big promise of autonomous self-driving was that it would be done safer than humans. The assumption was that with similar sensors (or practically worse - digital cameras score worse than eyeballs in many concrete metrics), ‘AI’ could be dramatically better than humans. At least with Tesla’s experience (and with some fudging based on things like actual fatal accident data) it isn’t clear that is actually what is possible. In fact, the systems seem to be prone to similar types of issues that human drivers are in many situations - and are incredibly, repeatedly, dumb in some situations many humans aren’t. Waymo has gone full LiDAR/RADAR/Visual, and has had a much better track record. But their systems cost so much (or at least used to), that it isn’t clear the ‘replace every driver’ vision would ever make sense. And that is before the downward pressure on the labor market started to happen post-COVID, which hurts the economics even more. The current niche of Taxis kinda makes sense - centrally maintained and capitalized Taxis with outsourced labor has been a viable model for a long time, it lets them control/restrict the operating environment (important to avoid those bad edge cases!), and lets them continue to gather more and more data to identify and address the statistical outliers. They are still targeting areas with good climates and relatively sane driving environments because even with all their models and sensors, heavy snow/rain, icy roads, etc. are still a real problem. |
| |
| ▲ | tialaramex 5 days ago | parent [-] | | This whole "But Waymo can't work in bad climates" thing is very dubious. At some point it is too dangerous to be driving an automobile. "But Waymo should also be dangerous" is the wrong lesson. When the argument was Phoenix is too pleasant I could buy that. Most places aren't Phoenix. But SF and LA are both much more like a reasonable place other humans live. It rains, but not always, it's misty, but not always. Snow I do accept as a thing, lots of places humans live have some snow, these cities don't really have snow. However for ice when I watch one of those "ha, most drivers can make this turn in the ice" videos I'm not thinking "I bet Waymo wouldn't be able to do this" I'm thinking "That's a terrible idea, nobody should be attempting it". There's a big difference between "Can it drive on a road with some laying snow?" and "Can it drive on ice?". | | |
| ▲ | lazide 5 days ago | parent [-] | | You know how I can tell you haven’t actually lived in a bad climate? Both SF and LA climates are super cushy compared to say, Northern Michigan. Or most of the eastern seaboard. Or even Kansas, Wyoming, etc. in the winter. In those climates, if you don’t drive in what you’re calling ‘nobody should be attempting it’ weather, you - starve to death in your house over the winter. Because many months are just like that. Self driving has a very similar issue with the vast majority of, say, Asia. Because similarly “this is crazy, no one should be driving like this conditions” is the norm. So if it can’t keep up, it’s useless. Eastern and far Northern Europe has a lot of kinda similar stuff going on. Self driving cars are easy if you ignore the hard parts. In India, I’ve had to deal with Random Camel, missing (entire) road section that was there yesterday, 5 different cars in 3 lanes (plus 3 motorcycles) all at once, many cattle (and people) wandering in the road at day and night, and the so common it’s boring ‘people randomly going the wrong way on the road’. If you aren’t comfortable bullying other drivers sometimes to make progress or avoid a dangerous situation, you’re not getting anywhere anytime soon. All in a random mix of flooding, monsoon rain, super hot temperatures, construction zones, fog, super heavy fireworks smoke, etc. etc. Hell, even in the US I’ve had to drive through wildfires and people setting off fireworks on the road (long story, safety reasons). The last thing I would have wanted was the car freezing or refusing. Is that super safe? Not really. But life is not super safe. And a car that won’t help me live my life is useless to me. Such an AI would of course be a dangerous asshole on, say, LA roads, of course. Even more than the existing locals. | | |
| ▲ | tialaramex 5 days ago | parent [-] | | This idea that they're somehow ignoring the hard parts is very silly. The existing human drivers in San Francisco manage to kill maybe 20 or so people per year so apparently it's not so "easy" that the human drivers can do it without killing anybody. I live in the middle of a city, so, no, in terrible weather just like great weather I walk to the store, no need to "starve to death" even if conditions are too treacherous for people to sensibly drive cars. Because I'm an old man, and I used to live somewhere far from a city, I have had situations where you can't use a car to go fetch groceries because even if you don't care about safety the car can't go up an icy hill, it loses traction, gravity takes over, you slide back down (and maybe wreck the car). | | |
| ▲ | lazide 5 days ago | parent [-] | | So why do you think they’re only those cities? Because I’m hearing nothing from you that goes beyond ‘nuh uh’ so far. Because as an old man who has actually lived in all these places - and also has ridden in Waymos before and has had friends on the Waymo team in the past, your comments seem pretty ridiculous. | | |
| ▲ | tialaramex 5 days ago | parent [-] | | Unlike Phoenix the choice of SF and LA seems to me like a PR choice. SF is where lots of tech nerds live and work, LA is one half of the country's media. I'd imagine that today if you're at all interested in this stuff and live in LA or SF you have ridden Waymo whereas when it was in a Phoenix suburb that's a very niche thing to go do unless you happened to live there. A lot of the large population centres in the US are in these what you're calling "super cushy" zones where there's not much snow let alone ice. More launches in cities in Florida, Texas, California will address millions more people but won't mean more ice AFAIK. So I guess for you the most interesting announcement is probably New York, since New York certainly does have real snow. 2026 isn't that long, although I can imagine that maybe a President who thinks he's entitled to choose the Mayor of New York could mess that up. As to the "But people in some places are crazy drivers" I saw that objection from San Francisco before it was announced. "Oh they'll never try here, nobody here drives properly. Can you imagine a Waymo trying to move anywhere in the Mission?". So I don't have much time for that. |
|
|
|
|
|
|
| ▲ | davidhs 5 days ago | parent | prev | next [-] |
| > Yes machine learning vision systems hallucinate, but so do humans. When was the last time you had full attention on the road and a reflection of light made you super confused and suddenly drive crazy? When was the last time you experienced objects behaving erratically around you, jumping in and out of place, and perhaps morphing? |
| |
| ▲ | hodgesrm 5 days ago | parent | next [-] | | Well, there is strong anecdotal evidence of exactly this happening. We were somewhere around Barstow on the edge of the desert when the drugs began to take hold. I remember saying something like, “I feel a bit lightheaded; maybe you should drive . . .” And suddenly there was a terrible roar all around us and the sky was full of what looked like huge bats, all swooping and screeching and diving around the car, which was going about 100 miles an hour with the top down to Las Vegas. And a voice was screaming: “Holy Jesus! What are these goddamn animals?” [0]
[0] Thompson, Hunter S., “Fear and Loathing in Las Vegas” | |
| ▲ | fipar 5 days ago | parent [-] | | Hopefully we can expect FSD systems not to act like humans on hallucinogens though, right? :) | | |
| ▲ | hodgesrm 4 days ago | parent [-] | | One hopes so. Many of the comments assume an ideal human driver, whereas real human drivers are frequently tired, distracted, intoxicated, or just crazy. |
|
| |
| ▲ | ben_w 5 days ago | parent | prev [-] | | Many accidents are caused by low-angle light dazzle. It's part of why high beams aren't meant to be used off a dual carriageway. When was the last time you saw a paper bag blown across the street and mistook it for a cat or a fox? (Did you even notice your mistake, or do you still think it was an animal?) Do you naturally drive faster on wide streets, slower on narrow streets, because the distance to the side of the road changes your subconscious feeling of how fast you're going? Do you even know, or are you limited to your memories rather than a dashcam whose footage can be reviewed later? etc. Now don't get me wrong, AI today is, I think, worse than humans at safe driving; but I'm not sure how much of that is that AI is more hallucinate-y than us vs. how much of it is that human vision system failures are a thing we compensate for (or even actively make use of) in the design of our roads, and the AI just makes different mistakes. | |
| ▲ | davidhs 5 days ago | parent [-] | | If the internal representation of Tesla Autopilot is similar to what the UI displays, i.e. the location of the car w.r.t. everything else, and we had a human whose internal representation was similar, with everything jumping around in consciousness, we’d be insane to allow him to drive. Self-driving is probably “AI-hard” as you’d need extensive “world knowledge” and be able to reason about your environment and tolerate faulty sensors (the human eyes are super crappy, with all kinds of things that obscure them, such as veins and floaters). Also, if the Waymo UI accurately represents what it thinks is going on “out there”, it is surprisingly crappy. If your conscious experience were like that when you were driving, you’d think you had been drugged. | |
| ▲ | ben_w 5 days ago | parent [-] | | I agree that if Tesla's representation of what their system is seeing is accurate, it's a bad system. The human brain's vision system makes pretty much the exact opposite mistake, which is a fun trick that is often exploited by stage magicians: https://www.youtube.com/watch?v=v3iPrBrGSJM&pp And is also emphasised by driving safety awareness videos: https://www.youtube.com/watch?v=LRFMuGBP15U I wonder what we'd seem like to each other, if we could look at each other's perception as directly as we can look at an AI's perception? Most of us don't realise how much we misperceive because it doesn't feel different in the moment to perceive incorrectly; it can't feel different in the moment, because if it did, we'd notice we were misperceiving. |
|
|
|
|
| ▲ | ethbr1 5 days ago | parent | prev | next [-] |
| > Anyway, I don't think Tesla were wrong to not use LIDAR - they had good reasons to not go down that route. They were excessively expensive and the old style spinning LIDARs were not robust. You could not have sold them on a production car in 2018. The correct move for Tesla would have been to split the difference and add LIDAR to some subset of their fleet, ideally targeted in the most difficult to debug environments. Somewhat like Google/Waymo are doing with their Jaguars. Don't LIDAR 100% of Teslas, but add it to >0%. |
| |
| ▲ | ACCount37 5 days ago | parent [-] | | Tesla did, in fact, use "ground truth vehicles" - vehicles that were owned and operated by Tesla itself, and had high performance LIDARs installed. They were used to collect the data to train the "vision-only" system and verify its performance. Reportedly, they no longer use this widely - but they still have some LIDAR-equipped "scout vehicles" they send into certain environments to collect extra data. | | |
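A minimal sketch of how such LIDAR-equipped vehicles can supply training-time ground truth for a camera-only network, so the LIDAR is needed to make labels rather than at inference; the toy model and tensor shapes are hypothetical, not Tesla's actual training setup:

```python
# Minimal sketch of using LIDAR as training-time ground truth for a camera-only
# depth network: LIDAR makes the labels, but only cameras are needed at inference.
# The toy model and shapes are hypothetical.
import torch
import torch.nn as nn

depth_net = nn.Sequential(                       # toy monocular depth model: RGB in, depth out
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(depth_net.parameters(), lr=1e-4)

def training_step(rgb, lidar_depth, valid_mask):
    """rgb: (B,3,H,W); lidar_depth: (B,1,H,W) projected sparse depth; valid_mask: where LIDAR hit."""
    pred = depth_net(rgb)
    loss = ((pred - lidar_depth)[valid_mask] ** 2).mean()   # supervise only where LIDAR has returns
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```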
| ▲ | ethbr1 4 days ago | parent [-] | | It seems like an own goal not to sell these to some interested and targeted customers then. | | |
| ▲ | ACCount37 4 days ago | parent [-] | | Who would buy those and why? They don't use LIDARs for better self-driving somehow. They're just data harvesting units with wheels. And I don't think there's a large and underserved market for LIDARs on wheels. | | |
| ▲ | ethbr1 4 days ago | parent [-] | | > Who would buy those and why? [...] They're just data harvesting units with wheels. Tesla would subsidize them and offer them at the same price as non-LIDAR models, to select customers in target areas. And yes, you answered the second part of your own question. |
|
|
|
|
|
| ▲ | marcos100 5 days ago | parent | prev | next [-] |
| I want my self-driving car to be a better driver than any human. Sure we can drive without LIDAR, but just look up the amount of accidents caused by humans. |
| |
| ▲ | paulryanrogers 5 days ago | parent [-] | | Humans cause roughly one fatal accident per hundred million miles. (They have no backup driver they can disengage to.) Now just look up how many disengagements per million miles Tesla has. | |
| ▲ | Eisenstein 5 days ago | parent [-] | | Can you make your point without the stat, or provide the stat for us please? |
|
|
|
| ▲ | lukeschlather 5 days ago | parent | prev | next [-] |
I had taken for granted that the cameras in the Tesla might be equivalent to human vision, but now I'm realizing that's probably laughable. I'm reading it's 8 cameras at 30fps, and it sounds like the car's bus can only process about 36fps total (not the 8x30 = 240fps theoretically available from the cameras, if they had a better memory bus). It also seems plausible you would need at least 10,000 FPS to fully match human vision (especially taking into account that humans turn their heads, which in a CV situation could be analogous to the CV algorithm having 32x30 = 960 FPS available but typically only processing 140 frames in a given second from cameras pointing in a specific direction). So maybe LIDAR isn't necessary, but if Tesla were actually investing in cameras with a memory bus that could approximate the speed of human vision, I doubt it would be cheaper than LIDAR to get the same result. |
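Back-of-the-envelope version of that arithmetic; the camera count, frame rates, and processing budget are the ones quoted above, and the per-frame resolution is a guess:

```python
# Back-of-the-envelope check of the bandwidth argument. Camera count, frame rate, and
# processing budget come from the comment above; the per-frame resolution is a guess.
cameras = 8
fps_per_camera = 30
frames_available = cameras * fps_per_camera     # 240 frames/s offered by the sensors
frames_processed = 36                           # claimed total processing budget

width, height, bytes_per_pixel = 1280, 960, 1   # hypothetical ~1.2 MP mono frames
bytes_per_frame = width * height * bytes_per_pixel

print(frames_available, "frames/s available,", frames_processed, "processed")
print("utilization: %.0f%%" % (100 * frames_processed / frames_available))
print("raw sensor output: %.1f MB/s" % (frames_available * bytes_per_frame / 1e6))
```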
| |
| ▲ | tialaramex 5 days ago | parent | next [-] | | Mostly human vision is just violently different from a camera, but you could interpret that as a mix of better and worse. One of the ways it's better is that humans can sense individual photons. Not 100% reliably, but pretty well, which is why humans can see faint stars on a dark night without any special tools even though the star is thousands of light years away. On the other hand, our resolution for most of our field of vision is pretty bad - this is compensated for by changing what we're looking at: when we care about details we can just look directly at them, and the resolution is better right in the centre of the picture. | |
| ▲ | asats 5 days ago | parent | prev [-] | | Also, human vision is backed by general intelligence, which those cameras very much are not. |
|
|
| ▲ | DennisP 5 days ago | parent | prev | next [-] |
| You might not have the classical guardrails, but you are providing the neural net with a lot more information. Even humans are starting to find it useful to get inputs from other sensor types in their cars. I agree that Tesla may have made the right hardware decision when they started with this. It was probably a bad idea to lock themselves into that path by over-promising. |
|
| ▲ | phinnaeus 5 days ago | parent | prev | next [-] |
| Humans have the most sophisticated processing unit in the known universe to handle the data from the eyes. Is the brain a crutch? |
| |
| ▲ | bayindirh 5 days ago | parent [-] | | At least for one marine creature, whose name I forget, the answer is yes. Said creature dissolves its brain the moment it can find a place to attach and call home. | |
| ▲ | shagie 4 days ago | parent | next [-] | | Sea squirt. One of the simplest members of Chordata. https://en.wikipedia.org/wiki/Ascidiacea | |
| ▲ | chronogamous 5 days ago | parent | prev [-] | | Can't think of the name atm either, but I'm pretty sure it only does so because it would be pointless to make any further decisions after attaching itself - it simply has no means to act on anything after that... the attaching is the only thing it 'does' in its life... after that, its only job, and only ability, is to be. Choose the wrong spot to attach and call home? Brains wouldn't make a bit of difference (unless regretting its one life choice is somehow useful during this stage of just being, being stuck on the spot). |
|
|
|
| ▲ | uoaei 5 days ago | parent | prev | next [-] |
| This impulse to limit robots to the capacities, and especially the form factors, of humans has severely limited our path to progress and a more convenient life. Robots are supposed to make up for our limitations by doing things we can't do, not do the things we can already do, but differently. The latter only serves to replace humans, not augment them. |
|
| ▲ | DonHopkins 5 days ago | parent | prev | next [-] |
| I'd rather cars have crutches than the people they run over. |
|
| ▲ | fluidcruft 5 days ago | parent | prev | next [-] |
Musk's argument "Humans don't have LIDAR, therefore LIDAR is useless" has always seemed pretty dumb to me. It ignores the possibility that LIDAR might be a superhuman sensor enabling superhuman performance. And we also know you can get superhuman performance on certain tasks with insect-scale brains. Musk's just spewing stoner marketing crap that stoners think is deep, not actual engineering savvy. (And that's not even addressing that human vision is fundamentally a weird sensory mess, full of strange evolutionary baggage that doesn't even make sense except as genetic legacy.) |
| |
| ▲ | mixedbit 5 days ago | parent [-] | | Musk's argument also ignores the intelligence of humans. The worst-case upper bound for reaching human-level driving performance without LIDAR is for AI to reach human-level intelligence. Perhaps that is not required, but until we see self-driving Teslas performing as well as humans, we won't know. The worst-case scenario is that Tesla unsupervised self-driving is as far away as AGI. |
|
|
| ▲ | maxerickson 5 days ago | parent | prev | next [-] |
| You could write a rant like this about 4 vs 3 wheels. |
|
| ▲ | inciampati 5 days ago | parent | prev | next [-] |
| I wish I had radar eyes |
| |
| ▲ | UltraSane 5 days ago | parent [-] | | I want to see gamma rays, I want to hear X-rays, and I want to smell dark matter. | | |
| ▲ | RaftPeople 5 days ago | parent [-] | | "I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate". |
|
|
|
| ▲ | ModernMech 5 days ago | parent | prev [-] |
> Vision systems were improving a lot back then, so the idea that you could have FSD on vision alone was plausible. This was only plausible to people who had no experience in robotics, autonomy, and vision systems. Everyone knew LIDAR was the enabling technology thanks to the 2007 DARPA Urban Challenge. But the ignoramus Elon Musk decided he knew better and spent the last decade+ trashing the robotics industry. He set us back as far as safety protocols in research and development, caused the first death due to robotic cars, deployed them on public roads without the consent of the public by throwing around his massive wealth, lied consistently for a DECADE about the capabilities of these machines, defrauded customers and shareholders while becoming richer and richer, all to finally admit defeat while he still maintains that the growth story for Tesla's future remains in robotics. The nerve of this fucking guy. |