dreamcompiler 6 days ago

Not gonna happen as long as Musk is CEO. He's hard over on a vision-only approach without lidar or radar, and it won't work. Companies like Waymo that use these sensors and understand sensor fusion are already eating Tesla's lunch. Tesla will never catch up with vision alone.

Rohansi 6 days ago | parent | next [-]

While I don't think vision-only is hopeless (it works for human drivers), the cameras on Teslas are not at all reliable enough for FSD. They have little to no redundancy, and only the forward-facing camera can (barely) clean itself. Even if they got their vision-only FSD to work nicely, it'll need another hardware revision to resolve this.

vbezhenar 5 days ago | parent | next [-]

I feel like our AI research in the physical world falls so far behind language-level AI that our reasoning might be clouded.

Compare a Boston Dynamics robot and a cat. They are on absolutely different levels in their bodies and their ability to control them.

I have no doubt that cameras alone could absolutely work for AI cars, but at the same time I feel that this kind of AI is not there yet. If we want autonomous cars, it might be possible, but we need to equip them with as many sensors as necessary, not set artificial boundaries.

threatofrain 5 days ago | parent [-]

But lidar is basically a cheat code, whether or not optical is sufficient. Why wait for end stage driving AI? Why not use cheat codes and wait for cheaper technology later?

Rohansi 5 days ago | parent [-]

I honestly think Tesla is past the point where lidar would provide significant benefits. I've tried FSD for a month or two and it can see everything but just drives like an idiot. Lidar isn't going to help it merge properly, change lanes smoothly, take left turns at lights without blocking traffic, etc.

Check out what the Tesla park assist visualization shows now. It's vision based and shows a 3D recreation of the world around the car. You can pan around to see what's there and how far away it is. It's fun to play around with in drive thrus, garages, etc. just to see what it sees.

threatofrain 5 days ago | parent [-]

It should help disambiguate the scenarios that lead to phantom stops or failures to stop in time, which have killed Tesla drivers before, such as by driving full speed into the back of a truck in glare.
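
A toy sketch of that disambiguation argument (my illustration, not any shipping stack's logic): when glare washes out the camera, an independent range sensor can still veto a "path is clear" verdict.

    # Illustrative only: trust the camera's "clear" call only when no
    # ranged return (lidar/radar) contradicts it within braking distance.
    def path_clear(camera_says_clear: bool, lidar_range_m: float,
                   braking_distance_m: float) -> bool:
        return camera_says_clear and lidar_range_m > braking_distance_m

    print(path_clear(True, 120.0, 60.0))  # True: both sensors agree
    print(path_clear(True, 35.0, 60.0))   # False: glare fooled the camera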

Rohansi 5 days ago | parent [-]

Maybe? I don't remember the cases, but there is sometimes confusion between Autopilot (cruise control) and FSD. Autopilot is a completely different system, and nobody should be surprised if it leads to crashes when misused.

moogly 6 days ago | parent | prev | next [-]

> While I don't think vision-only is hopeless (it works for human drivers)

I guess you don't drive? You use more senses than just vision when driving a car.

figassis 5 days ago | parent | next [-]

Behavioral and pattern analysis is always in full overdrive when I drive. I drive in Africa, where people never follow the rules: red lights at crossings mean go for bikers, and when there are no lights you can't just give right of way, or you'll never move. When nearing intersections, people accelerate so they can cross before you, and it's a long line, and they know you have right of way, so they accelerate to scare you into stopping. Amateurs freeze and hold up the line for a very long time, usually until a traffic officer shows up to unblock it (multiply this by every intersection). To get anywhere, you have to play the same game and get close enough that they aren't sure you'll stop, knowing that if they keep going they'll hit you and have to pay. So at crossings you're constantly in near misses until they realize you're not going to stop, so they do. Everyone is skilled enough to do this daily.

Your senses, your risk analysis, your spider sense are fully overclocked most of the time. And then there are all the other crazy things that happen: all the insane lane zig-zagging people do, bikers out of nowhere at night with no lights, memorizing all the potholes on every road in the city because they aren't illuminated at night so you can drive at 80-120 km/h, etc. So no, it's not just your eyes. Lots of sensors, memory, processing, and mapping are required.

bhaney 5 days ago | parent | prev | next [-]

Personally, I can smell a left turn signal from nearly three blocks away

okr 5 days ago | parent [-]

The spider crawling out from behind the car mirror has seen things that are far beyond what I will ever experience visually!

Rohansi 6 days ago | parent | prev | next [-]

And which ones can't be replicated with hardware?

scrollaway 5 days ago | parent | next [-]

Even without going beyond the vision sense, there are features of vision Tesla doesn't properly try to replicate: depth perception, for example (it does depth perception very differently from humans).

You also do use your ears when driving.

rogerrogerr 5 days ago | parent | next [-]

Binocular depth perception stops being useful somewhere around 10 meters. Your brain is mostly driving using "computed" depth perception based on the flat image it's getting, the same way Tesla gets a depth map.

Provable by one-eyed people being able to drive just fine, as could you with one eye covered.
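
For what it's worth, the falloff is easy to estimate. A back-of-the-envelope sketch (the baseline and acuity figures are common textbook estimates, not numbers from this thread): for a fixed angular disparity acuity, stereo depth error grows with the square of distance.

    # How fast binocular depth resolution degrades with distance.
    # Assumed: ~6.5 cm interpupillary baseline, ~1 arcminute disparity acuity.
    import math

    BASELINE_M = 0.065
    ACUITY_RAD = math.radians(1 / 60)

    for depth_m in (2, 5, 10, 20, 50):
        # dZ ~ Z^2 * dtheta / B
        error_m = depth_m**2 * ACUITY_RAD / BASELINE_M
        print(f"{depth_m:>3} m -> +/- {error_m:5.2f} m ({100 * error_m / depth_m:4.1f}%)")

At 10 m the uncertainty is already about half a meter, and it grows quadratically from there, consistent with the claim above.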

vbezhenar 5 days ago | parent | prev | next [-]

One-eyed people are allowed to drive.

scrollaway 5 days ago | parent [-]

What’s your point? I was answering a question, not making a statement about any disabilities.

vbezhenar 5 days ago | parent [-]

My point is that "hardware" depth perception is not necessary for successful driving. Just one camera should be enough; the rest is algorithms.

ModernMech 5 days ago | parent [-]

Eyes are not cameras; they are extensions of the brain. That people can drive with one eye is not a "proof of concept" that cars should be able to drive with one camera. You'd need a human brain to go along with it. Unfortunately for Tesla, those seem to be in short supply at the moment.

rogerrogerr 5 days ago | parent [-]

So your assertion is that a human with access to arbitrarily good camera feeds could not drive a car at level 5? That something magical is happening because the eyes are close topographically to the brain? Sounds implausible.

ModernMech 5 days ago | parent | next [-]

How does the human consume the arbitrarily good camera feeds?

> That something magical is happening because the eyes are close topographically to the brain?

It sounds to me like you should study what eyes actually are. It's not about proximity or magic; they are a part of your brain, and we're only beginning to understand their complexities. Eyes are not just sensory organs, so the analogy to cameras is way off. They are able to discern edges, motion, color, and shapes, as well as correct errors, before your brain is even aware.

In robotics, we only get this kind of information after the camera image has been sent through a perception pipeline, often incurring a round trip through some sort of AI and a GPU at this point.

> Sounds implausible.

Musk just spent billions of dollars and the better part of a decade trying to prove the conjecture that "cameras are sufficient", and now he's waving the white flag. So however implausible it sounds, it's now more implausible than ever that cameras alone are sufficient.

JumpCrisscross 4 days ago | parent | prev [-]

> your assertion is that a human with access to arbitrarily good camera feeds could not drive a car at level 5?

No. I live in snow country. Folks with vestibular issues are advised to pull over in snowstorms because sometimes the only indication that you have perpendicular velocity and are approaching a slide off the road or spin is that sense. My Subaru has on more than one occasion noticed a car before I did based on radar.

Vision-only was a neat bet. But it will cost Tesla first-to-market status generally, and especially in cities, where regulators should have fair scepticism about a company openly trying to do self-driving on the cheap.

rogerrogerr 4 days ago | parent [-]

Teslas definitely have accelerometers/gyros, and have access to the torque and RPM on every wheel. They have a much better picture of the 3D motion of the car relative to the road than any human driver.
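
As a minimal sketch of the kind of check those sensors enable (my illustration, not Tesla's actual software): a car that is tracking its heading shows lateral acceleration close to speed times yaw rate, so a mismatch suggests the sideslip the vestibular comment above describes.

    # Flag likely sideslip by comparing measured lateral acceleration
    # against the value implied by speed * yaw rate (holds when the car
    # is actually following its heading rather than sliding).
    def sideslip_suspected(speed_mps: float, yaw_rate_rps: float,
                           lateral_accel_mps2: float,
                           threshold_mps2: float = 1.0) -> bool:
        expected = speed_mps * yaw_rate_rps  # centripetal accel if gripping
        return abs(lateral_accel_mps2 - expected) > threshold_mps2

    print(sideslip_suspected(20.0, 0.1, 2.1))  # False: consistent turn
    print(sideslip_suspected(20.0, 0.1, 0.3))  # True: yawing but not turning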

ModernMech 4 days ago | parent [-]

Dynamics don't help when you are blinded by the sun or can't discern the broadside of a firetruck.

rogerrogerr 4 days ago | parent [-]

Cameras can clearly discern the broadside of a firetruck. That some earlier build didn't detect one doesn't change the fact that firetrucks reflect plenty of photons to be detectable.

I'm consistently surprised by how immune to sun-blindness my car is. It regularly reads traffic lights that have the sun right next to them; I've never seen any discernible degradation due to too much light, too little light, or bad contrast of any kind.

You're just bringing up a never-ending stream of but-what-abouts, so I'm done refuting them after this. It's not a good use of my time.

ModernMech 3 days ago | parent [-]

Your personal experience with your car doesn't change that Tesla is waving the white flag due to the fact the sensor system Musk insisted on has caused deaths and is too unreliable to deliver full autonomy. The sun has confounded Tesla autonomy since its inception, and its shortcomings caused multiple decapitations: https://www.latimes.com/business/la-fi-tesla-florida-acciden...

> You're just bringing up a never-ending stream of but-what-abouts

By "what abouts" you of course mean "shortcomings of camera-only systems that make them unsuitable for full autonomy."

> It's not a good use of my time.

No it's not, it's a losing battle, and Musk has admitted it. Camera-only systems will not enable full self driving. Y'all got scammed.

mlindner 5 days ago | parent | prev | next [-]

Teslas have and use microphones.

scrollaway 5 days ago | parent [-]

GP asked specifically about the vision-only approach. Vision only means no microphones, regardless of whether Tesla has any…

What is up with HN today? Was there a mass stroke?

rogerrogerr 5 days ago | parent [-]

“Vision-only” colloquially means no LIDAR and other expensive sensors, not the exclusion of microphones (which are hilariously cheap).

dtj1123 5 days ago | parent | prev | next [-]

One-eyed, deaf people can drive.

gizajob 5 days ago | parent | prev [-]

Deaf people can drive fine.

ndesaulniers 6 days ago | parent | prev | next [-]

...taste?

moogly 6 days ago | parent | prev | next [-]

Ask Musk; he's the one who claims that sensor fusion does not work.

tester756 5 days ago | parent | prev [-]

intuition?

terminalshort 5 days ago | parent | prev | next [-]

Yeah, but you can drive on vision alone. Deaf people are allowed to drive just the same as anyone else.

asadotzler 5 days ago | parent [-]

It's not just hearing. I can "feel" the seat of my pants, the pull of the steering wheel, etc. I have a vestibular system that knows about relative velocities and changes, which coordinates with my other senses, and more. This all allows me to take in far more than what my eyes see or my ears hear, and to build the correct intuitions and muscle memories to get good at driving and adapt to new driving environments.

ndsipa_pomu 5 days ago | parent | prev | next [-]

> You use more senses than just vision when driving a car

Deaf drivers (may include drivers playing loud music too) don't, unless they're somehow tasting the other vehicles.

ChrisMarshallNY 5 days ago | parent | next [-]

We have these things called "inner ears." I'm pretty sure deaf people have them, too.

Nature's accelerometers.

I've had mine go bad, and it wasn't fun.

Just sayin'...

ndsipa_pomu 5 days ago | parent | next [-]

Were you unable to drive when your inner ears weren't functioning?

ChrisMarshallNY 5 days ago | parent [-]

I guess so.

I was unable to stand up.

ndsipa_pomu 5 days ago | parent [-]

Sounds horrible. I can understand that stopping you from cycling, but if you could have managed to sit in a car, would you have been able to drive it? I can imagine that inner ear issues can sometimes affect vision too, as my wife suffered from positional vertigo for a while and I could see her eyes flicking rapidly when she was getting dizzy. (I did find a helpful YouTube video about a sequence of positions to put the sufferer through, which basically helps move the otoliths out of the semicircular canal.)

ChrisMarshallNY 5 days ago | parent | next [-]

In my case, it was a brain tumor. Took a bit more than Lotus Position.

It all came out OK, in the end, but it was touch-and-go for a while.

ndsipa_pomu 5 days ago | parent [-]

Ouch!

Not quite a Lotus Position, but I used the Epley Maneuver on her which immediately lessened her symptoms: https://en.wikipedia.org/wiki/Epley_maneuver

robocat 5 days ago | parent | prev [-]

When the vertigo is bad, you can't even go as a passenger in the car because the movement is literally sickening.

Even driving with mild vertigo could be difficult because you want to restrict your head movement.

Source: my dad gets benign paroxysmal positional vertigo (BPPV).

ndsipa_pomu 4 days ago | parent [-]

I'd recommend he try the Epley Maneuver, as it's quick and easy to do (needs someone to help, though) and is unlikely to make anything worse.

robocat 3 days ago | parent [-]

Thanks. I've tried to encourage him to learn it. He's stubborn and isn't interested. He's had physio do it when he was hospitalized...

He's mentally sharp, and has a science background, but nope!

asadotzler 5 days ago | parent | prev [-]

vestibular system

vel0city 5 days ago | parent | prev | next [-]

There are more than three senses.

ndsipa_pomu 5 days ago | parent [-]

Yes and they're not really of much use in driving safely unless you're referring to some spidey-sense of danger.

vel0city 5 days ago | parent | next [-]

I'm using inertial senses from my inner ear. I feel the suspension through the seat. I feel feedback through the steering wheel. I can feel the g forces pulling on my body.

ndsipa_pomu 5 days ago | parent [-]

Yes, but in what specific circumstances do they change your driving behaviour? If you weren't able to feel the suspension through your seat, how would your driving become less safe?

vel0city 5 days ago | parent | next [-]

One quick, obvious example: they put tactile features on the road specifically so you can feel them. Little bumps on lane markers. Rumble strips on the boundaries. Obvious features like that.

While it doesn't often snow or ice up here (it does sometimes), it does rain a good bit from time to time. You can usually feel your car start to hydroplane and lose traction well before anything else goes wrong. It's an important thing to feel but you wouldn't know it's happening if you're going purely on vision.

You can often feel when there's something wrong with your car. Vibrations due to alignment or balance issues. Things like that.

Those are quick examples off the top of my head. I'm sure there are more.

Of course, all these things can be tracked with extra sensors; I'm not arguing humans are entirely unique in being able to sense these things. But they are important bits of feedback for operating your car safely in a wide range of conditions that you will probably encounter, and they should be accounted for in the model.

As for auditory feedback: while some drivers don't have sound input available to them (whether they're deaf or their music is too loud or whatever), sound is absolutely a useful input to have. You may hear emergency vehicles you cannot see. You may hear honking alerting you to something weird going on in a particular direction. You may hear issues with your car. Those rumble strips are also tuned to be loud when cars run over them. You can hear big wind gusts and understand they are the source of weird forces pushing the car around, as opposed to other things making your car behave strangely. So sure, one can drive a car without sound, but it's not better without it.
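
A rough illustration of how the wheel-slip part can be sensed in software (a sketch with assumed thresholds, not any OEM's algorithm): hydroplaning shows up as a driven wheel spinning faster than the vehicle's actual ground speed.

    # Slip ratio is the standard measure; ~10-20% slip is roughly where
    # grip peaks and then falls off. Thresholds here are assumptions.
    def slip_ratio(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
        if vehicle_speed_mps < 0.5:  # avoid dividing by ~zero at rest
            return 0.0
        return (wheel_speed_mps - vehicle_speed_mps) / vehicle_speed_mps

    def traction_lost(wheel_speeds: list[float], vehicle_speed_mps: float,
                      threshold: float = 0.15) -> bool:
        return any(abs(slip_ratio(w, vehicle_speed_mps)) > threshold
                   for w in wheel_speeds)

    # At 25 m/s, one wheel reading 30 m/s suggests it is aquaplaning.
    print(traction_lost([30.0, 25.2, 25.1, 24.9], 25.0))  # True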

MangoToupe 5 days ago | parent | prev [-]

Pretty much all of them. The difference between driving a car and playing a video game is remarkable.

But that's sort of beside the point: why would you not use additional data when the price of the sensors is baked into the feature you're selling?

tombert 5 days ago | parent | prev [-]

I am not 100% sure which “sense” this would be, but when I drive I can “feel” the texture of the road and intuit roughly how much traction I have. I’m not special, every driver does this, consciously or not.

I am not saying that you couldn't do this with hardware; I'm quite confident you could, actually. I'm just saying that there are senses other than sight and sound at play here.

ndsipa_pomu 5 days ago | parent [-]

Whilst that tactile feedback might feel reassuring, I doubt there are many situations apart from driving on snow and ice where it's of much use. Fair enough if you're aiming for a lap record round a track, but otherwise you shouldn't be anywhere near the limit of traction of your tyres.

tombert 5 days ago | parent [-]

Snow, ice, and rain are cases that still need to be accounted for so that really doesn’t dispel anything I said.

renewiltord 6 days ago | parent | prev [-]

We allow deaf people to drive but not people who are entirely blind. This means vision is necessary and sufficient.

The problem is clearly a question of the fidelity of the vision and our ability to slave a decision maker and mapper to it.

bkettle 6 days ago | parent | prev | next [-]

> it works for human drivers

Sure, for some definition of "works"...

https://www.iihs.org/research-areas/fatality-statistics/deta...

Rohansi 5 days ago | parent [-]

Vision is almost certainly not the main issue with humans as drivers.

NaomiLehman 5 days ago | parent [-]

It's one of the reasons.

Rohansi 5 days ago | parent [-]

For sure, but my phone camera sees better than I do. Cars can make use of better camera sensors and have more than two of them. You can't just extrapolate the conclusion that human vision bad = vision sensors bad.

NaomiLehman 5 days ago | parent | next [-]

We can't conclude that LIDAR is better than a camera? Is it worth cutting the cost? LIDAR has everything that a camera has, plus more.

shpx 5 days ago | parent | prev [-]

Cameras are nowhere near the fidelity and responsiveness of human eyes.

SalmoShalazar 6 days ago | parent | prev | next [-]

Such utter drivel. A camera is not the equivalent of human eyes and sensory processing, let alone an entire human being engaging with the physical world.

terminalshort 5 days ago | parent | next [-]

Cameras are better than human eyes. Much better. There are areas in which they are worse, but that's completely outweighed by the fact that you are not limited to two of them, and they can have a 360-degree field of vision.

FireBeyond 5 days ago | parent [-]

What garbage. The human eye has about 20 stops of dynamic range. Cameras of the size that are in a Tesla are at about 12 stops. That's a lot of data they don't get, for just one thing. Human eyes can also adjust focal distance multiple times a second, which camera lenses have a harder time doing.
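
For scale, stops are a log2 measure, so taking the figures above at face value the gap is 2^8 = 256x in representable contrast:

    # Each photographic stop doubles the representable contrast.
    eye_stops, camera_stops = 20, 12  # figures from the comment above
    print(f"eye:    {2**eye_stops:>9,}:1")              # 1,048,576:1
    print(f"camera: {2**camera_stops:>9,}:1")           #     4,096:1
    print(f"gap:    {2**(eye_stops - camera_stops)}x")  # 256x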

terminalshort 4 days ago | parent [-]

For one tiny portion of the cameras' 360-degree field of vision, yes. For the rest, they have 0 stops.

Rohansi 5 days ago | parent | prev [-]

The best cameras are surely better than most people's eyes these days.

Sensory processing is not matched, sure, but IMO how a human drives is more involved than it needs to be. We only have two eyes and they both look in the same direction. We need to continuously look around to track what's around us. It demands a lot of attention from us that we may not always have to spare, especially if we're distracted.

rcxdude 5 days ago | parent | next [-]

>The best cameras are surely better than most peoples' eyes these days.

Not on all metrics, especially not simultaneously. The dynamic range of human eyes, for example, is extremely high.

Rohansi 5 days ago | parent [-]

The front camera Tesla is using is very good with this. You can drive with the sun shining directly into it and it will still detect everything 99% of the time, at least with my older Model 3. Way better than me, stuck looking at the pavement directly in front of the car.

AFAIK there is also more than one front camera. Why would anyone try to do it all with one or two camera sensors like humans do it?

It's important to remember that the cameras Tesla are using are optimized for everything but picture quality. They are not just taking flagship phone camera sensors and sticking them into cars. That's why their dashcam recordings look so bad (to us) if you've ever seen them.

kivle 5 days ago | parent | prev [-]

Well, Teslas use low-cost consumer cameras, not DSLRs. Bad framerate, bad resolution, and bad dynamic range. Very far from human vision, and easily blinded and completely washed out by sudden shifts in light.

matthewdgreen 4 days ago | parent | next [-]

You can compare the size of the cameras used in a Tesla with the size (of the lenses, at least) on the Waymo rig, and they do not look like they're in the same league, optically.

rogerrogerr 5 days ago | parent | prev [-]

I’m consistently surprised by how my Tesla can see a traffic light with the sun directly behind it. They seem to have solved the washout problem in practice.

mbrochh 4 days ago | parent | prev [-]

Uh... why don't they put the cameras... into the car (it works for human drivers)???

formercoder 6 days ago | parent | prev | next [-]

Humans drive without LIDAR. Why can’t robots?

cannonpr 6 days ago | parent | next [-]

Because human vision has very little in common with camera vision and is a far more advanced sensor, on a far more advanced platform (ability to scan and pivot etc), with a lot more compute available to it.

torginus 5 days ago | parent | next [-]

I don't think it's a sensors issue - if I gave you a panoramic feed of what a Tesla sees on a series of screens, I'm pretty sure you'd be able to learn to drive it (well).

lstodd 6 days ago | parent | prev | next [-]

Yeah, try matching a human eye on dynamic range, and then on angular speed, and then on refocus. Okay, forget that.

Try matching a cat's eye on those metrics. And it is much simpler than the human one.

terminalshort 5 days ago | parent | next [-]

Who cares? They don't need that. The cameras can have continuous attention on a 360 degree field of vision. That's like saying a car can never match a human at bipedal running speed.

dmos62 5 days ago | parent | prev [-]

I'm curious, in what ways is a cat's vision simpler?

lstodd 4 days ago | parent [-]

Less far sight, dichromatic color vision, over-optimized for low light.

A cursory glance did not find studies on cat peripheral vision, but I would assume it's worse than a human's, if only because cats rely more on audio.

insane_dreamer 5 days ago | parent | prev [-]

The human sensor (eye) isn't more advanced in its ability to capture data -- and in fact cameras can have a wider range of frequencies.

But the human brain can process the semantics of what the eye sees much better than current computers can process the semantics of the camera data. The camera may be able to see more than the eye, but unless it understands what it sees, it'll be inferior.

Thus a Tesla spontaneously activating its windshield wipers to "remove something obstructing the view" (happens to my Model 3 as well), whereas the human brain knows that there's no need to do that.

Same for Tesla braking hard when it encountered an island in the road between lanes without clear road markings, whereas the human driver (me) could easily determine what it was and navigate around it.

phire 6 days ago | parent | prev | next [-]

Why tie your hands behind your back?

LIDAR based self-driving cars will always massively exceed the safety and performance of vision-only self driving cars.

Current Tesla cameras+computer vision is nowhere near as good as humans. But LIDAR based self-driving cars already have way better situational awareness in many scenarios. They are way closer to actually delivering.

kimixa 6 days ago | parent | next [-]

And what driver wouldn't want extra senses, if they could actually meaningfully be used? The goal is to drive well on public roads, not some "Hands Tied Behind My Back" competition.

tliltocatl 5 days ago | parent | prev [-]

Because any active sensor is going to jam other such sensors once there are too many of them on the road. This is sad but true.

Sharlin 5 days ago | parent | prev | next [-]

And birds fly without radar. Still, we equip planes with it.

apparent 6 days ago | parent | prev | next [-]

The human processing unit understands semantics much better than the Tesla's processing unit. This helps avoid what humans would consider stupid mistakes, but which might be very tricky for Teslas to reliably avoid.

randerson 6 days ago | parent | prev | next [-]

Even if they could: Why settle for a car that is only as good as a human when the competitors are making cars that are better than a human?

dotancohen 5 days ago | parent [-]

Cost, weight, and reliability. The best part is no part.

No part costs less; it also doesn't break, doesn't need to be installed, nor stocked on every dealership's shelf, nor can a supplier hold up production. It doesn't add wires (complexity and size) to the wiring harness, or clog up the CAN bus message queue (LIDAR is a lot of data). It also does not need another dedicated place engineered for it, further constraining other systems and crash safety. Not to mention the electricity used, a premium resource in an electric vehicle of limited range.

That's all off the top of my head. I'm sure there are even better reasons out there.
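
Rough arithmetic behind the "LIDAR is a lot of data" point, using assumed figures for a mid-range automotive unit (none of these numbers come from the thread):

    # A point cloud stream easily swamps a classic CAN bus.
    points_per_sec = 600_000   # assumed scan rate
    bytes_per_point = 12       # assumed: x, y, z, intensity, timestamp
    lidar_mbps = points_per_sec * bytes_per_point * 8 / 1e6

    print(f"lidar:      ~{lidar_mbps:.0f} Mbit/s")  # ~58 Mbit/s
    print("classic CAN:  1 Mbit/s")                 # far too small
    print("CAN FD:      ~8 Mbit/s data phase")      # still too small
    print("100BASE-T1: 100 Mbit/s Ethernet")        # why Ethernet comes up below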

randerson 5 days ago | parent | next [-]

These are all good points. But that just seems like it adds cost to the car. A manufacturer could have an entry-level offering with just a camera and a high-end offering with LIDAR that costs extra for those who want the safest car they can afford. High-end cars already have so many more components and sensors than entry-level ones. There is a price point at which the manufacturer can make them reliable, supply spare parts & training, and increase the battery/engine size to compensate for the weight and power draw.

terminalshort 5 days ago | parent | next [-]

We already have that. Tesla FSD is the cheap camera only option and Waymo is the expensive LIDAR option that costs ~150K (last time I heard). You can't buy a Waymo, though, because the price is not practical for an individually owned vehicle. But eventually I'm sure you will be able to.

asadotzler 5 days ago | parent [-]

LIDAR does not add $150K to the cost. Dramatically customizing a production car, and adding everything it needs costs $150K. Lidar can be added for hundreds of dollars per car.

dotancohen 5 days ago | parent [-]

> Lidar can be added for hundreds of dollars per car.

Surprisingly, many production vehicles have a manufacturer profit under one thousand dollars. So that LIDAR would eat a significant portion of the margin on the vehicle.

matthewdgreen 4 days ago | parent [-]

But that’s sort of the point of the business model. Getting safe fully-self driving vehicles appears to require a better platform, given today’s limitations. You can achieve that better platform financially in a fleet vehicle where the cost of the sensors can be amortized over many rides, and the “FSD” capability translates directly into revenue. You can’t put an adequate sensor platform into a consumer vehicle today, which is what Tesla tried to promise and failed to deliver. Maybe someday it will be possible, but the appropriate strategy is to wait until that’s possible before selling products to the consumer market.

dotancohen 5 days ago | parent | prev [-]

Not with Teslas. There are almost no options on a Tesla - it's mostly just colours and wheels once you've selected a drivetrain.

dygd 5 days ago | parent | prev [-]

Teslas use automotive Ethernet for sensor data, which has much more bandwidth than the CAN bus.

dotancohen 5 days ago | parent [-]

But also higher latency. Teslas also use a CAN bus.

But LIDAR would probably be wired more directly to the computer rather than going through a packet protocol.

systemswizard 6 days ago | parent | prev | next [-]

Because our eyes work better than the cheap cameras Tesla uses?

lstodd 6 days ago | parent [-]

Problem is, the expensive cameras that Tesla doesn't use don't work either.

systemswizard 6 days ago | parent [-]

They cost $20-60 to make per camera, depending on the vehicle year and model. They also charge $3000 per camera to replace them…

MegaButts 6 days ago | parent | next [-]

I think his point was even if you bought insanely expensive cameras for tens of thousands of dollars, they would still be worse than the human eye.

terminalshort 5 days ago | parent | prev | next [-]

They charge $3000 for the hours of labor to take apart the car, pull the old camera out, put the new camera in, and put the car back together, not for the camera. You can argue that $3000 is excessive, but to compare it to the cost of the camera itself is dishonest.

dzhiurgis 6 days ago | parent | prev [-]

Fender camera is like $50 and requires 0 skill to replace. Next.

dreamcompiler 6 days ago | parent | prev | next [-]

Chimpanzees have binocular color vision with similar acuity to humans. Yet we don't let them drive taxis. Why?

ikekkdcjkfke 5 days ago | parent | next [-]

Chimpanzees are better than humans given a reward structure they understand. The next battlefield evolution is chimpanzees hooked up to intravenous cocaine modules, running around with .50 cals.

ndsipa_pomu 5 days ago | parent | prev | next [-]

There are laws about mistreating animals. Driving a taxi would surely count as inhumane torture.

insane_dreamer 5 days ago | parent | prev | next [-]

They can't understand how to react to what they see the way humans do.

It has to do with the processing of information and decision-making, not data capture.

Y_Y 5 days ago | parent | prev [-]

This is plainly untrue, see e.g. https://www.youtube.com/watch?v=sdXbf12AzIM

matthewdgreen 4 days ago | parent | prev | next [-]

I drove into the setting sun the other day and needed to shift the window shade and move my head carefully to avoid having the sun directly in my field of vision. I also had to run the wipers to clean off a thin film of dust that made my windshield difficult to see through. And then I still drove slowly and moved my head a bit to make sure I could see every obstacle. My Tesla doesn’t necessarily have the means to do all of these things for each of its cameras. Maybe they’ll figure that out.

rudolftheone 4 days ago | parent | prev | next [-]

Here's a good demonstration of why LIDAR SHOULD be implemented instead of what Tesla tries to sell: https://www.youtube.com/watch?v=IQJL3htsDyQ

zeknife 5 days ago | parent | prev | next [-]

I wouldn't trust a human to drive a car if they had perfect vision but were otherwise deaf, had no proprioception and were unable to walk out of their car to observe and interact with the world.

dotancohen 5 days ago | parent [-]

And yet deaf people regularly drive cars, as do blind-in-one-eye people, and I've never seen somebody leave their vehicle during active driving.

zeknife 5 days ago | parent | next [-]

I didn't mean that a human driver needs to leave their vehicle to drive safely, I mean that we understand the world because we live in it. No amount of machine learning can give autonomous vehicles a complete enough world model to deal with novel situations, because you need to actually leave the road and interact with the world directly in order to understand it at that level.

Y_Y 5 days ago | parent | prev [-]

> I've never seen somebody leave their vehicle during active driving.

Wake me up when the tech reaches Level 6: Ghost Ride the Whip [0].

[0] https://en.wikipedia.org/wiki/Ghost_riding

Waterluvian 6 days ago | parent | prev | next [-]

They can. One day. But nobody can just will it to be today.

rcpt 5 days ago | parent | prev | next [-]

We crash a lot.

insane_dreamer 5 days ago | parent [-]

That's (usually) because our reflexes are slow (compared to a computer's), or we are distracted by other things (talking, phone, tiredness, sights, etc.), not because we misinterpret what we see.

nkrisc 6 days ago | parent | prev [-]

Well these robots can’t.

dzhiurgis 6 days ago | parent | prev [-]

So the robotaxi trial that's happening already is some sort of rendering, AI slop, and the rides we see aren't real?