senordevnyc 6 hours ago

Yeah, but your "cameras" also have a bunch of capabilities that hardware cameras don't, plus they're mounted on a flexible stalk in the cockpit that can move in any direction to update the view in real-time.

Also, humans kinda suck at driving. I suspect that in the endgame, even if AI can drive with cameras only, we won't want it to. If we could upgrade our eyeballs and brains to have real-time 3D depth mapping information as well as the visual streams, we would.

ACCount37 5 hours ago

What "a bunch of capabilities"?

A complete inability to get true 360° coverage, so the neck has to swivel wildly across windows and mirrors to somewhat compensate? Being able to get high FoV or high resolution, but never both? An IPD so low that stereo depth estimation unravels beyond 5 m, which, in self-driving terms, is point-blank range?
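
The falloff is quadratic, and easy to sketch. A back-of-envelope in Python, assuming a ~65 mm IPD and ~1 arcminute of usable disparity resolution (ballpark figures, not measurements):

    # Stereo depth uncertainty grows as Z^2 / baseline (small-angle approx).
    # Assumed values: human IPD ~65 mm, ~1 arcmin disparity resolution.
    import math

    IPD = 0.065                     # baseline between the eyes, meters
    d_theta = math.radians(1 / 60)  # ~1 arcmin disparity error, radians

    for z in [2, 5, 10, 20, 50]:    # target distance, meters
        dz = z**2 * d_theta / IPD   # depth uncertainty, meters
        print(f"{z:>3} m -> +/- {dz:.2f} m")

That comes out to roughly ±0.1 m at 5 m but ±11 m at 50 m. Depth from human stereo is useless at driving distances.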

Human vision is a mediocre sensor kit, and the data it gets has to be salvaged in post. The human brain was doing computational photography before it was cool.

Edman274 4 hours ago

What do you believe the frame rate and resolution of Tesla's cameras are? If a human can tell the difference between two virtual reality displays, one with a frame rate of 36 Hz and a per-eye resolution of 1448x1876, and another with numerically greater values, then the cameras Tesla uses for self-driving are inferior to human eyes. The human eye typically resolves the equivalent of 5 to 15 megapixels in the fovea, and the current, highest-definition automotive cameras Tesla uses just about clear 5 megapixels across the entire field of view. By your own criterion, the cameras Tesla uses today are never high resolution. I can physically saccade my eyes by a millimeter here or there and see something their cameras would never be able to resolve.
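
To make the gap concrete, here's a rough pixels-per-degree comparison. The sensor resolution and field of view below are illustrative assumptions, not published Tesla specs:

    # Angular resolution, in pixels per degree.
    # Assumptions (illustrative): a ~5 MP sensor at 2896x1876 spread over a
    # 120-degree horizontal FoV; foveal acuity of ~1 arcmin, i.e. ~60 px/deg.
    CAM_H_PIXELS = 2896
    CAM_H_FOV_DEG = 120
    FOVEA_PPD = 60  # human foveal limit

    cam_ppd = CAM_H_PIXELS / CAM_H_FOV_DEG
    print(f"camera: {cam_ppd:.0f} px/deg vs fovea: {FOVEA_PPD} px/deg")
    # -> camera: 24 px/deg vs fovea: 60 px/deg

Even on generous assumptions, the camera resolves well under half of what the fovea does in its sweet spot.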

ACCount37 4 hours ago

Yep, Tesla's approach is 4% "let's build a better sensor system than what humans have" and 96% "let's salvage it in post".

They didn't go for the easy problem, that's for sure. I respect the grind.

Edman274 2 hours ago

I can't figure out your position, then. You were saying that human eyes suck and are inferior to camera sensors because human eyes require interpretation by a human brain. You're also saying that if self-driving isn't possible with camera sensors alone, then no amount of extra sensors will make up for the deficiency.

This came from a side conversation where one person noted that driving is possible with only human eyes, another said that human eyes are superior to cameras, and you disagreed. Then, when told that the only company approaching self-driving with cameras alone uses cameras with worse spatial resolution and worse temporal resolution than human eyes, you said you respect the grind because the cameras require processing by a computer.

If I understand correctly, you believe:

1. Driving should be possible with vision alone, because human eyes can do it, human eyes are inferior to camera sensors, and both require post-processing, so with superior sensors it must obviously be possible.

2. Even if one knows that current automotive camera sensors are not actually superior to human eyes and also require post-processing, that just means camera-only approaches are the only way forward, and you "respect the grind" of the single company trying to make it work.

Is that correct? Okay, maybe that's understandable, but I'm confused, because 1 and 2 contradict each other. Help me out here.

ACCount37 an hour ago

My position is: sensors aren't the blocker, AI is the blocker.

Tesla put together a sensor suite that's amenable to AI techniques and gives them good enough performance. Then they moved on to getting better FSD hardware and rolling out newer versions of AI models.

Tesla gets it. They located the hard problem and put themselves on the hard problem. LIDAR wankers don't get it. They point at the easy problem and say "THIS IS WHY TESLA IS BAD, SEE?"

Outperforming humans in the sensing department hasn't been "hard" for over a decade now. You can play with sensors all day long and watch real-world driving performance vary by no more than measurement error, because sensors were never where the issue was.