d_sem 5 days ago

My experience working at an automotive supplier suggests that Tesla engineers must have always known this, and that the real strategy was to provide the best ADAS experience with the cheapest sensor architecture. They certainly achieved that goal.

There were aspirations that the bottom-up approach would work with enough data, but as I learned about the kinds of long-tail cases we solved with radar/camera fusion, camera-only seemed categorically less safe.

Easy edge case: a self-driving system can't be allowed to become inoperable because of sunlight or fog.

A more HN-worthy consideration: calculate the angular pixel resolution required to accurately range and classify an object 100 meters away (roughly the distance needed to stop safely from 80 mph). Now add a second camera for stereo and calculate the camera-to-camera extrinsic sensitivity you'd need to stay within to keep error sufficiently low across all temperature/road-condition scenarios.
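To make that concrete, here's a back-of-envelope sketch. The numbers are purely illustrative assumptions (a ~50 degree FOV, 1280-pixel-wide camera and a 30 cm stereo baseline), not any particular vehicle's hardware; it just shows how few pixels a car covers at 100 m and how fast stereo range error grows with a small extrinsic yaw error:

    import math

    # assumed, illustrative parameters -- not any specific vehicle's hardware
    h_fov_deg   = 50.0    # horizontal field of view
    h_pixels    = 1280    # horizontal resolution
    baseline_m  = 0.30    # stereo baseline between the two cameras
    range_m     = 100.0   # target range (~stopping distance from 80 mph)
    car_width_m = 1.8     # typical vehicle width

    # pinhole model: focal length in pixels
    f_px = (h_pixels / 2) / math.tan(math.radians(h_fov_deg) / 2)

    # pixels across the car at 100 m (small-angle approximation)
    px_on_car = (car_width_m / range_m) * f_px

    # stereo disparity at 100 m and range error per pixel of disparity error
    disparity_px = f_px * baseline_m / range_m
    m_per_px     = range_m ** 2 / (f_px * baseline_m)   # d(range)/d(disparity)

    # a small relative yaw (toe-in) error between the cameras biases disparity by ~f * yaw
    yaw_err_rad = math.radians(0.05)                    # 0.05 deg of thermal/mounting drift
    bias_px     = f_px * yaw_err_rad

    print(f"focal length: {f_px:.0f} px")
    print(f"pixels across a car at {range_m:.0f} m: {px_on_car:.1f}")
    print(f"disparity at {range_m:.0f} m: {disparity_px:.2f} px")
    print(f"range error per pixel of disparity error: {m_per_px:.1f} m")
    print(f"0.05 deg yaw error -> {bias_px:.2f} px bias -> ~{bias_px * m_per_px:.0f} m range error")

With these made-up but plausible numbers, the disparity at 100 m is only about 4 pixels, and a twentieth of a degree of relative yaw between the cameras already produces tens of meters of range error.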

The answer is: screw that, I should just add a long-range radar.

There are just so many considerations showing you need a multi-modality solution, and using human biology as a what-about-ism doesn't translate to currently available technology.

brandonagr2 4 days ago

Tesla does not use stereo/binocular vision, and that's not how humans perceive relative motion at that distance either; we depend on perspective and parallax.
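For contrast, a minimal sketch of the perspective cue this refers to: ranging a known-width object from a single camera. The focal length and car width are assumed values carried over from the earlier sketch, not anything from Tesla's actual pipeline:

    # assumed values, matching the earlier sketch -- not Tesla's actual pipeline
    f_px        = 1372.0   # focal length in pixels
    car_width_m = 1.8      # assumed true width of the lead vehicle
    w_px        = 24.7     # measured width of that vehicle in the image

    z     = f_px * car_width_m / w_px                 # pinhole: Z = f * W / w
    z_err = f_px * car_width_m / (w_px - 1.0) - z     # effect of a 1 px measurement error
    print(f"estimated range: {z:.1f} m, about +{z_err:.1f} m if the width is off by one pixel")

Under these assumptions this is less sensitive to calibration drift than the stereo case, but it hinges on knowing (or guessing) the object's true width.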