| ▲ | mkl 6 days ago |
| Waymo is very restricted in the locations it drives (limited parts of a limited set of cities, and I think still no freeways), and uses remote operators to make decisions in unusual situations and when it gets stuck. This article from last year has quite a bit of information: https://arstechnica.com/cars/2024/05/on-self-driving-waymo-i... |
|
| ▲ | panarky 6 days ago | parent | next [-] |
| Waymo never allows a remote human to drive the car. If it gets stuck, a remote operator can assess the situation and tell the car where it should go, but all driving is always handled locally by the onboard system in the vehicle. Interesting that Waymo now operates just fine in SF fog, and is expanding to Seattle (rain) and Denver (snow and ice). |
| |
| ▲ | epcoa 6 days ago | parent [-] | | The person you're replying to never claimed otherwise. However, while decision support is not directly steering and accelerating/braking the car, I am just going to assert that it is still driving the car, at least in the way that actually matters in this discussion.
And the best estimate is that these interventions are "uncommon", on the order of one per tens of thousands of miles, but that isn't rare. A system that requires a "higher level" handler is not full self driving. | | |
| ▲ | ascorbic 5 days ago | parent | next [-] | | I think the important part is that the remote person doesn't need to be constantly alert and make real-time decisions within seconds. As I understand it, the remote driver is usually making decisions with the car stationary. I'd imagine that any future FSD car with no steering wheel would probably have a screen for the driver to make those kinds of decisions. | |
| ▲ | AlotOfReading 6 days ago | parent | prev | next [-] | | There's a simple test I find useful to determine who's driving: If the vehicle has a collision, who's ultimately responsible? That person (or computer) is the driver. If a Waymo hits a pole for example, the software has a bug. It wasn't the responsibility of a remote assistant to monitor the environment in real time and prevent the accident, so we call the computer the driver. If we put a safety driver in the seat and run the same software that hits the same pole, it was the human who didn't meet their responsibility to prevent the accident. Therefore, they're the driver. | |
| ▲ | panarky 6 days ago | parent | prev | next [-] | | Agreed! Which is why an autonomous car company that is responsible and prioritizes safety would never call their SAE Level 4 vehicle "full self-driving". And that's why it's so irresponsible and dangerous for Tesla to continue using that marketing hype term for their SAE Level 2 system. | |
| ▲ | standardUser 6 days ago | parent | prev [-] | | In that case, it sounds like "full self driving" is more of an academic concept that is probably past its due date. Waymo and Apollo Go are determining what the actual requirements are for an ultra-low-labor automated taxi service by running them successfully. |
|
|
|
| ▲ | phire 6 days ago | parent | prev | next [-] |
| Geofencing and occasional human override meet the definition of "Level 4 self driving", especially when it's a remote human override. But is Level 4 enough to count as "Full Self Driving"? I'd argue it really depends on how big the geofenced area is and how rare interventions are. A car that can drive on 95% of public roads might as well be FSD from the perspective of the average driver, even if it falls short of being Level 5 (which requires zero geofencing and zero human intervention). |
|
| ▲ | zer00eyz 6 days ago | parent | prev | next [-] |
| Waymo has been testing freeway driving for a bit: https://www.reddit.com/r/waymo/comments/1gsv4d7/waymo_spotte... > and uses remote operators to make decisions in unusual situations and when it gets stuck. This is why it's limited in the markets and areas it serves: connectivity for this sort of thing matters. Your robotaxi crashing because the human backup lost 5G connectivity is going to be a really, really bad look. No one is talking about their intervention stats. If they were good, I would assume someone would publish them for marketing reasons. |
| |
| ▲ | decimalenough 6 days ago | parent | next [-] | | > Your robotaxi crashing cause the human backup lost 5g connectivity is gonna be a real real bad look. Waymo navigates autonomously 100% of the time. The human backup's role is limited to selecting the best option if the car has stopped due to an obstacle it's not sure how to navigate. | |
| ▲ | refulgentis 6 days ago | parent | prev [-] | | > No one is talking about their intervention stats. "Intervention" is a term of art, i.e. it has a specific technical meaning in self-driving: a human taking timely action to prevent a bad outcome the system was creating, not taking action to get unstuck. > If they were good, I would assume someone would publish them for marketing reasons. I think there's an interesting lens to look at this through: remote interventions are massively disruptive; the car goes into a specific mode and support calls in to check on the passenger. It's baked into the UX judgement, so it's not really something a specific number would shed more light on. If there were a significant problem with this, it would be well known given the scale they operate at now. |
|
|
| ▲ | FireBeyond 5 days ago | parent | prev | next [-] |
| > I think no freeways still California granted Waymo the right to operate on highways and freeways in March 2024. |
|
| ▲ | standardUser 6 days ago | parent | prev [-] |
| All cars were once restricted in the locations they could drive. EVs are restricted today. I don't see why universal access is a requirement for a commercially viable autonomous taxi service, which is what Waymo currently is. And the need for human operators seems obvious for any business, no matter how autonomous, let alone one operating in a cutting-edge and frankly dangerous space. |
| |
| ▲ | shadowgovt 6 days ago | parent | next [-] | | It's a matter of definition, in terms of how these things are counted. L4 is "full autonomy, but in a constrained environment."
L5 is the holy grail: as good as or better than a human in every environment a human could take a car (or, depending on who's doing the defining, every road a human could take a car on; most people don't say L5 and mean "full Canyonero"). | | |
| ▲ | yencabulator 5 days ago | parent | next [-] | | > or, depending on who's doing the defining, every road a human could take a car on. That's a distinction without a difference. Forest service and BLM roads are "roads" but can be completely impassable or 100% erased by nature (and I say this as a former Jeep Wrangler owner); they aren't always located where a map thinks they are, and sometimes absolutely nothing differentiates them from the surrounding nature -- for example, a left turn into a desert dry wash can be a "road" while a right turn is not. Actual "full" autonomous driving is crazy hard. By definition you get into territory where some vehicles and some drivers just can't make it through, but it's still a road (/"environment"). And some people will live at the end of those roads. | |
| ▲ | standardUser 5 days ago | parent | prev [-] | | These definitions appear to be largely academic and now outdated. |
| |
| ▲ | pavel_lishin 6 days ago | parent | prev [-] | | > EVs are restricted today. Are they? Did you mean Autonomous Vehicles? | | |
| ▲ | standardUser 6 days ago | parent [-] | | No, you can't go driving off into an area with no charging options, which would be much of the world. | | |
| ▲ | yencabulator 5 days ago | parent [-] | | Did you know that a gas car can also run out of gas? | | |
| ▲ | standardUser 5 days ago | parent [-] | | Yes, and before gas stations were widespread you couldn't drive gas cars anywhere you wanted either, dummy. |
|
|
|
|