| ▲ | DarmokJalad1701 9 hours ago |
| I use it every single day and it is amazing! I have had my car for 6+ years now and it has only gotten better. Starting with "Navigate on Autopilot," which was pretty janky when I got the car in 2019, it has improved steadily (with the rare regression in some cases). As of the last year or so, I don't even have to touch the steering wheel anymore! |
|
| ▲ | para_parolu 9 hours ago | parent | next [-] |
| It is a miracle of consumer technology, but it does make mistakes. I have put 40k miles on it in the last 2 years, mostly using FSD (since v13). My anecdotal experience is that it drives itself well until it doesn't. So far I have been lucky enough to be paying attention when it was critical, which is hard to do because the car does well most of the time. |
|
| ▲ | Keyframe 9 hours ago | parent | prev | next [-] |
| > I don't even have to touch the steering wheel anymore! Is that legal where you live? |
| |
| ▲ | DarmokJalad1701 9 hours ago | parent | next [-] | | It is legal. I still have to pay attention to the road. There is attention monitoring using the in-cabin camera which falls back to steering wheel torque detection if the cabin illumination is too low. | |
| ▲ | Workaccount2 9 hours ago | parent | prev | next [-] | | On Ford's and, I believe, Chevy's systems you can let go of the wheel; however, those systems only work on pre-mapped roads, and those roads are 99% interstates and highways. | |
| ▲ | hgomersall 9 hours ago | parent | prev [-] | | Even if it's legal, it's pretty stupid. |
|
|
| ▲ | xnx 8 hours ago | parent | prev | next [-] |
| > I don't even have to touch the steering wheel anymore! What's the usefulness of this if you still have to pay attention at all times? |
| |
| ▲ | jiriro 8 hours ago | parent [-] | | The level of engagement while supervising FSD is about 10% of the engagement when I'm the one driving. Nobody can be told what FSD is. You have to see it for yourself. |
|
|
| ▲ | ajross 9 hours ago | parent | prev | next [-] |
| Ditto. The volume of knee-jerk hatred these posts engender here just collapses in the face of the actual system's capabilities. Like, it's OK to shout and scream about LIDAR and supervision and disengagement and all. But... it still drives itself! Really well. |
| |
| ▲ | Workaccount2 9 hours ago | parent [-] | | The problem is that if it drives well for 30,000 miles (unsupervised) on residential roads before steamrolling little Billy on his bike, you will get a deluge of people who swear the system is excellent. But when you incorporate that tech into a fleet doing 100k residential miles a week with no supervisor, you're mowing down 12 kids a month. | | |
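The arithmetic behind that claim can be sketched quickly. Both figures (one incident per 30,000 unsupervised miles, and a 100k-miles-per-week fleet) are the commenter's hypotheticals, not real data:

```python
# Hypothetical figures from the comment above, not real data.
incident_every_miles = 30_000    # assumed: one serious incident per 30k unsupervised miles
fleet_miles_per_week = 100_000   # assumed: hypothetical robotaxi fleet mileage
weeks_per_month = 52 / 12

# Expected incidents per month = monthly miles / miles per incident
expected_per_month = fleet_miles_per_week * weeks_per_month / incident_every_miles
print(f"{expected_per_month:.1f} expected incidents per month")  # ~14.4
```

So the "12 a month" figure is roughly consistent with the stated per-mile rate.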
| ▲ | ajross 9 hours ago | parent [-] | | How many kids has it steamrollered? Obviously not 12 a month! Seems like this is an argument to be had with real numbers, no? | | |
| ▲ | Workaccount2 8 hours ago | parent [-] | | Zero, because there isn't, and hasn't been, a single unsupervised Tesla on the road. There are no real numbers, because there are no real self-driving Teslas. | | |
| ▲ | ajross 8 hours ago | parent [-] | | So... that was exactly the point upthread. You're making a semantic argument over the proper definition for the word "real" when applied to autonomous vehicle systems. Nothing in this argument is actionable in any way. You can't conjure real dead kids, so you need to describe hypothetical ones. That's... yeah. Nonetheless, our cars drive us around anyway. Neither they, nor us, actually care about hypothetical steamrollered kids. | | |
| ▲ | Workaccount2 8 hours ago | parent [-] | | The argument is that Tesla needs to be doing hundreds of thousands of miles without intervention to be trusted for robotaxis. Most people using FSD don't come close to the mileage needed to get a sense of the safety level required. If a Tesla robotaxi kills a kid, Tesla is done, and there won't be any coming back. So Tesla actually needs millions of miles without critical intervention before they can confidently let these things out en masse on the streets. A whole Tesla fanboy meetup collectively will not have enough FSD miles to see something like that, but a robotaxi fleet will encounter it within a year. | | |
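One way to put a number on "millions of miles without critical intervention": the statistical rule of three says that after N event-free trials you can be ~95% confident the true event rate is below 3/N. The safety target below is an assumption for illustration, not a figure from the thread:

```python
# Assumed safety target for illustration: at most one critical event per 1M miles.
target_rate = 1 / 1_000_000

# Rule of three: with zero events observed over n miles, the approximate 95%
# upper confidence bound on the per-mile event rate is 3 / n.
# To demonstrate the rate is below target_rate, you need n >= 3 / target_rate.
miles_needed = 3 / target_rate
print(f"{miles_needed:,.0f} incident-free miles needed")  # 3,000,000
```

That is, even a modest one-per-million-miles target requires on the order of three million intervention-free miles to establish statistically, which individual owners will never accumulate but a fleet will.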
| ▲ | spankalee 7 hours ago | parent | next [-] | | I don't think Tesla will be done when they kill a kid. FSD already has killed people: https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe... | |
| ▲ | ajross 5 hours ago | parent | prev [-] | | > Tesla actually needs millions of miles without critical intervention So... agreed. I think that sounds like it's in the right ballpark. Here's the thing though: your whole argument (sort of a para-freakout, really) hinges on this evidence not existing. It's true that citing numbers from supervised cars isn't the same thing. It's not true to argue that it not being the same thing is (0) not at least a somewhat reasonable proxy for the evidence you want to see, (1) evidence that it doesn't or can't exist, (2) evidence for the opposite case (you seem to be claiming that the fact that it's supervised means that it must be), and in particular (3) evidence for suppression of contrary evidence, as some of your more conspiracy-leaning comments seem to imply. Isn't the Occam's explanation here that, yeah, the car looks pretty damn safe as shown by billions of miles of travel? Why must you be going to the mattresses to argue against something that seems pretty common sense to me? | | |
| ▲ | judahmeek an hour ago | parent [-] | | Occam's Razor is a principle for comparing explanations, not for making predictions. > It's not true to argue that it not being the same thing is (0) not at least a somewhat reasonable proxy for the evidence you want to see, (1) evidence that it doesn't or can't exist, (2) evidence for the opposite case (you seem to be claiming that the fact that it's supervised means that it must be), and in particular (3) evidence for suppression of contrary evidence, as some of your more conspiracy-leaning comments seem to imply. Actually, (0) is true. Numbers from supervised FSD are not a reasonable proxy for unsupervised FSD, especially if accidents may be occurring immediately after FSD disengages. (1) is also true, because if the evidence did exist, then Tesla would already have publicized it. (2) is also true, because if Tesla thought they could employ unsupervised cars like Waymo, then they would already have done that. As for (3), if Tesla had a reputation for transparency & honesty, then they could have provided additional data on accidents to show that accidents are not occurring right after FSD disengages at significant rates. |
|
|
|
|
|
|
|
|
| ▲ | 303uru 9 hours ago | parent | prev | next [-] |
| What kind of setting do you live in? I live in the outskirts of Denver (Cherry Creek) and commute downtown and I have to intervene all the time. |
| |
| ▲ | doph 9 hours ago | parent | next [-] | | I'm in Los Angeles, which can be a challenging place to drive. Each time they give me a free trial of FSD for a month, I enable it and test it with excitement and optimism. Each time I only use it for a day or two before it does something dangerous enough to scare me. | |
| ▲ | DarmokJalad1701 8 hours ago | parent | prev [-] | | Los Angeles suburb - my commute is either highways or city streets depending on the traffic. It works really well here. |
|
|
| ▲ | tomrod 9 hours ago | parent | prev [-] |
| Yeah, I'll be honest, that sounds awful. I like public transit and I like driving myself. Hybridizing with me in the driver's seat but not touching the wheel sounds annoying and tiresome. |
| |
| ▲ | doph 9 hours ago | parent | next [-] | | I share this opinion. I've used FSD quite a bit during the free trial periods and each time come away with the sense that it's like driving with a newly licensed teenager at the wheel. If I have to be as alert and ready to avoid an accident as when I'm in command of the car, then this offers no improvement to the experience, just an added layer of stress trying to anticipate the actions of yet another actor in the environment. | |
| ▲ | DarmokJalad1701 8 hours ago | parent | prev [-] | | My experience has been different. I find that not needing to have hyper-focus for extended periods of time with constant micro-adjustments has a big effect on fatigue - especially on long trips. Not needing to touch the wheel while using gaze-detection for attention tracking just reduces the annoyance IMO. I find it very similar to operating an airplane with a reliable autopilot. The GFC-700 is super good at what it does. But it is still on me to monitor what it is doing, while at the same time significantly reducing my workload. |
|