| ▲ | microtherion 11 hours ago |
| I'm quite skeptical of Tesla's reliability claims. But for exactly that reason, I welcome a company like Lemonade betting actual money on those claims. Either way, this is bound to generate some visibility into the actual accident rates. |
|
| ▲ | sfblah 9 hours ago | parent | next [-] |
| One thing that was unclear to me from the stats cited on the website is whether the quoted 52% reduction in crashes is when FSD is in use, or overall. This matters because people are much more likely to use FSD in situations where driving is easier. So, if the reduction is just during those times, I'm not even sure that would be better than a human driver. As an example, let's say most people use FSD on straight US Interstate driving, which is very easy. That could artificially make FSD seem safer than it really is. My prior on this is supervised FSD ought to be safer, so the 52% number kind of surprised me, however it's computed. I would have expected more like a 90-95% reduction in accidents. |
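A minimal sketch of that selection-effect worry, with entirely made-up crash rates and mileage mixes (none of these numbers come from Tesla or Lemonade): if FSD is engaged mostly on easy highway miles, the blended crash rate can come out roughly 50% lower even when FSD only matches a human driver within each road type.

```python
# Hypothetical illustration of the selection bias; all numbers are invented.
crash_rate = {            # crashes per million miles (assumed)
    "highway": 1.0,       # easy, low-risk driving
    "city": 4.0,          # harder, higher-risk driving
}

# Assume FSD merely matches a human driver within each road type,
# but is engaged for 90% of its miles on highways vs. 50% for manual driving.
manual_mix = {"highway": 0.5, "city": 0.5}   # share of manually driven miles
fsd_mix    = {"highway": 0.9, "city": 0.1}   # share of FSD-engaged miles

def blended(mix):
    """Crashes per million miles, averaged over the road-type mix."""
    return sum(crash_rate[road] * share for road, share in mix.items())

manual, fsd = blended(manual_mix), blended(fsd_mix)   # 2.5 vs. 1.3
print(f"apparent reduction: {1 - fsd / manual:.0%}")  # ~48%, with no per-road safety gain
```

Under those assumptions, a headline figure in the neighborhood of 50% is what the mileage mix alone would produce, which is why knowing how the number is conditioned matters.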
| |
| ▲ | wrsh07 8 hours ago | parent [-] | | I think this might be right, but it does two interesting things: 1) it lets Lemonade reward you for taking safer driving routes (or living in a safer area to drive, whatever that means) 2) it (for better or worse) encourages drivers to use it more. This will improve Tesla's training data but also might negatively impact the fsd safety record (an interesting experiment!) | | |
| ▲ | paulryanrogers 6 hours ago | parent [-] | | > ...but also might negatively impact the fsd safety record (an interesting experiment!) As a father of kids in a neighborhood with a lot of Teslas, how do I opt out of this experiment? | | |
| ▲ | pfannkuchen 3 hours ago | parent | next [-] | | Do your kids randomly run into the road? I was worried about that, but mine just don’t run into the road for some reason; they are quite careful about it, seemingly by default, after having “getting bumped into by a car” explained to them. I’m not sure if this is something people are just paranoid about because the consequences are so bad, or if some kids really do just run out into the road randomly. | |
| ▲ | almosthere 4 hours ago | parent | prev [-] | | https://www.zillow.com/homes/for_sale/?searchQueryState=%7B%... |
|
|
|
|
| ▲ | DaedalusII 5 hours ago | parent | prev | next [-] |
| The insurance industry is a commercial prediction market. It is often an indicator of true honesty, provided there is no government intervention. Governments intervene in insurance/risk markets when they do not like the truth. Several years ago I tried to arrange insurance for an obese western expatriate in an Asian country, and the (western) insurance company wrote back saying the client was morbidly obese, statistically likely to die within 10 years, and would need to lose x weight before the company would consider insuring them. |
| |
| ▲ | croddin 3 hours ago | parent [-] | | I could see prediction markets handling insurance in the future. It could probably get fairer prices, but it would have to be done right to avoid bad incentives. Interesting to think about how that might work. |
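One way to picture it, as a minimal sketch only (the contract, the numbers, and the loading factor are all invented for illustration, not anything Lemonade or an actual market does): a contract that pays out if a given driver files a claim this year has a market price that can be read as a claim probability and loaded into a premium.

```python
# Hypothetical sketch: turn a market-implied claim probability into a premium.
# The contract, numbers, and loading factor are invented for illustration.

def premium_from_market(implied_claim_prob: float,
                        expected_claim_cost: float,
                        expense_and_profit_load: float = 0.25) -> float:
    """Yearly premium from a market-implied annual claim probability.

    implied_claim_prob: price of a "this driver files a claim this year"
        contract, read as a probability (e.g. 0.05).
    expected_claim_cost: average payout if a claim does occur.
    expense_and_profit_load: share of the premium kept for expenses and profit.
    """
    expected_loss = implied_claim_prob * expected_claim_cost  # pure premium
    return expected_loss / (1 - expense_and_profit_load)      # gross premium

# e.g. a contract trading at 5 cents on the dollar and a $12,000 average claim
print(f"${premium_from_market(0.05, 12_000):,.0f} per year")  # about $800
```

The bad-incentive problem shows up immediately in a scheme like this: anyone who can influence whether the claim happens also has an edge in the market.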
|
|
| ▲ | JumpCrisscross 11 hours ago | parent | prev | next [-] |
| > quite skeptical of Tesla's reliability claims I'm sceptical of Robotaxi/Cybercab. I'm less sceptical that FSD, supervised, is safer than fully-manual control. |
| |
| ▲ | panopticon 9 hours ago | parent | next [-] | | Where I live isn't particularly challenging to drive (rural Washington), but I'm constantly disengaging FSD for doing silly and dangerous things. Most notably, my driveway meets the road at a blind Y intersection, and my Model 3 just blasts out into the road even though you cannot see cross traffic. FSD stresses me out. It's like I'm monitoring a teenager with their learner's permit. I can probably count the number of trips where I haven't had to take over on one hand. | | |
| ▲ | parpfish 7 hours ago | parent | next [-] | | > I'm constantly disengaging FSD for doing silly and dangerous things. You meant “I disable FSD because it does silly things.” I read it as “I disable FSD so I can do silly things.” | |
| ▲ | elif 8 hours ago | parent | prev | next [-] | | It's edging into the intersection to get a better view on the camera. It's further than you would normally pull out, but it will NOT pull into traffic. | |
| ▲ | panopticon 7 hours ago | parent | next [-] | | It's not edging; it enters the street at a consistent speed (usually >10mph) from my driveway. The area is heavily wooded, and I don't think it "sees" the cross direction until it's already in the road. Or perhaps the lack of signage or curb makes it think it has the right of way. My neighbor joked that I should install a stop sign at the end of my driveway to make it safer. |
| ▲ | seanmcdirmid 8 hours ago | parent | prev [-] | | The software probably has a better idea of the car’s dimensions than a human driver does, so it can get a better view of traffic by pulling out at just the right distance. |
| |
| ▲ | apearson 9 hours ago | parent | prev [-] | | Do you have HW3 or HW4? | | |
| ▲ | panopticon 8 hours ago | parent | next [-] | | HW3, unfortunately. Missed the HW4 refresh by a couple of months. | |
| ▲ | lotsofpulp 8 hours ago | parent | prev [-] | | The newest FSD on HW4 was very good in my opinion. Multiple 45min+ drives where I didn’t need to touch the controls. Still not paying $8k for it. Or $100 per month. Maybe $50 per month. |
|
| |
| ▲ | madsmith 11 hours ago | parent | prev | next [-] | | Having handed over control of my vehicles to FSD many times, I’ve yet to come away feeling that my vehicle was operating more safely for the general public than it would have under my own control. | |
| ▲ | Rover222 11 hours ago | parent | next [-] | | I think you greatly overestimate humans | | |
| ▲ | Retric 9 hours ago | parent | next [-] | | We aren’t talking about the average human here. On average you include sleep deprived people, driving way over the speed limit, at night, in bad weather, while drunk, and talking to someone. FSD is very likely situationally useful. But you can know most of those adverse conditions don’t apply when you engage FSD on a given trip. As such the standard needs to be extremely high to avoid increased risks when you’re sober, wide awake, the conditions are good, and you have no need to speed. | | |
| ▲ | izacus an hour ago | parent [-] | | > On average you include sleep deprived people, driving way over the speed limit, at night, in bad weather, while drunk, and talking to someone. FSD is very likely situationally useful. Are those people also able to supervise FSD like the law and Tesla expect them to? That's also a question. |
| |
| ▲ | 10 hours ago | parent | prev | next [-] | | [deleted] | |
| ▲ | ihaveajob 10 hours ago | parent | prev | next [-] | | The problem IMO is the transition period. A mostly safe system will make the driver feel at ease, but when an emergency occurs and the driver must take over, it's likely that they won't be paying full attention. | | | |
| ▲ | JumpCrisscross 9 hours ago | parent | prev [-] | | > you greatly overestimate humans Tesla's FSD still goes full-throttle dumbfuck from time to time. Like randomly deciding it wants to speed into an intersection even though the red light hasn't changed. Or swerving because of glare that you can't see, that a Toyota Corolla's radar could see through, but which hits the cameras and so fires up the orange cat it's simulating on its CPU. |
| |
| ▲ | smileysteve 7 hours ago | parent | prev [-] | | Keeping a 1-2 car-length stopping distance alone is likely worth over a 50% reduction in at-fault damages. |
| |
| ▲ | bayarearefugee 8 hours ago | parent | prev | next [-] | | > I'm less sceptical that FSD, supervised, is safer than fully-manual control. I'm very skeptical that the average human driver properly supervises FSD or any other "full" self driving system. | | |
| ▲ | microtherion 23 minutes ago | parent [-] | | Supervised FSD — automating 99.9% of driving and expecting drivers to be fully alert for the other .1% — appears to go against everything we know about human attention. |
| |
| ▲ | misiti3780 11 hours ago | parent | prev [-] | | this ^^ |
|
|
| ▲ | rubyfan 10 hours ago | parent | prev | next [-] |
| Lemonade will already have some actual claims data to support this, rather than relying on Tesla's word. |
|
| ▲ | sMarsIntruder 2 hours ago | parent | prev | next [-] |
| They aren’t betting money on “I’m quite skeptical because I hate the man”; they’re betting on actual data provided by the company. That’s the difference. |
| |
|
| ▲ | benatkin 6 hours ago | parent | prev [-] |
| > betting actual money on those claims Insurance companies can let marketing influence rates to some degree, with programs that tend to be tacked on after the initial rate is set. This self-driving car program sounds an awful lot like safe-driver programs such as GEICO Clean Driving Record, State Farm Good Driver Discount, Progressive Safe Driver, Progressive Snapshot, and Allstate Drivewise. The risk assessment seems to be less thorough than the general underwriting process and to fall within some sort of risk margin, so to me it seems gimmicky and not a true innovation at this point. |