"Tesla under conditions where autopilot works now has around a 1/8.5 rate [which we’ll round up by 16% and call it ~1:10] of accident per mile driven as compared to all cars in all driving conditions."
Should it come as any kind of surprise when phrased that way?
Parking lots and snow-covered roads are high crash-rate per mile driven scenarios to which autopilot is (largely) not exposed.
It honestly sounds like he was on a meme-making streak and thought he'd throw one joke in there for the people who understand that his statement means nothing.
The tweet also mentions safety features other than Autopilot. Active safety features like pre-collision braking, and passive ones like 360-degree cameras and sensors, help a lot.
That’s not at all what it means. Tesla is comparing scenarios where autopilot actually works (the easy, low crash rate scenarios like highway driving) to driving everywhere (including high crash rate scenarios where autopilot specifically does not work) and saying “look how much safer it is!”. This is what we call bullshit statistics.
Comparing a luxury vehicle with ADAS features to the general NHTSA crash rates for all vehicles is also bullshit.
Autopilot can be used just about anywhere that the car can confidently figure out where the centre and edge of the road are. I'm not sure what usage is like on average, but I use it quite often in town as well, though the driving feels more explicitly collaborative when not on highways. I think it probably makes me a safer driver, certainly on highways and probably in town too.
If they only considered the parts where Autopilot is enabled, it's obviously biased since people are going to enable Autopilot only where it is most likely to work and not kill them, and in general where accidents are less likely overall.
If they instead considered drivers who never use Autopilot vs those who sometimes use Autopilot, it would make more sense, but then it might be biased in other ways due to self-selection.
The only reasonable way to do this test is to randomly deny the use of Autopilot to some drivers who would otherwise use it and compare accident rates between those two groups (this still only applies to people who would want to use Autopilot, but it would be unethical to force people to use it since it's dangerous).
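The selection effect described above can be made concrete with a toy calculation. All the numbers here are invented for illustration; the point is only that if Autopilot is engaged exclusively on easy roads, comparing its per-mile crash rate against an all-conditions baseline manufactures a safety multiple even when Autopilot is no safer than a human in the conditions where it runs:

```python
# Toy numbers (all invented): crashes per million miles in each regime.
easy_rate = 1.0       # highways etc., where Autopilot gets engaged
hard_rate = 9.0       # snow, parking lots, dense city, where it does not
easy_miles = 5.0      # millions of miles driven in each regime
hard_miles = 5.0

# Assume Autopilot is EXACTLY as safe as a human in the easy regime,
# and is simply never engaged in the hard one.
autopilot_rate = easy_rate

# The "all cars, all conditions" baseline averages over both regimes.
overall_rate = (easy_rate * easy_miles + hard_rate * hard_miles) / (easy_miles + hard_miles)

print(overall_rate / autopilot_rate)  # 5.0 -- a "5x safer" headline from selection alone
```

The randomized-denial design proposed above would eliminate exactly this confound, since both groups would drive the same mix of easy and hard miles.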
Ex-insurance company sales manager here. I agree that insurance premiums are a good indicator, but they will stay higher than for normal vehicles because of the way insurance works. Reinsurers buy risk in bulk, and pricing works by putting similar risks and assets in a basket and then comparing that against the data they have about the probability of loss events. Big data usually gives a pretty good indicator of the total number of accidents and incidents that will happen.
I have been away from the insurance sector for a couple of years now, but back then the electric car market was evaluated as its own class. This creates a huge problem because you can't distribute the claims optimally. If I simplify it, you get (electric car insurance claims / electric cars that have insurance) + operating costs + profit %. Whereas for normal cars, your premium is decided based on the driver's age, previous accidents, the car's age, the car's color, the horsepower, where the person lives, how much they drive, etc. At the company I worked for, we optimized the premium with over 200 data points.
You can't use the same optimization for electric cars because the market is so small and the data isn't there yet. Moreover, insurers increase premiums to be on the safe side.
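The simplified pool-level pricing described two comments up can be sketched as follows. The function name and all figures are invented for illustration; the structure just follows the (claims / insured cars) + operating costs + profit % formula given above:

```python
def simple_ev_premium(total_claims_paid, insured_ev_count, operating_cost, profit_margin):
    """Pool-level pricing: spread claims evenly over the pool, add costs and margin.

    With few EVs on the books there isn't enough data to segment by driver
    age, mileage, etc., so every EV owner gets the same base rate.
    """
    expected_claims_per_car = total_claims_paid / insured_ev_count
    base = expected_claims_per_car + operating_cost
    return base * (1 + profit_margin)

# Invented example: $6M in claims across 4,000 insured EVs,
# $300 per-policy operating cost, 10% profit margin.
premium = simple_ev_premium(6_000_000, 4_000, 300, 0.10)
print(round(premium, 2))  # 1980.0
```

Note how a single expensive claims year moves every policyholder's premium, since there is no segmentation to absorb it.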
Why can't you just calculate the premium based on 201 data points instead of 200, the 201st point being whether it's an electric car, a hybrid, or a regular car?
I think the "200 data point" line means "200 cars that crashed" and that there are too few electric cars around to do analysis on "electric cars with 30k miles driven, and owned by people without college degrees" or whatever- so they just look at "all electric cars" in one bucket.
Exactly, Tesla premiums are very high and I think the biggest tell is how much of a dud the "Tesla insurance" business has been.
They continue to be just a brokerage in a handful of states, so they don't even underwrite policies. If the margins were actually good, they would have gone national and begun underwriting policies themselves by now. Insurance is literally free money when you have a risk that competitors assess incorrectly.
Are they high compared to other cars in the same price range and similar cost to repair? Remember: insurance rates aren't just based on how often cars get into accidents; those that cost more to repair after an accident will have higher rates, too.
I don't understand his obsession with FSD; it's only icing on the cake. He got Tesla running fine, and SpaceX, yet he cannot manage to be patient on the self-driving front. Maybe some form of mania, the euphoria of success.
I think it is because traditional auto manufacturers are catching up with the electric car game. No other company will bet their reputation on self driving cars, because their reputation is much more important than their electric cars' success. They can wait longer and try different variations until their electric car market grows profitable. Tesla on the other hand has to differentiate and stay ahead of the curve so it doesn't get trampled. That's why he is so obsessed with self driving.
6 year Tesla owner here (Model S then Model X which is my current primary car). It is extremely rare for me to drive without AutoPilot; parking lots are pretty much my only AP-free zones. I suspect this is true of most Tesla owners; it certainly is true of the ones I talk to (friends and coworkers). From watching Autopilot, it seems to me that FSD is a long way away. HOWEVER, Autopilot is way more attentive than a human. It is not distracted by changing the radio station or AC, kids in the back seat yelling, or eating on the go. The 10x number is not surprising to me and seems plausible simply because Autopilot helps address deficiencies in humans.
> It is extremely rare for me to drive without AutoPilot; parking lots are pretty much my only AP-free zones.
Even if the average Tesla driver is like you and rarely turns it off, and even if the average Tesla driver is equivalent to the average driver and drives under equivalent conditions, the choice of when it is turned off can have a huge bias on the result.
Are you mostly driving around a suburb or a crowded city? In my experience it works well on highways and in suburbs, but not in big cities or on busy highways.
An alternate hypothesis is that auto pilot might be better at following cars in front of it than driving on open road... Definitely lots of confounding factors here.
I own two Tesla vehicles with FSD. One misconception among people without Tesla experience is that it is Human vs. Machine. Rather, it is Human AND Machine. In other words, we are not at the point where the car drives itself; the human must also be engaged all the time so each system backs up the other. The Tesla system does allow the human to switch focus for a couple of moments to change the music or read a text message, which in a normal car just means the human is distracted for a moment. The Tesla system is definitely not fully self-driving by any means. I have experienced numerous times where the car has reacted in strange ways or simply canceled self-driving and told me to take over immediately. As a result I don't trust it very much, and I pay as much attention when the car is driving as when I am driving myself. The vehicle also forces constant interaction as it drives to reinforce that point. I am not surprised that "human + machine" is better than human alone.
That may be a common misconception, but lots of the criticism is about the attention system not working all that well (they've improved this I think), and about the cartoonishness of advertising the feature as "Full Self Driving" without that being the case.
He explained the point. Just because the system can't drive me to work while I get an extra hour of sleep doesn't mean it can't prevent a car crash while I'm trying to find the right Steve to call in my contacts list.
The car is dumb, but always on. Humans are smart (well, relatively) but easily distracted. Each side complements the other.
The point, whatever it is, seems to result in a considerable improvement, counted as 10× by the report in question.
I didn't read the actual report. Are the changes even across types of accident? Specifically, does it count and report a 90% reduction in the number of pedestrians and bicyclist run over?
This is unconscionable. Are they really comparing Autopilot in Autopilot-able conditions (the easiest possible driving conditions) to all the rest of the driving, for which humans need to take over because Autopilot can't do it?
This is so misleading... it's huckstery. Such bad faith should totally kill everyone's trust in Tesla and Musk.
Interesting timing, as last night a Tesla crashed in my neck of the woods, bursting into flames and killing the 2 passengers. No one appeared to be driving, so it seemed they were stupidly relying on Autopilot.
This is the reason Musk exaggerates in the other direction. For some reason* there is a fixation in the media such that every Tesla accident is a story. You wouldn't know the last time a Ford crashed off the top of your head.
Per [1], there are 22.5 billion miles driven in Teslas. Per [2], there have been 171 fatalities in Teslas. This comes to 0.76 deaths per 100 million miles driven. Wiki says US deaths per 100 million miles is about 1.1.
Tesla drivers are wealthier, and they're driving new cars in safer places. So this comparison is not one to one. But evidence does not support the claim that Teslas are less safe than anything else.
> .. batteries kept reigniting. At one point, Herman said, deputies had to call Tesla to ask them how to put out the fire in the battery
FWIW this was what happened when there was a Tesla accident in my area. But for sure, these are only anecdotes. We have to compare the data with how often non-Tesla cars burn.
That article does not make it clear if there was actually no human in the driver's seat. It says two men died, and it says there was a person in the front passenger seat and a person in the back seat. That implies, by omission, that there was no person in the driver's seat. Was there a third person who did not die, a person sitting in the driver's seat?
Obviously if there was no human in the driver's seat, these guys were "looking for trouble". And failing to negotiate a curve at an excessively high speed isn't something the autopilot should be expected to deal with (unless it has precise location data and precise map data which informs the reasonable limits of performance... but I don't think it has these).
It's absurd if people want to blame Tesla for this.
The world, particularly the US, is full of over-promised marketing that borders on outright lying.
How many products and services on the market in the US use bold names and labels to suggest or even directly state a feature that in actuality doesn't come close to the expectation?
But on the autopilot topic, I would say it is you and the general public who have deemed "autopilot" to mean something that it historically does not mean. The most common use of autopilot is in aircraft, and in that domain it can be very crude and simple or highly automated. With the exception of the advanced autopilot systems in modern commercial jets, a typical autopilot merely holds a set heading, pitch, and altitude.
Even the FAA says this: "While the autopilot relieves you from manually manipulating the flight controls, you must maintain vigilance over the system to ensure that it performs the intended functions and the aircraft remains within acceptable parameters of altitudes, airspeeds, and airspace limits."
There are a lot of modern devices which can be very dangerous to use without some education. I would think someone willing to trust their car to drive them with no supervision is just dangerously foolish, and is possibly the same kind of person who might accidentally kill their buddy by flying a drone too fast and hitting him with it.
You can't dummy-proof everything. I doubt that the Tesla owner's manual says, "This autopilot system does absolutely everything. You can't crash it!" Instead I believe it tells you all the rules that you need to follow when using the autopilot. If you don't follow the instructions, that's on you.
Despite what Elon would have you think, Tesla doesn't have autonomous cars. The article points out that Tesla only logged 12 autonomous miles during the reporting period.
"Autopilot" and "Full Self Driving mode" are just dangerous marketing gimmicks that give people a false sense of what they actually do.
Driving conditions aside, they should also exclude cheap cars. Cheap cars, and those that are popular with young drivers, have higher accident rates.
I'm sure the S-Class has a lower accident rate than a VW Golf. But that doesn't mean anything.
Twitter talks about fighting bots, while every time you open one of Elon Musk's (or any high-profile account's) tweets you see lots of them. That definitely doesn't give users a good feeling about the platform, Twitter.
Some of the car commercials showing new assistance features make me laugh. The feature is often shown with special effects in the form of a futuristic radar or a grid reaching out. The driver is shown with a confident yet smug look. The next shot is some mundane everyday traffic incident, blown out of proportion as if it were going to be tragic. It ends with the happy driver, relieved that their inattention and incompetence as a driver were thwarted by technology preventing their death.
I think by this point, people have largely reached an equilibrium in terms of their individual judgement as to his credibility. That equilibrium appears to be bi-modal across the population, with each side seemingly unable to comprehend why the other group holds that opinion.