Add to that the last bullet point from page 2 of the official preliminary report[1]:
> At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla's speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.
That, to me, is a strong indicator that Tesla's AP2 cannot recognize a crushed crash attenuator, and probably one of the strongest arguments that AP2 without LiDAR is unsafe and potentially fatal (never mind the lack of redundancy in sensors and brakes[2]).
It reads like he was following another car that was driving at a slower speed, but once that car left his path, his car began to accelerate to the preset speed of 75 mph.
It doesn't matter whether he was following another car or not. If there is an obstacle, AP2 should brake, and in this case it should have braked hard or used an evasive steering maneuver to avoid a head-on collision.
The problem is that it's entirely possible the obstacle detection is unfixable without swapping out the sensor suite Teslas ship with. Tesla is quite clearly expecting the problem to be solvable with that sensor suite alone (this is more or less the line I was given by a Tesla salesperson when I was shopping for a car), so it's not going to be on the table for them to recall every Tesla on the road to fit LIDAR to it, assuming that's at all feasible.
This happens with regular adaptive cruise control on my Ford when the cruise control set point is a lot higher than the current speed: the car accelerates after another car clears the lane, and it can ram into things.
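That behavior falls straight out of basic ACC target-speed logic. A minimal sketch (illustrative only, not Ford's or anyone's actual control loop; the names here are made up):

```python
from typing import Optional

# Simplified ACC target-speed selection (illustrative only).
def target_speed(set_speed: float, lead_speed: Optional[float]) -> float:
    if lead_speed is None:
        # No lead vehicle tracked: accelerate back toward the driver's
        # set point. A stationary obstacle often isn't classified as a
        # "lead vehicle" by radar-based ACC, which is the failure mode here.
        return set_speed
    # Otherwise match the lead car, capped at the set point.
    return min(set_speed, lead_speed)

print(target_speed(75.0, 62.0))  # lead car at 62 mph -> follow at 62.0
print(target_speed(75.0, None))  # lead car clears the lane -> 75.0
```

The dangerous part is the first branch: "no tracked vehicle ahead" and "clear road ahead" are not the same thing.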
Ford didn't call their adaptive cruise control "Autopilot"; they call it what it really is: "Adaptive Cruise Control with Stop-and-Go".[1] By contrast, Tesla touted their ACC as "Autopilot" with auto steering, and clearly in this case it didn't steer correctly and proved fatal (at least three times, by my memory).
To contrast that anecdote: this doesn't happen with my Subaru with EyeSight. It uses parallax for 3D distance; if it detects a solid object ahead (be it a vehicle, wall, barrier, or anything else), it will apply the emergency braking system (which may still result in a collision in the given circumstances, but at much less fatal speeds).
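For what it's worth, the parallax-to-distance math behind a stereo-camera system like EyeSight is simple to sketch (a toy pinhole-camera illustration; the focal length, baseline, and disparity figures below are made-up numbers, not Subaru's):

```python
# Depth from stereo parallax under the pinhole camera model:
# Z = f * B / d, where f is focal length (pixels), B is the distance
# between the two cameras (meters), and d is the pixel disparity.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

# Cameras 35 cm apart with a ~1400 px focal length: an object whose
# image shifts by 10 px between the two views is ~49 m ahead.
print(depth_from_disparity(1400.0, 0.35, 10.0))  # -> 49.0
```

The key property is that any textured solid object produces disparity, so the system doesn't need to classify it as a "vehicle" to know something is there.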
The industry has many highly effective AEB systems, and has had them for years now. I can't comment on Ford's because I don't know the specifics of its problems.
My Subaru with EyeSight accelerates hard when it is no longer tracking a car in front. This has happened once or twice in odd circumstances, but since I was paying attention to both the speed increase and the road ahead, I tapped the brakes to disengage.
My Camry Hybrid sometimes picks up cars that are in the lane next to me and a car length ahead, and matches speed with them. And sometimes it loses the car ahead of it on turns. I try to aim the top of the steering wheel at the car ahead of me, and this seems to help the sensor a great deal.
On turns it doesn't accelerate, only maintains speed, which can be frustrating if you want to act like a racecar driver. For example, say you're following someone around a freeway curve at 45 mph and the other driver changes out of your lane. Despite having adaptive cruise set to 60 mph (or higher), it will stay at 45 mph until after the curve and only accelerate after ~5 seconds of straight roadway.
It does sound the collision alert (sadly no autonomous braking here) very consistently when I'm accelerating or adaptive cruise is on, and there's a stationary object ahead.
> Despite having adaptive cruise set to 60 mph (or higher), it will stay at 45 mph until after the curve and only accelerate after ~5 seconds of straight roadway.
Like all adaptive cruise controls, it will accelerate if you have the max speed set higher, but it will not ram into things since it's on you to steer.
A counterargument: HUMAN DRIVERS are unsafe and fatal too (let alone lacking common sense and alertness). I 100% get your point, but my counterpoint is that this is what we are going to see in the future: autonomous systems that are by and large way safer than human drivers, but that, when they fail, might do so in circumstances where a human would not. Humans are good at some things; computers/AI/sensor suites are good at others. That is going to mean different kinds of failures, and we have to learn from them. The gigantic upside is the countless situations where the AI detects things a human would not, and overall better outcomes. (It's also worth noting that this incident was made far worse by the barrier. If the divider had been in normal condition, the driver would have lived, in my opinion.)
I get that in 5-10 years we MAY have SDVs that are superior to human drivers. But I think a lot of people have a mental model of SDVs _right now_ being superior to human drivers. I can't remember the crash stats, but I think the only way to know for sure is to calculate the number of miles a human generally drives, on average, before being involved in a collision or crash, etc.
Now, I don't have the numbers memorized, but in previous Tesla, Uber, and Google threads, only Google had reached a record comparable to human-level crash safety. Keep in mind, however, that they are still working on driving through snow, construction sites, and nonexistent (paint gone) lane lines, so the test areas are still quite specific.
I realize Tesla isn't advertising level 5 autonomy - I'm just responding to the above poster.
We don't have exact numbers for Tesla, but there are 2 confirmed fatalities in what is conservatively over a billion Autopilot miles. The average fatality rate for human drivers is somewhere around 1.25 deaths per 100 million miles. Those numbers are not directly comparable, given that Autopilot is not used during 100% of all driving miles and Tesla drivers are not average drivers. However, there is nothing here that screams "Teslas with Autopilot are a death trap," as some people in this thread are implying.
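Back-of-the-envelope, those figures compare like this (taking the comment's numbers at face value; both are rough estimates, not a controlled comparison):

```python
# Crude fatality-rate comparison using the numbers cited above.
autopilot_deaths = 2
autopilot_miles = 1_000_000_000      # "conservatively over a billion"
human_rate = 1.25                    # deaths per 100M miles, US average

autopilot_rate = autopilot_deaths / autopilot_miles * 100_000_000
print(f"Autopilot: {autopilot_rate:.2f} vs humans: {human_rate:.2f} "
      "deaths per 100M miles")       # -> 0.20 vs 1.25
```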
Autopilot will only be engaged on roads and under conditions that are favorable to autonomous driving, while fatal accidents tend to happen more under conditions that are unfavorable for human drivers. The non-Tesla numbers also include cars that are smaller, cheaper, and less sophisticated, and so much more likely to lead to fatal outcomes in crashes. So these numbers are heavily skewed in favor of Tesla. The proper comparison would be Waymo's performance, and they've been without a fatal crash as far as I know. As an alternative, you could pull the numbers for human drivers in cars of similar build on similar roads.
I don't think it's fair to compare an L2 system to an L5 system, though. My sense is that a very good ADAS (L2) system will reduce overall fatalities, while causing a small number of additional ones through human misuse of the system. This accident seems like the latter case.
I wish Tesla would update those numbers more frequently, but it looks like they stopped doing that in 2016. However, they published enough updates in 2016 to let us estimate a trend on the order of 1 million miles per day.
4/9/2016 - 47 million miles [1]
5/24/2016 - 100 million miles [2]
6/30/2016 - 130 million miles [3]
10/7/2016 - 222 million miles [4]
11/13/2016 - 300 million miles [5]
If this rate had continued linearly, it would put us at roughly 900 million today. But a linear accumulation of miles ignores that Tesla has roughly twice the cars on the road that it had in 2016, when those numbers were published. I have no idea what their current mileage total is, but I feel safe in saying it is over a billion miles.
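Spelling that extrapolation out (a quick linear fit over the five data points above; the "today" date is my assumption, pegged to around when the preliminary report landed):

```python
from datetime import date

# Cumulative Autopilot miles (millions) from Tesla's 2016 updates.
reports = [
    (date(2016, 4, 9), 47),
    (date(2016, 5, 24), 100),
    (date(2016, 6, 30), 130),
    (date(2016, 10, 7), 222),
    (date(2016, 11, 13), 300),
]

# Average accumulation rate across the reported span.
days = (reports[-1][0] - reports[0][0]).days
rate = (reports[-1][1] - reports[0][1]) / days          # ~1.16M miles/day
print(f"{rate:.2f} million miles/day")

# Extrapolate linearly to mid-2018.
today = date(2018, 6, 7)
estimate = reports[-1][1] + rate * (today - reports[-1][0]).days
print(f"~{estimate:.0f} million miles")                 # -> ~963 million
```

And since the fleet has roughly doubled since then, holding the rate constant understates the total, which is why "over a billion" is conservative.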
Keep in mind that a billion miles is a very small number in this context. People in the U.S. alone drive well over 3 trillion miles every year [1]. We simply don't have enough data yet to say that autonomous vehicles are safer.
I completely agree, and I should have put something about sample size in that comment. However, there are no red flags yet in the statistics, and I think that is important.
Think of it like flipping a coin. We decide to flip a coin 4 times and get 2 heads and 2 tails. That wouldn't be enough to draw any meaningful conclusions about whether the coin is fair or if it is weighted to one side, but I would feel a lot better about that result than if the coin ended up on the same side 4 times.
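To put rough numbers on that intuition (a toy calculation, nothing more):

```python
from math import comb

# Probability of exactly k heads in n flips of a fair coin.
def p_heads(k: int, n: int) -> float:
    return comb(n, k) / 2 ** n

print(p_heads(2, 4))                   # 2 heads, 2 tails: 0.375
print(p_heads(0, 4) + p_heads(4, 4))   # same side all 4 times: 0.125
```

Even a fair coin lands on the same side four times in a row one run in eight, so four flips tells you almost nothing either way.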
They're claiming that human+Autosteer+TACC is safer than a human. They're not claiming that Autopilot while you use your phone or watch a movie is safer than human driving. And they remind you of this every time you engage the feature.
Consider AEB. AEB is a safety improvement on pure human driving. But it'd be hugely foolish to rely 100% on AEB to do your braking for you. It's not designed for that -- it can help you in a lot of situations, but not all of them. The same is true for Autopilot.
If they were better than humans now (fully SDV), then everybody would be selling them, using them, etc. That wasn't my point. My point is that even when they become safer, maybe even many, many times safer, we are going to see some accidents that, you could argue, a human would have avoided. And it will be terrible for the people involved, there will be outcries, etc. But the sum will still be that many, many lives are saved.
I shouldn't have to keep saying this in every thread about Tesla autopilot, but I guess I do: the fact that there will probably one day be autonomous driving systems that are safer than human drivers does not imply that Tesla's current system is safer than human drivers.
I think it is safer. It's of course dependent on the situation, e.g. highway cruising. But I think it is. (Human drivers are not that high of a bar; I see crashes at least weekly on my stretch of highway.)
And they make contradictory arguments from statistics: "You don't have enough data to say it is dangerous!" while at the same time using the exact same statistics to argue that it is safer.
Either the statistics are half-baked or they're solid enough to draw conclusions from; you cannot have it both ways.
Not to mention repeatedly claiming that "it saves lives," when they're probably either talking about a completely separate collision-avoidance system (which has probably prevented several accidents, by definition, but it isn't special and it doesn't move the car on its own), or talking out of their asses, because untangling that many independent variables is the kind of thing you need a PhD statistician for, and they'd probably just tell you to take a hike.
[1]: https://www.ntsb.gov/investigations/AccidentReports/Reports/...
[2]: https://arstechnica.com/cars/2018/04/why-selling-full-self-d...