Hmmm, perhaps a more valuable representation would be where the average Waymo vehicle would place, as a percentile ranking among human drivers, in accidents per mile.
Ex: "X% of humans do better than Waymo does in accidents per mile."
That would give us an intuition for what portion of humans ought to let the machine do the work.
P.S.: On the flip side, it would not tell us how often those people drove. For example, if the Y% of worse drivers happen to be people who barely ever drive in the first place, then helping automate that away wouldn't be as valuable. In contrast, if they were the ones who did the most driving...
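A quick sketch of how that percentile could be computed, given per-driver accident and mileage records (everything below is invented for illustration; Waymo's actual rate and real per-driver records would have to come from insurers or DMV data). The mileage-weighted variant addresses the P.S. directly:

    # All data here is made up for illustration.
    drivers = [
        # (accidents over the period, miles driven over the period)
        (0, 12_000),
        (1, 8_000),
        (0, 500),     # someone who barely drives
        (2, 30_000),
        (1, 15_000),
    ]
    waymo_rate = 1 / 50_000  # hypothetical: one accident per 50k miles

    rates = [a / m for a, m in drivers]

    # Unweighted: what share of *drivers* beat Waymo?
    better = sum(r < waymo_rate for r in rates)
    print(f"{100 * better / len(rates):.0f}% of drivers do better than Waymo")

    # Mileage-weighted: what share of *miles* is driven by people who beat
    # Waymo? A bad driver who barely drives contributes little here.
    miles_better = sum(m for (_, m), r in zip(drivers, rates) if r < waymo_rate)
    total_miles = sum(m for _, m in drivers)
    print(f"{100 * miles_better / total_miles:.0f}% of miles are driven by people who beat Waymo")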
If only it were easier to get the stats as "damage to property/lives, in dollars per mile driven"; that would let us kinda-combine both big tragic events and fender-benders.
(Yeah, I know it means putting an actuarial cost on a human life, but statistics means mathing things up.)
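As a sketch, that combined stat is just a weighted sum. All the counts below are placeholders, and the life/injury dollar values are rough stand-ins for the actuarial figures (the US DOT publishes a "value of a statistical life" in roughly this range):

    # All counts and dollar values below are placeholders.
    VALUE_OF_LIFE = 12_000_000  # rough order of US DOT's value of a statistical life
    COST_PER_INJURY = 200_000   # placeholder actuarial figure

    miles = 1_000_000_000
    fatalities, injuries, property_damage_usd = 8, 400, 30_000_000

    total_cost = (fatalities * VALUE_OF_LIFE
                  + injuries * COST_PER_INJURY
                  + property_damage_usd)
    print(f"${total_cost / miles:.3f} of damage per mile driven")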
Putting aside Waymo specifically for a second (which I believe is the leader in the space, but which also operates its own fleet of custom cars):
If the current state of commercially available ADAS were dramatically reducing accident rates, then Teslas etc. would have lower insurance rates. And yet they instead have higher insurance rates.
AFAIK, it's due to things like single-frame construction and expensive + backlogged parts which you order directly from Tesla (as opposed to, e.g., a drivetrain that may be made for 3 separate manufacturers).
Or, when you do have an accident, it's typically more expensive to repair.
I think my car insurance policy actually does detail what they believe every part of your body + your life to be worth; it might be my old policy, though. From memory, an arm was £2,000.
[Edit]: found the policy:
death: £2,500
arm or leg: £2,000
blindness in one or both eyes: £2,000
As my father quipped to me when I was younger: 'You know the best thing about a three-legged dog? It's not sad about the limb it's missing: it's happy for the three it still has.'
I don't think that's possible. I don't think this is a "corporate greed, nobody wants to end the gravy train by starting a price war" situation. I think it's a "the myriad of stuff you have to do to run a compliant company sets the price floor" situation. The fact that there is no nuclear "well, I guess I just can't afford insurance; if I lose my house, so be it" option available to customers prevents it all from caving in.
Perhaps the best way to address this would be to look at property damage for car-car or car-object collisions, and a separate stat for car-pedestrian accidents.
In collisions that don't involve pedestrians, the damage to the car/object is generally proportional to the chance that someone was badly injured or killed. The only thing you get by adding human life costs is to take into account the quality of the safety features of the cars being driven, which should be irrelevant for any comparison with automated driving. In collisions that do involve pedestrians, this breaks down, since you can easily kill someone with almost zero damage to the car.
So having these two stats per mile driven to compare would probably give you the best chance of a less biased comparison.
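A minimal sketch of what those two stats would look like, computed from crash records (the record shape and all numbers are invented for illustration):

    # Invented records, just to show the shape of the two stats.
    crashes = [
        {"pedestrian": False, "damage_usd": 4_000},
        {"pedestrian": False, "damage_usd": 12_000},
        {"pedestrian": True,  "damage_usd": 300},  # near-zero car damage, still serious
    ]
    miles = 5_000_000

    non_ped_damage = sum(c["damage_usd"] for c in crashes if not c["pedestrian"])
    ped_crashes = sum(1 for c in crashes if c["pedestrian"])

    print(f"${non_ped_damage / miles:.4f} property damage per mile (excl. pedestrians)")
    print(f"{ped_crashes / miles * 1e6:.2f} pedestrian collisions per million miles")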
It may be fairer to compare them to Uber drivers and taxis. On that comparison, at least, having ridden in thousands of Ubers and taxis and a couple dozen Waymos, Waymo is better than 100% of them.
Anecdotal, of course, but within my circle people are going Waymo-first over other options, almost entirely because of the better experience and perceived better driving. And parents in my circle also trust that a Waymo won't mow them down in a crosswalk. Which is more than you can say for many drivers in SF.
No, tailgating would be a significant cause of the crash.
A driver -- legally, logically, practically -- should always maintain a safe following distance from the vehicle in front of them so that they can stop safely. It doesn't matter if the vehicle in front of them suddenly slams on the brakes because a child or plastic bag jumped in front of them, because they suddenly realized they need to make a left turn, or because they mixed up the pedals.
Oh, I fully agree -- like I said, legally they're not at fault, because you'd more or less have to be tailgating and/or inattentive to crash into them just for braking unexpectedly.
But if there's an existing system and culture of driving that has certain expectations built up over a century+ of collective behavior, and then you drop into that culture a new element that systematically brakes more suddenly and unexpectedly, regardless of whether the human drivers were doing the right thing beforehand, it is both reasonable and accurate to say that the introduction of the self-driving cars contributed significantly to the increase in crashes.
If they become ubiquitous, and retain this pattern, then over time, drivers will learn it. But it will take years -- probably decades -- and cause increased crashes due to this pattern during that time (assuming, again, that the pattern itself remains).
Tailgating causes a great number of accidents today, no autonomous cars needed.
While tailgating is a tiny slice of fatal collisions -- something like 2% -- it accounts for like 1/3 of non-fatal collisions.
We're already basically at Peak Tailgating Collisions, without self-driving cars, and I'd happily put a tenner on rear-end collisions going down with self-driving cars because, even if they stop suddenly more often, at least they don't tailgate.
And it's entirely self-inflicted! You can just not tailgate; it's not even like tailgating lets you go faster, it just lets you go the exact same speed 200 feet down the road.
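For scale, here's a back-of-the-envelope sketch using the standard 2-second following rule (the speeds are just examples); it shows "200 feet" is about right at highway speeds, and that the gap only ever costs you a couple of seconds of arrival time:

    # The 2-second rule, converted to feet, to put "200 feet down the road" in scale.
    for mph in (30, 60, 75):
        ft_per_s = mph * 5280 / 3600
        print(f"{mph} mph: safe gap is about {2 * ft_per_s:.0f} ft, "
              f"i.e. roughly 2 seconds of travel time")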
Assuming a driving culture where other people won’t instantly insert themselves into the empty space in between, yes, it’s the exact same speed. I’d very much like that.
> You can just not tailgate; it's not even like tailgating lets you go faster, it just lets you go the exact same speed 200 feet down the road.
Preach.
I was coming home a few evenings ago in the dark, and both I and my passenger were getting continually aggravated by the car that was following too close behind us, with their headlights reflecting in the wing mirrors alternately into each of our faces.
As a pure hypothetical, what you propose is possible, but there's actual crash data to look at, so there's no need to guess.
The Waymo crashes I've looked at have been fairly typical: someone else is blatantly at fault, with no unusual behavior on Waymo's part. So while it's possible such a thing exists, it's not common enough to matter here.
Ex: "X% of humans do better than Waymo does in accidents per mile."
That would give us an intuition for what portion of humans ought to let the machine do the work.
P.S.: On the flip-side, it would not tell us how often those people drove. For example, if the Y% of worse-drivers happen to be people who barely ever drive in the first place, then helping automate that away wouldn't be as valuable. In contrast, if they were the ones who did the most driving...