"Every vehicle-to-vehicle contact event in the first one million miles involved one or more road rule violations or dangerous behaviors on the part of the operator of the other vehicle."
That agrees with the California DMV reports. The usual problem is being rear-ended at slow speed when the autonomous vehicle detects a threat at an intersection and stops. That shows up over and over in California reports. The Arizona data has five occurrences of human drivers backing into a stationary Waymo vehicle, mostly in parking lots.
That doesn't seem to be happening in California, probably because picking up people in parking lots isn't that common in San Francisco.
As more human-driven vehicles get auto-braking, the rear-ending problem will probably decrease. Really bright brake lights that flash when the autonomous vehicle detects a rear-end threat might help.
Curious if the car weighs the risk of getting rear-ended when it decides to be cautious in situations like this. I have a feeling human drivers do to some degree.
>I have a feeling human drivers do to some degree.
Sure. In general, people will try not to hit the brakes any harder than they have to especially if they have the situational awareness to know someone is close (maybe too close) behind them.
If you tend to drive too close behind people, you'll see this happen a lot: people really don't like getting involved in highway pile-ups, so they'd rather give you a brake check than wait until it actually matters. But it's very rare for people to brake-check you unless you're tailgating.
> I guess it's easier to do when you have a stellar safety record.
... for the conditions that the Waymo vehicles operated in.
Deployment of Waymo's self-driving system across the board would seem to be a sure win in SF and Phoenix, so I hope it gets widely adopted in those areas at minimum, since I think it will save a lot of lives. There's still a lot of work for Waymo to do to get it working in other areas of the country and in other conditions, though.
SF is not an easy locale to test in. Sure, there is no snow, but the city is narrow, dense, with a lot of challenging situations including fog, rain, and crackheads. This success seems promising.
SF is a terrible city to drive in. Narrow streets, terrible hills, a huge diversity of traffic: pedestrians, cyclists, electric scooters, buses, light rail, trains. Highly congested. Plenty of tourists driving (it’s not like NY or London where people who are visiting know not to drive).
I've probably driven half a million miles and I don't remember having 10 incidents (I probably have, just don't remember). So I'd conclude I'm at least as safe a driver as Waymo. Even if not true, most people are going to feel that way when they see the data. Human nature.
In addition, 40% of the waymo incidents were while the waymo vehicle was parked. I expect that (1) most drivers are unaware of incidents that happen while their car is parked and (2) waymo vehicles spend more time parked in less protected places (i.e. in parking lots waiting for riders to appear/enter/exit and not in a parking spot).
The ones where I'd be most interested in the videos were the ones where the Waymo was stopped/slow/preparing for a turn and got hit in the back (4, 10, 12, 14). While likely the fault of the human driver, hesitant/unexpected behavior by the Waymo could have contributed to those.
Three consisted of the Waymo hitting random objects - not catastrophic but clearly shows weaknesses in the self-driving: 3 (Waymo ran over a traffic cone), 17 (Waymo hit a "free swinging parking lot barrier arm"), 18 (Waymo hit a shopping cart).
Ten of the accidents seem unavoidable for the Waymo and clearly someone else's fault:
1 - distracted driver rear-ends a Waymo coming to a stop, with a delta-v of 20 mph. The Waymo's 0.2 g deceleration could be a bit firmer than normal but definitely isn't slamming the brakes, and the report mentions that the other driver was using a phone.
2, 5, 6, 8, 9, 11, 13, 19 - someone hits a stationary Waymo in a parking lot or while getting out of a parking spot. [Edit: Someone pointed out that these could be preventable for the Waymo, if it knew how to use the horn. https://news.ycombinator.com/item?id=34974313]
16 - Garbage truck tried to squeeze by a stopped Waymo, didn't fit (although for this one, it is possible that a human driver would have pulled over further).
Two were emergency braking situations - would be interesting to see what the reaction time was and how it compared with a human driver:
7 - someone cut off a Waymo and got rear ended
20 - Waymo hit a minibike that had crashed and tumbled into its lane, despite braking.
And of course there has to be a weird one, 15 - Wind blew debris into Waymo.
Overall, it seems like the report mostly avoids assigning fault, which is a good thing - in the end, how many crashes happen matters more than who was at fault. [Edit: and as the "use the horn" example shows, even when the other side is clearly at fault, improvements may be possible]
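For a sense of scale on incident 1's numbers, here's a back-of-envelope sketch (the 35 mph starting speed is my assumption, not from the report): 0.2 g is a long, gradual stop, nowhere near maximum braking.

```python
# Back-of-envelope check on incident 1: how gentle is a 0.2 g stop?
# (Illustrative only; the 35 mph starting speed is assumed, not from the report.)
G = 9.81            # m/s^2
MPH_TO_MS = 0.44704

def stop_profile(speed_mph, decel_g):
    """Return (seconds to stop, stopping distance in metres) at constant deceleration."""
    v = speed_mph * MPH_TO_MS
    a = decel_g * G
    t = v / a                 # from v = a * t
    d = v * v / (2 * a)       # from v^2 = 2 * a * d
    return t, d

t, d = stop_profile(35, 0.2)
print(f"0.2 g from 35 mph: {t:.1f} s over {d:.0f} m")   # a long, gradual stop

t, d = stop_profile(35, 0.8)  # roughly maximum braking on dry pavement
print(f"0.8 g from 35 mph: {t:.1f} s over {d:.0f} m")
```

Roughly 8 seconds and 60+ metres at 0.2 g versus about 2 seconds at hard braking, so a driver looking at a phone had plenty of time to notice, and still didn't.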
This is a breath of fresh air compared to the tesla approach which seems to be don't release any data that isn't biased. Waymo is being a responsible party here by releasing all this data even if some of the stuff around the waymo driver hitting roadway objects (like the parking lot bar or the shopping cart) is a little concerning.
What exactly is biased about the Tesla data? If anything it's far less biased, IMO, because it covers a massive sample of 9 billion Autopilot miles as opposed to 1 million (more than 9,000x the sample size).
Teslas turn off Autopilot as soon as they hit a complicated situation, self-selecting for better statistics. They also drive a shitton on highways, which are simply "easy mode". A better statistical report would weight a mile of city driving several times more than a mile on the highway.
None of that introduces bias - that is how the system is designed to work. My car has automatic cruise control that turns off in the rain. It does that FOR safety, not to boost any statistics.
Bias suggests data that's incomplete, poorly sampled, or manipulated - this is none of that. These are raw statistics on the functionality of a system. If you want to weight certain miles more than others, that's on you, and you can do any analysis you want, but Tesla is reporting what they should be: the raw statistics.
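One way to make the "weight city miles more" idea upthread concrete is to standardise both fleets to a common mileage mix before comparing crash rates. A minimal sketch, with every number made up for illustration (neither company publishes this breakdown):

```python
# Sketch of exposure-weighted crash rates. ALL numbers are hypothetical.
fleet_a = {  # mostly-highway fleet
    "highway": {"miles": 8_500_000_000, "crashes": 3_000},
    "city":    {"miles":   500_000_000, "crashes": 1_500},
}
fleet_b = {  # mostly-city fleet
    "highway": {"miles": 100_000, "crashes": 0},
    "city":    {"miles": 900_000, "crashes": 18},
}

def raw_rate(fleet):
    """Crashes per million miles, ignoring road type."""
    miles = sum(v["miles"] for v in fleet.values())
    crashes = sum(v["crashes"] for v in fleet.values())
    return crashes / miles * 1_000_000

def weighted_rate(fleet, mix):
    """Crashes per million miles, standardised to a common mileage mix."""
    return sum(mix[road] * (v["crashes"] / v["miles"] * 1_000_000)
               for road, v in fleet.items())

mix = {"highway": 0.5, "city": 0.5}  # common reference mix
print(raw_rate(fleet_a), weighted_rate(fleet_a, mix))
print(raw_rate(fleet_b), weighted_rate(fleet_b, mix))
```

With these toy numbers, the highway-heavy fleet's raw rate looks much better than its mix-adjusted rate, which is exactly the distortion the weighting argument is about.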
Silly question: will the Waymo AV use the horn? I see a few parking lot incidents where another driver backed into the front of a stationary Waymo AV at ~2 mph. I think a human driver might have tried honking at the other driver, on seeing a slow-motion collision about to occur.
By comparison, collisions for human drivers are about 3.3 per million miles driven (higher for young drivers), and fatalities are about 0.01 per million miles driven.
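Putting the numbers quoted in this thread side by side (a crude comparison, since crash severity and reporting thresholds differ between the two data sets):

```python
# Rates quoted in this thread; crude, since the data sets count different things.
waymo_miles = 1_000_000
waymo_collisions = 20            # every contact event in the report

human_collisions_per_mm = 3.3    # per million miles, as quoted above
human_fatalities_per_mm = 0.01

waymo_rate = waymo_collisions / waymo_miles * 1_000_000
print(f"Waymo: {waymo_rate:.1f} contact events per million miles")
print(f"Human baseline: {human_collisions_per_mm} collisions per million miles")
# The raw Waymo number is higher, but the report counts every contact
# event, including ~2 mph parking-lot taps that would never show up
# in human crash statistics.
```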
It seems like a map showing the number of miles driven in each state and then in each metropolitan area within that state would be something an intern could do.
Followed by a table with # miles driven in various weather conditions.
This reminds me of a video I watched long ago that analyzes roughly how many miles a self-driving car would need to drive before we can say with a certain confidence that it drives better than humans. https://www.youtube.com/watch?v=yaYER2M8dcs
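That question can be roughed out with the "rule of three": if you observe zero events in N trials, an approximate 95% upper confidence bound on the event rate is 3/N (because e^-3 ≈ 0.05). A sketch using the human fatality rate quoted upthread, assuming fatalities arrive as a Poisson process:

```python
import math

# Rule of three: 0 events in N miles gives an approximate 95% upper
# bound on the event rate of 3/N (since e^-3 is about 0.05).
human_fatality_rate = 0.01 / 1_000_000   # per mile, from the figure upthread

# Miles of fatality-free driving needed before the 95% upper bound on
# the AV's fatality rate drops below the human rate:
miles_needed = 3 / human_fatality_rate
print(f"{miles_needed:,.0f} miles")      # on the order of 300 million miles

# Sanity check: probability of zero fatalities in that many miles if
# the AV were exactly as dangerous as the human baseline:
p_zero = math.exp(-human_fatality_rate * miles_needed)
print(f"{p_zero:.3f}")
```

So on fatalities alone, a million miles says almost nothing either way; that's why these reports lean on the much more frequent contact events instead.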
That's a pretty superficial take. The whole thing relies on the gigantic denominator of human miles, but humans are artificially inflating their average because the overwhelming majority of miles driven are extremely simple trips on limited-access highways. Waymo, by comparison, racked up its million miles driving around the SF Tenderloin, in the dark.
Opposite to the claims seen elsewhere in this thread, which are omnipresent whenever the topic arises, it is in fact humans who are driving around on easy mode and self-driving car developers operate in the most difficult conditions they can find.
> Waymo, by comparison, racked up its million miles driving around the SF Tenderloin, in the dark.
You might be confusing them with Cruise. Waymo is driving around SF driverless 24/7 so the million miles also come from during the day, while Cruise is limited to nighttime.
> self-driving car developers operate in the most difficult conditions they can find.
Ehhhhhhh. While I agree that these aren't the easiest driving environments in the world, they are certainly mild climates. If I were in the business of proving that Waymo vehicles weren't ready for prime time, I'd be taking them up the Alcan in winter or to Florida during hurricane season, not Phoenix.
Sometimes it seems that Waymo is moving awfully slowly, but I think they are being really smart to remain cautious and keep their safety record. One major accident like Uber had could destroy the whole program. Better to move very deliberately and incrementally, and not expand beyond the system's abilities.
tl;dr: 20 collisions. 8 where another vehicle reversed into a stationary waymo vehicle. 6 where a vehicle rear-ended the waymo vehicle. 1 where a garbage truck hit a stationary waymo vehicle. 5 instances of the waymo vehicle hitting an object:
Correction: five cases where a vehicle rear-ended the Waymo, one where the Waymo rear-ended another vehicle. In that case, the other vehicle merged into the Waymo's lane and immediately braked. Incident #7 on p11 of the report.
One of the reports says that the accident was caused by the other driver looking at their phone. Could the Waymo's sensors potentially detect that and warn the other driver?
"Hey guys, self-driving is a decent sprint but I think you should expand the scope to include detecting and modeling driver behavior in nearby vehicles. Add a quick study for HCI to warn drivers of their dangerous aberrant behavior and I think we might have something real here!" - tropsis, 2023
It's doing that by looking at the vehicle as the agent though; it has nothing to do with "the body language of this pilot suggests this vehicle will do X", it's just a predictive module because it's useless to be able to see another car if you can't predict it's going to go X units forward in the next time tick(s).
Adding a layer that models human drivers to augment the prediction of this module you link would be a waste of time.
Lol. I think the other comment is making a suggestion that isn't quite reasonable, but maybe it's adjacent to a reasonable ask.
Humans have a horn to warn other humans of unsafe behavior or conditions. We really only need to worry about warning in front of us. And we provide some warning to cars behind us in specific cases with hazard lights and brake lights.
The autonomous vehicles have a better understanding of the whole state. We've already talked about warning other autonomous cars with V2V, but maybe there's something easy/sane they can do to warn human cars behind them and further increase safety.
I think dedicating a small team to this would be a reasonable thing to do. It should be kept quite separate from the main self-driving task, and it would reduce the number of accidents.
In the end, if I'm in a self-driving car, I care whether the crash happened, not who caused it. The injuries and hassle are mine either way, the financial damage isn't mine either way.
I mean, if the other driver is using Android, it should be possible to use some combination of license plate, phone GPS, Bluetooth proximity, and other means to identify who the other driver probably is and pop a "you are about to run into one of our cars, please look up" message on their phone.
Fun fact: while liability generally lies with the rear-ending driver, intentionally causing an accident will always get the lion's share of the blame. This would bankrupt Waymo, not anyone else.
If you do that kind of thing, don’t expect it to work out well if the other driver has a dashcam. If they don’t, of course you can lie and deny having caused the accident.
How does Waymo compare to human drivers when it comes to driving style? I've seen some complaints that it is annoyingly slow and indecisive sometimes, e.g. taking forever to make a left turn. Is this something Waymo also analyzes? Any hard data on that?
And minus the times when it gets stuck and needs to be rescued by a human, either remotely or by an actual Waymo employee coming over and getting in the car. It doesn't happen every time, of course, but it does happen.
Remote humans can't drive Waymo's vehicles. They can reach into the car's model of the world to correct something it misunderstood (e.g. car is confused by misplaced cone and thinks the entire carriageway is closed, actually only the left lane is closed) but they can't control the vehicle remotely at all. It's extraordinary to me how many HN readers seem to think that somehow remotely driving cars would be a good idea, but then I remember how many incredibly dumb start-up ideas get funded and it's less surprising.