Hacker News

In my experience over many trips, Waymo requires interventions about that often driving in San Francisco as well. Interventions happen automatically when the car calls back to home base for a determination of what to do next, and the operator makes a choice on how to proceed. This happens about once every half hour travelling on Waymo in SF for me.



I've taken hundreds of Waymo rides in SF, LA and Phoenix and the car only needed assistance one time when it got stuck in the middle of a protest.


I wonder if OP is assuming an intervention is happening rather than actually knowing it is happening.


Or when there is an unmapped construction site, or when there is anything not mapped correctly.


In my experience Waymo cars handle all sorts of unmapped and unpredictable situations very well. New construction sites, merging lane closures, detours, accidents, road signs that have been defaced or knocked down, malfunctioning signal lights, jaywalkers, bike riders running red lights, delivery trucks blocking lanes, double-parked cars, even missing manhole covers, debris in the roadway, etc.


I just have to call out how absolutely absurd this statement is as an example of a lived West Coast USA stereotype.

Of course, the self-driving car produced by silicon valley meets its natural predator in the wild of California -- a protest! Hysterical. You can't make this stuff up. Absolutely fantastic example of life being better than fiction.


I took four trips the other day and didn't notice any interventions at all. So maybe they are just so sneaky that I can't notice them.


How do you know when a Waymo vehicle is receiving guidance from remote support?


>> Their interventions are automatic when the car calls back to home base to make a determination as to what to do next and the operator makes a choice on how to proceed. Happens about once every half an hour travelling on Waymo in SF for me.

> How do you know when a Waymo vehicle is receiving guidance from remote support?

If it calls back for guidance like that, I'm sure there'd have to be a noticeable delay where the car stops and waits for a response. It's going to take at least 10-30s for a human to context switch into the car's situation and input the guidance, probably longer.

To get a faster response from a human, the human would have to be continuously engaged with the car's driving task.


If the need for intervention is predictable, like a problem intersection that's difficult to avoid, they could conceivably assign an operator early, so they have time to engage and provide support immediately when required.

It would be pretty neat if the Waymo itself could accurately predict the need for assistance in the near future as well. But I'm sure that requires pretty specific conditions.


The car tells you verbally and shows you on the screen that it is calling back to base to ask for assistance "to get you moving again".



