As someone working in the field, I would never choose to eliminate the information provided by radar, lidar, or any other sensor technology. Relying only on camera data would be too limiting.
If we're to trust what Elon and the team said during the last few AI Days, none. They stated that the ultrasonic and radar sensors were actually performing worse than their pure-vision stack.
I'm ready to be convinced that this will be true at some point for the ultrasonic sensors. But by design, radar can see things that vision can never see. It seems like a bad idea to take that away.
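To make that concrete: radar measures a target's radial velocity directly from the Doppler shift of the return, something a camera can't do from a single frame, and it does so regardless of fog or glare. A minimal sketch of the physics (the carrier frequency and Doppler shift below are illustrative values, not from any specific automotive radar):

```python
# Doppler radar: radial velocity v = (f_d * c) / (2 * f_tx)
# Numbers below are invented for illustration.

C = 299_792_458.0  # speed of light, m/s


def radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Target radial velocity implied by a Doppler shift (positive = approaching)."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)


# A 77 GHz automotive-band radar seeing a 10 kHz Doppler shift:
print(radial_velocity(10_000, 77e9))  # ~19.5 m/s (~70 km/h) closing speed
```

That closing-speed measurement comes essentially for free from the waveform; a vision stack has to infer it by differencing noisy depth estimates across frames.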
Right, but I think the signal-to-noise ratio was eating too much compute for too little payoff. And either the ultrasonic or the radar sensors don't even work above 10 mph, I forget which; they're used purely for parking.
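That tradeoff shows up directly in a standard Kalman-style update: the noisier a sensor's measurement, the smaller its weight in the fused estimate, so a low-SNR return costs you processing while barely moving the answer. A toy one-dimensional sketch (the variances are invented for illustration, not Tesla's actual fusion logic):

```python
def fuse(estimate: float, est_var: float, measurement: float, meas_var: float):
    """One scalar Kalman update: weight the measurement by its relative certainty."""
    gain = est_var / (est_var + meas_var)  # 0 = ignore sensor, 1 = trust it fully
    fused = estimate + gain * (measurement - estimate)
    fused_var = (1.0 - gain) * est_var
    return fused, fused_var


# Vision-based range estimate: 50 m with variance 1.0.
# Compare a clean radar return (variance 0.5) against a noisy one (variance 25.0):
print(fuse(50.0, 1.0, 48.0, 0.5))   # ~(48.7, 0.33): clean radar pulls the estimate hard
print(fuse(50.0, 1.0, 48.0, 25.0))  # ~(49.9, 0.96): noisy radar barely changes anything
```

If the radar mostly lands in that second regime, you're burning compute ingesting and filtering it for almost no change in the final estimate, which is presumably the argument they were making.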
They work up to a certain degree, just as humans can drive in fog or heavy rain/snow. If visibility is so bad that a human couldn't drive, I wouldn't want to sit in a self-driving car either, whether it uses vision only or has additional sensors.