So the key question is how much of an improvement radar, lidar, and other sensors give you over just using computer vision.


As someone working in the field, I would never choose to eliminate the information provided by radar, lidar and any other sensor technology. Depending only on camera information would be too limiting.


You haven't solved the problem though.


If we're to trust what Elon and the team said during the last few AI Days, none. They stated that the ultrasonic and radar sensors were actually performing worse than their pure-vision stack.


Real-life performance of the vision-only stack doesn't agree with that.


I'm ready to be convinced that this will be true at some point for the ultrasonic sensors. But by design the radar can see things that vision can never see. It seems like a bad idea to take that away.
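
A toy illustration of the "by design" part (all names and noise figures here are invented for the sketch, not from any production stack): Doppler radar reads radial velocity directly, while a camera has to difference noisy per-frame depth estimates, which multiplies the depth noise by 1/dt:

    # Sketch only: made-up noise figures, not measurements of any real sensor.
    import random

    TRUE_V = -20.0  # m/s closing speed of a lead vehicle (assumed)
    DT = 0.05       # 20 Hz camera frame rate (assumed)

    def radar_velocity():
        # Doppler gives radial velocity in one shot; ~0.1 m/s noise assumed
        return TRUE_V + random.gauss(0, 0.1)

    def camera_velocity(depth_noise=0.5):
        # Vision infers velocity by differencing two noisy depth estimates;
        # dividing by DT amplifies the per-frame depth noise
        d0 = 50.0 + random.gauss(0, depth_noise)
        d1 = 50.0 + TRUE_V * DT + random.gauss(0, depth_noise)
        return (d1 - d0) / DT

    n = 10000
    rms = lambda xs: (sum((v - TRUE_V) ** 2 for v in xs) / len(xs)) ** 0.5
    print("radar RMS error:", rms([radar_velocity() for _ in range(n)]))
    print("camera RMS error:", rms([camera_velocity() for _ in range(n)]))

With these assumed numbers the radar's RMS error stays near 0.1 m/s while the differenced camera estimate lands around 14 m/s; the gap shrinks with better depth estimation, but the radar gets the quantity directly.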


Right, but I think the signal-to-noise ratio was eating too much compute for too little payoff. And either the ultrasonic sensors or the radar don't even work above 10 mph, I forget which; they are used purely for parking.
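
A rough sketch of why a noisy sensor buys little (toy inverse-variance fusion of a single range reading; every number here is invented, and real stacks fuse far more than a scalar):

    # Sketch only: illustrates the SNR/payoff tradeoff, not Tesla's pipeline.
    def fuse(z_cam, var_cam, z_radar, var_radar):
        # Inverse-variance weighting: each sensor's weight is proportional
        # to 1/variance, so a noisy sensor contributes almost nothing
        w_radar = var_cam / (var_cam + var_radar)
        z = (1 - w_radar) * z_cam + w_radar * z_radar
        var = (var_cam * var_radar) / (var_cam + var_radar)
        return z, var, w_radar

    # Clean radar: fused variance drops ~11x vs the camera alone
    print(fuse(z_cam=49.0, var_cam=1.0, z_radar=50.2, var_radar=0.1))
    # Noisy radar (poor SNR): weight ~0.04, variance barely improves
    print(fuse(z_cam=49.0, var_cam=1.0, z_radar=55.0, var_radar=25.0))

In the noisy case you still pay the full cost of ingesting and filtering the radar stream for a few percent of accuracy gain, which is the tradeoff the parent comment is describing.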


Vision systems don't work at all in fog or heavy rain/snow.


They work up to a certain degree, just as humans can drive in fog or heavy rain/snow. If visibility is so bad that a human couldn't drive, I wouldn't want to sit in a self-driving car either, whether it uses vision only or has additional sensors.


Any more or less than the human equivalent?

I'm not following the news closely, but I haven't seen any videos of these systems driving in what Canada looks like four months of the year.


A better "answer" might be to make them an option and let the market decide.

For many (MANY) years airbags were fought by the auto industry even though people wanted them.



