The reason they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic sensors is not straightforward. Which should take precedence? They ultimately decided that cameras can do what ultrasonics can do, but even better.
They won't remove ultrasonics from the delivered cars, but they will probably disable them in coming months.
>The reason they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic sensors is not straightforward.
It's fairly well known that nearly any kind of sensor fusion 'is not straightforward'. If that's a fact Tesla only recently discovered, causing this change in their lineup just now... well, I'll just say it speaks to Tesla's confidence being greater than their technological prowess.
I don't think that's the underlying issue here myself. I think this change is likely caused by an effort to get away from certain vendors and ship cars more rapidly.
>Which should take precedence?
Again, this isn't a new problem, and there are volumes of literature about it. It's a tough problem, but not a new one.
How could vision possibly replace the front bumper ultrasonics? Their primary function for me is showing proximity to objects at low speed, and many of those objects aren’t visible to the front camera because the hood is in the way.
The decision about which sensor to trust is more easily made when one of the sensors provides no data, or clearly corrupted data: fog, mud on a camera lens, pouring rain, and so on.
My phone often signals that I have to clean my camera lens, so I'm sure it's possible to detect unreliable sensors. Sure, you have to decide on all sorts of tipping points, and in the end that is probably expensive to develop, which may be the real reason the sensors were dropped.
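A minimal sketch of what that gating could look like (the names and threshold here are made up, purely to illustrate the tipping-point idea):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Reading:
        distance_m: Optional[float]  # None when the sensor returns no data
        confidence: float            # 0.0 (useless) .. 1.0 (fully trusted)

    # Hypothetical tipping point below which a sensor is ignored,
    # e.g. a camera that has flagged itself as dirty or blinded by fog.
    MIN_CONFIDENCE = 0.3

    def fuse(camera: Reading, ultrasonic: Reading) -> Optional[float]:
        """Return one distance estimate, or None if nothing is trustworthy."""
        usable = [r for r in (camera, ultrasonic)
                  if r.distance_m is not None and r.confidence >= MIN_CONFIDENCE]
        if not usable:
            return None  # degrade gracefully: warn the driver, don't guess
        # Confidence-weighted average; with one sensor gated out this
        # collapses to "trust whichever sensor still has valid data".
        total = sum(r.confidence for r in usable)
        return sum(r.distance_m * r.confidence for r in usable) / total

    # Foggy scene: camera flags itself unreliable, ultrasonic reads 0.4 m.
    print(fuse(Reading(None, 0.0), Reading(0.4, 0.9)))  # -> 0.4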
If fog or rain is so intense that objects within a few feet of your car are obscured from vision, I'd suggest that maybe you shouldn't be operating the car at all, regardless of any assistance technologies it might possess.
My brother's Tesla signalled that the side cameras were faulty or needed cleaning when we drove down a country lane in the dark. This did not raise my confidence in the Autopilot system.
This sounds like a problem well suited to ML, which is basically the thing most of us expect Tesla to be best at. Just throw enough labeled sensor data at the model and let the situations in which certain inputs should take precedence manifest themselves in the weights and biases.
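A toy version of that in PyTorch (this is not Tesla's stack, and the feature dimensions are invented; it just shows where the precedence decision ends up living):

    import torch
    import torch.nn as nn

    # Toy fusion head: the network learns how much weight to give each
    # sensor stream, instead of using a hand-written precedence rule.
    class FusionHead(nn.Module):
        def __init__(self, cam_dim=64, ultra_dim=8):
            super().__init__()
            # The gate produces a per-sample weight for each stream.
            self.gate = nn.Sequential(
                nn.Linear(cam_dim + ultra_dim, 2), nn.Softmax(dim=-1))
            self.cam_proj = nn.Linear(cam_dim, 32)
            self.ultra_proj = nn.Linear(ultra_dim, 32)
            self.head = nn.Linear(32, 1)  # e.g. distance-to-obstacle

        def forward(self, cam_feat, ultra_feat):
            w = self.gate(torch.cat([cam_feat, ultra_feat], dim=-1))
            fused = (w[:, 0:1] * self.cam_proj(cam_feat)
                     + w[:, 1:2] * self.ultra_proj(ultra_feat))
            return self.head(torch.relu(fused))

    # Trained on enough fog/mud/rain examples, the gate weights are
    # where the "which sensor wins" decision ends up living.
    model = FusionHead()
    out = model(torch.randn(4, 64), torch.randn(4, 8))  # batch of 4 scenes
    print(out.shape)  # torch.Size([4, 1])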
I expect Waymo to be the best at it. Waymo has a bunch of sensors on their cars to provide accurate information, because they know that cameras aren't always reliable. Tesla may not be right here.
An autonomous vehicle is operating on a road network designed for humans with eyes. If we are aiming for the kind of “common sense” required for parity with a human driver it seems intuitive that the vehicle would have sensory inputs in common.
Cars do not use ultrasonic sensors for autonomy. They are for low-speed situations (such as parking) and are designed to give more information than the driver is otherwise capable of gathering.
While I generally agree that to get parity you would use common sensory inputs, I would hope the goal is to go beyond human level and navigate in, e.g., fog or heavy rain, which is what radar would help with. Ultrasonics probably don't buy them much at this point, as long as they have good object permanence. Think about parking in a garage: if the car can map objects into spatial memory adequately, then maybe ultrasonics aren't necessary. But if an object is out of sight, out of mind, you probably need the ultrasonics to avoid hitting it.
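Roughly what I mean by "map objects into spatial memory" (a toy sketch, not how any shipping system does it; a real stack would use occupancy grids and pose uncertainty):

    import math

    # Toy object permanence: obstacles seen while still in camera view
    # are stored in car-relative coordinates and updated from odometry,
    # so they survive disappearing under the hood.
    class SpatialMemory:
        def __init__(self):
            self.points = []  # (x, y) in metres; +x is the car's forward axis

        def remember(self, x, y):
            self.points.append((x, y))

        def on_motion(self, forward_m, turn_rad):
            # The car moved, so every remembered point moves the opposite
            # way in the car's frame: translate back, then rotate back.
            c, s = math.cos(-turn_rad), math.sin(-turn_rad)
            self.points = [(c * (x - forward_m) - s * y,
                            s * (x - forward_m) + c * y)
                           for x, y in self.points]

        def nearest_ahead(self):
            ahead = [p for p in self.points if p[0] > 0]
            return min(ahead, key=lambda p: math.hypot(*p), default=None)

    mem = SpatialMemory()
    mem.remember(2.0, 0.0)      # kerb seen 2 m ahead, before it leaves view
    mem.on_motion(1.5, 0.0)     # creep forward 1.5 m; camera now blind to it
    print(mem.nearest_ahead())  # -> (0.5, 0.0): still known, no ultrasonics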