Hacker News

I've worked on 3D vision systems before and have experimented with different types of sensors in multi-sensor setups, and I'd argue that it's overwhelmingly a software problem.

More sensors help, but past a point the returns are extremely marginal, plus you're throwing the majority of the raw data away anyway.

I do agree, though: assuming processing power isn't a performance bottleneck here, more data is always better. But there's a reason humans have only two eyes and not three or four, for example. Two eyes are really all you need.



Completely off topic, but there's a fascinating list of animals with more than two eyes. For example, the four-eyed fish uses one pair of eyes to see below the water and the other pair to see above it when it's swimming along the surface.

https://wildlifeinformer.com/animals-with-more-than-2-eyes/


> More sensors help, but past a point the returns are extremely marginal, plus you're throwing the majority of the raw data away anyway.

Unless a sensor breaks; then the others are suddenly useful redundancy. In a safety-critical product like this, that alone should preclude the removal of sensors.
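That redundancy argument can be sketched in code: a fused estimate from multiple sensors degrades gracefully when one fails, whereas a single-sensor system loses everything. A minimal illustration (all names here are hypothetical, not any real perception API):

```python
# Hedged sketch: a multi-sensor reader that keeps working when one
# sensor fails. `Sensor` and `read_depth` are illustrative stand-ins.

class Sensor:
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def read_depth(self):
        # Stand-in for a real depth measurement from this sensor.
        if not self.healthy:
            raise RuntimeError(f"{self.name} offline")
        return 42.0


def fused_depth(sensors):
    """Average depth over all healthy sensors; degrade gracefully."""
    readings = []
    for s in sensors:
        try:
            readings.append(s.read_depth())
        except RuntimeError:
            continue  # a broken sensor is dropped; the rest carry on
    if not readings:
        raise RuntimeError("no healthy sensors left")
    return sum(readings) / len(readings)
```

With two or more sensors, one failure leaves a usable (if noisier) estimate; with one sensor, the same failure is total.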

> But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.

You still get two, for both depth perception and redundancy; and some animals, such as spiders, have six to eight eyes.


> But there's a reason humans only have two eyes

You also have ears, a body that lets you turn your head, and eyes that can move too, and the brain has a model of the world, including a model of how people think and how they would act.

So will your Tesla have eyes that can turn and look in all directions? It would look cool to have snail-like eyes.



