
Aren't the optical paths highly selective for direction, time, and wavelength? Let's assume coincident wavelengths for our worst case. At any given point in time, it's as if each car has a laser pointer aiming a single dot somewhere, and that single dot is also the only point it's receiving light from (in stark contrast to a camera, which gathers light from its entire field of view). Even if the LIDARs can see each other, there's no zero-attenuation path from the output of one to the input of another unless a pair of LIDARs have, out of their entire FOV, aimed directly at each other.

So, if we define "traversal time" as the time required for the dot of car A to sweep the aperture of car B, in each traversal time there's 1 chance in N^4 of perfect alignment, where N is the ratio of the full FOV to the dot size along each axis (both lidars must line up in both azimuth and elevation, hence the fourth power). Maybe it happens once in a blue moon, but if the sensor can withstand full illumination for more than one traversal time, you would have to compound several 1-in-N^4 events to fry a sensor. I wouldn't count on it.
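The 1/N^4 argument is easy to put rough numbers on. Here's a minimal sketch; the dot size, FOV, and traversal time are made-up illustrative values, not any real lidar's specs:

```python
# Back-of-the-envelope: how often would two scanning lidars aim
# directly at each other? All numbers below are hypothetical.

dot_deg = 0.1          # angular size of the laser dot, per axis (assumed)
fov_deg = 100.0        # full field of view, per axis (assumed)
N = fov_deg / dot_deg  # FOV-to-dot-size ratio per axis

# Perfect alignment needs both lidars pointing at each other,
# in both axes: four independent 1-in-N coincidences.
p_alignment = (1.0 / N) ** 4

traversal_s = 1e-6     # time for A's dot to sweep B's aperture (assumed)
expected_wait_s = traversal_s / p_alignment

print(f"N = {N:.0f}, p = {p_alignment:.1e} per traversal")
print(f"expected wait between alignments ~ {expected_wait_s / 86400:.1f} days")
```

With these (invented) numbers the per-traversal probability is on the order of 1e-12, so even a very short damage threshold leaves perfect alignment as a rare event for any single pair of sensors.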

Well, I wouldn't count on that particular mechanism, at least. I'm sure there will be cases where a coating degrades and lets broad-band sunlight in, or a sensor parks on the sun, etc. Sensors will get fried, but not because the engineers making them were too stupid to consider interference.



from TFA:

>Crucially, self-driving cars also rely on conventional cameras. So if those lidars are not camera-safe, it won't just create a headache for people snapping pictures with handheld cameras. Lidar sensors could also damage the cameras on other self-driving cars.



