Longwave LIDAR burns out camera sensor -- a big problem in the making?
A bit of a stir was caused at CES when a robocar developer and photographer (Jit Ray Chowdhury), shooting the demo car of LIDAR maker AEye, found his camera sensor was damaged. People wondered many things, including whether this says anything about eye safety, and also what it means for cameras in streets crowded with robocars. I worked briefly with Jit when advising Auro Robotics, now a part of Ridecell.
I asked the folks at AEye about it, and they said they suspected the photographer might have put his camera right up close to the LIDAR, but the images taken by Jit are from a reasonable distance. However, he did get close up at certain times with his f/2.8 lens. I do know that I and thousands of other photographers have photographed operating 1.5 micron LIDARs without this damage, so it appears to be a rare event, at least for now.
However, AEye has not responded to multiple requests for more information, so for now it may be wise to avoid their devices.
1.5 micron and 900nm bands
LIDARs work in two main infrared bands. The 900nm band -- commonly 905nm -- behaves much like visible light but is invisible to the eye. It can be picked up by silicon chip sensors, which is why people use it: silicon sensors are a very well understood and cheap technology. Even at 905nm (and worse at 990nm) we are getting into the region where the photon energy approaches silicon's band gap, and sensitivity drops. By the time you get to the longer, less energetic 1550nm band, the photons don't have enough energy to trigger silicon at all.
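As a rough back-of-the-envelope check (the band gap figure and the simple "above or below the gap" model are textbook approximations, not any sensor's spec), here is what the photon energies look like next to silicon's band gap:

```python
# Rough sketch: photon energy vs. silicon's band gap (illustrative numbers).
# A silicon sensor can only absorb photons whose energy exceeds its band gap (~1.12 eV).

H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt
SI_BAND_GAP_EV = 1.12  # approximate band gap of silicon at room temperature

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon at the given wavelength, in electron-volts."""
    return H * C / (wavelength_nm * 1e-9) / EV

for wl in (905, 1550):
    e = photon_energy_ev(wl)
    verdict = "detectable" if e > SI_BAND_GAP_EV else "invisible"
    print(f"{wl} nm: {e:.2f} eV -> {verdict} to silicon")

# 905 nm:  ~1.37 eV -> detectable, though close enough to the gap that sensitivity suffers
# 1550 nm: ~0.80 eV -> invisible to silicon; you need InGaAs or similar detectors
```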
There is another big difference between the bands. The 900nm band, like visible light, is focused by the lens in the eye, so a laser beam becomes a tiny spot on the retina. This defines how much power a laser can put out and still be "eye safe." Even if you get right up to the laser and its beam gets focused onto a spot on your retina, it must not hurt it -- and it must not hurt it even if it keeps coming back every 10th of a second, as LIDAR beams do. This puts a limit on how bright the beam can be, and that limits how bright the returning reflections of that beam can be when they get back to the LIDAR, which in turn limits the range of the LIDAR. For objects 200m away, thanks to the inverse square law, just a few photons make it back. Advanced LIDAR design is all about making sure you can detect those dwindling photons.
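To see why range is so hard, here is a toy link-budget sketch. Every number in it (pulse energy, reflectivity, aperture, efficiency) is an illustrative assumption, not a spec for any real LIDAR; the point is just the inverse-square falloff:

```python
# Toy link budget: how the returning photon count falls off with range.
# All values below are illustrative assumptions, not specs for any real LIDAR.
import math

PULSE_ENERGY_J = 5e-9       # assumed pulse energy reaching the target (5 nJ)
WAVELENGTH_M = 905e-9       # 905 nm band
PHOTON_ENERGY_J = 6.626e-34 * 2.998e8 / WAVELENGTH_M
REFLECTIVITY = 0.1          # dark, diffuse (Lambertian) target
APERTURE_AREA_M2 = math.pi * 0.0125**2   # assumed 25 mm receive aperture
SYSTEM_EFFICIENCY = 0.5     # assumed optics + detector losses

def photons_returned(range_m: float) -> float:
    """Photons collected back from a Lambertian target: falls off as 1/range^2."""
    fraction = REFLECTIVITY * APERTURE_AREA_M2 / (math.pi * range_m**2)
    return PULSE_ENERGY_J / PHOTON_ENERGY_J * fraction * SYSTEM_EFFICIENCY

for r in (50, 100, 200):
    print(f"{r:>4} m: ~{photons_returned(r):.0f} photons")

# Doubling the range cuts the return by 4x, so at 200 m this (assumed) pulse
# returns only a handful of photons -- hence the appeal of higher-power bands.
```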
1.5 micron light is quite different. Aside from not triggering silicon -- which means you have to use more expensive and less common materials such as InGaAs detectors -- this light is absorbed by water. Your cornea is mostly water, just like most of you, so the light is absorbed there and not focused into a spot. Just as ordinary sunlight doesn't hurt your skin but sunlight focused through a magnifying lens can burn paper, a beam that hits your eye at this wavelength stays spread over a wider spot, so you can put out a lot more power without hurting the eye -- and you won't hurt the retina.
A lot more power means you can get a lot more photons back from that target 200m away. In fact, with the right techniques, you can see 400 or even 600m away -- which many robocar developers are very keen on, since more range is generally a good thing.
Other approaches are used to keep both types of laser eye safe. The beam is always moving, and there are interlocks (physical ones where possible) which halt the beam if it ever stops moving fast enough. When the beam is moving that fast, it never dwells on your eye for long.
A camera lens is not made of water. It focuses the light that the cornea would simply block, so a camera sensor gets vastly more energy than any spot on the eye or body does, concentrated into a tiny spot -- where it can burn. But because this is the first report of such damage, we know that usually it doesn't burn even a camera sensor.
So what was the special circumstance?
The photographer had a Sony A7R II -- the same camera I have. This is a full frame camera, and the photographer was using an f/2.8 lens on it, which is only moderately big and fast, but larger than those found on smaller cameras.
It is important to learn just what caused this damage, when such damage does not normally happen. If we don't figure it out, people will fear 1550nm LIDARs in general, and AEye LIDARs in particular. Here are some of the possible culprits:
- The f/2.8 lens is larger than what will be found on pocket cameras, phone cameras and in-car cameras. But there are many larger and faster lenses out there.
- The photographer got fairly close to the LIDAR, but many photographers do. LIDARs of this type send out short, bright pulses, so over most distances your lens captures the same energy from each pulse -- until you are far enough away that the beam has spread larger than your lens (see the sketch after this list). Up close you can also catch multiple pulses per scan.
- AEye was running a demonstration where they used their LIDAR to try to track a Nerf dart fired fast through the air. Jit photographed this and suspects this is where the problem took place. The Nerf tracker may have used a different scanning pattern and rate from automotive operation, possibly putting more energy into the camera lens than normal operation would.
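Here is the sketch promised in the second bullet: a toy model of how much pulse energy a camera lens captures at various distances. The divergence, exit aperture, pulse energy and lens size are all assumed values, chosen only to illustrate the "flat up close, then falling off" behavior:

```python
# Sketch of the "same energy until the beam outgrows your lens" point above.
# Beam geometry and pulse energy are made-up illustrative values.
import math

PULSE_ENERGY_J = 1e-6        # assumed pulse energy (1 microjoule)
BEAM_DIVERGENCE_RAD = 2e-3   # assumed full divergence of ~2 milliradians
EXIT_DIAMETER_M = 0.02       # assumed beam diameter leaving the LIDAR
LENS_DIAMETER_M = 0.05       # front element of a fast full-frame lens (~f/2.8)

def energy_into_lens(range_m: float) -> float:
    """Pulse energy captured by the lens: all of it while the beam is smaller
    than the lens, then a shrinking fraction as the beam spreads past it."""
    beam_diameter = EXIT_DIAMETER_M + BEAM_DIVERGENCE_RAD * range_m
    fraction = min(1.0, (LENS_DIAMETER_M / beam_diameter) ** 2)
    return PULSE_ENERGY_J * fraction

for r in (1, 5, 15, 50, 150):
    print(f"{r:>4} m: {energy_into_lens(r)*1e6:.2f} uJ into the lens")

# With these assumed numbers, out to roughly 15 m the lens swallows the whole
# pulse; beyond that the captured energy falls off roughly with the square of
# the distance -- which is why getting "fairly close" may not by itself matter.
```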
My guess for now is that there was something about the Nerf dart tracking system which emitted a highly atypical scan pattern, and that the company did not calculate that this would put out an amount of energy that was not camera safe. Normally, cars don't need to see things moving as fast as a dart close to them, but they do like to "foveate" -- which is to say, concentrate their attention on spots and areas of interest and put more energy into them.
Let's hope that's what it was, otherwise...
What about all the cameras?
Even if our eyes are safe, we don't want a world where any big lens pointed at a road full of cars spells doom for the camera behind it. Robocars depend on cameras, of course, and we can't have them being burned out. Most robocar cameras use smaller lenses than a high-end camera like the A7R II. Cell phones use even smaller ones.
There are things camera makers can do to prevent this. It's easy to add filters which block long-wave infrared light without affecting the visible. Most cameras already have filters for near infrared (like 905nm) but don't bother filtering the longer wavelengths which won't trigger their sensor anyway.
Still, it is not the duty of camera makers to provide this protection (though they might want to, to be on the safe side), and in any event it would take a long time for such a change to spread. Cameras made for cars can adapt more easily. A world where photographers dare not take pictures of streets full of cars without a special filter on their lens would create a conflict between photographers and improved car safety. While the lasers are eye safe and thus legal, destruction of a sensor is still a tort that could be sued over. A few successful lawsuits would give car makers and LIDAR makers pause, though from a public policy perspective, automotive safety would probably win.
LIDAR makers could distribute free long-IR filters to owners of large lenses. AEye offered to replace Jit's camera, but that's not an answer that scales unless the event is very rare, and even if it is, photographers will be paranoid. Having to put on filters is not a happy solution either. It could be that this only happens at a "magic" distance, where the laser spot is smaller than the lens.
There are some companies making continuous-wave rather than pulsed LIDARs. These include Blackmore, whose demo car I rode in. The instantaneous power of a continuous wave is far lower, and the return signal is integrated over time. Blackmore states this is far less likely to damage a camera, and of course it is also eye safe, as is legally required.
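For a feel for the difference, here is a toy comparison of peak power for a pulsed design and a continuous-wave design delivering the same average power. The duty cycle and power figures are assumptions for illustration, not numbers from Blackmore or anyone else:

```python
# Toy comparison of peak power: pulsed vs. continuous-wave at equal average power.
# All numbers are illustrative assumptions.

AVERAGE_POWER_W = 0.05     # assumed average optical power for both designs
PULSE_WIDTH_S = 5e-9       # assumed 5 ns pulses
PULSE_RATE_HZ = 100_000    # assumed 100k pulses per second

duty_cycle = PULSE_WIDTH_S * PULSE_RATE_HZ     # fraction of time the pulsed laser is on
pulsed_peak_w = AVERAGE_POWER_W / duty_cycle   # the same energy crammed into short bursts
cw_peak_w = AVERAGE_POWER_W                    # a CW laser's peak power equals its average

print(f"duty cycle:      {duty_cycle:.4%}")
print(f"pulsed peak:     {pulsed_peak_w:.0f} W")
print(f"continuous wave: {cw_peak_w:.2f} W")

# With these assumptions the pulsed design hits ~100 W peaks while the CW
# design never exceeds 0.05 W -- the receiver integrates over time instead.
```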