Researchers at the MIT Media Lab have developed a new imaging system that can gauge the distance of objects obscured by fog so thick that humans can't see through it.
The goal is to integrate the technology into self-driving cars so that even in bad weather, the vehicles can avoid obstacles.
The system uses a time-of-flight camera, which fires short bursts of laser light toward an object and measures how long the reflections take to return. Fog scatters the laser light, making those timing measurements unreliable and thwarting most imaging systems used by autonomous vehicles. The researchers developed an algorithm that finds statistical patterns in the scattered light, separating it from the true reflection to reveal the object's distance.
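The core time-of-flight idea can be sketched in a few lines. The following is a minimal illustration, not the researchers' actual algorithm: it simulates photons scattered diffusely by fog alongside a sharp echo from an object, then subtracts a smoothed background from the arrival-time histogram to recover the object's return. All parameters (fog distribution, object distance, photon counts) are hypothetical.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s):
    # Light covers the path out and back, so halve the round trip.
    return C * round_trip_s / 2.0

# Hypothetical photon arrival times: fog backscatter arrives early and
# spread out, while the object produces a narrow echo at its true
# round-trip time. Distribution choices here are illustrative only.
rng = np.random.default_rng(0)
true_distance_m = 30.0
object_time = 2 * true_distance_m / C
arrivals = np.concatenate([
    rng.gamma(shape=2.0, scale=2e-8, size=5000),   # diffuse fog scatter
    rng.normal(object_time, 1e-10, size=500),      # sharp object echo
])

# Histogram the arrivals, estimate the slowly varying fog background
# with a wide moving average, and take the largest residual bin as the
# object's return time.
counts, edges = np.histogram(arrivals, bins=400)
background = np.convolve(counts, np.ones(25) / 25.0, mode="same")
peak = int(np.argmax(counts - background))
peak_time = 0.5 * (edges[peak] + edges[peak + 1])
estimated_m = tof_distance(peak_time)
print(f"estimated distance: {estimated_m:.1f} m")
```

In clear air the raw histogram peak alone would suffice; the background subtraction is what lets the narrow object echo stand out even when fog scatter dominates the early part of the signal.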
At the MIT Media Lab's Camera Culture Group, researchers tested the system in fog far denser than what cars would face on the road. The system gauged distances better than human vision could, whereas most imaging systems perform far worse than humans in fog. A navigation system that handled fog even as well as a human driver would be a huge breakthrough for autonomous cars.