New 'Electric Eyesight' System Lets Driverless Cars See in Fog
In the past, autonomous vehicles relying on light-based image sensors have had difficulty navigating in blinding conditions such as fog.
New research from MIT has produced a sub-terahertz-radiation receiving system that could help driverless cars 'see' when other systems fail.
The vision system uses sub-terahertz wavelengths, which are between microwave and infrared radiation on the electromagnetic spectrum.
These wavelengths can be detected easily through fog and thick cloud cover, conditions in which the infrared-based LiDAR imaging systems typically used in autonomous vehicles struggle.
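As a rough sanity check on where these waves sit, the conversion λ = c/f puts sub-terahertz wavelengths in the millimeter range. The 0.1 to 1 THz band edges in this sketch are an illustrative assumption, since the article does not state the system's operating frequency:

```python
# Back-of-the-envelope wavelength check (lambda = c / f). The 0.1-1 THz
# band edges are an illustrative assumption, not figures from the paper.
C = 3.0e8  # speed of light, m/s

for f_thz in (0.1, 0.3, 1.0):
    wavelength_mm = C / (f_thz * 1e12) * 1e3
    print(f"{f_thz:.1f} THz -> wavelength {wavelength_mm:.2f} mm")
```

The resulting millimeter-scale wavelengths are longer than infrared (micrometers) but shorter than common microwave bands (centimeters), which is why the band sits between the two on the spectrum.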
New research results in smaller and more powerful system
The new system works by sending out an initial signal through a transmitter; a receiver in the system then measures the absorption and reflection of the rebounding sub-terahertz wavelengths.
A processor then reconstructs an image of the object ahead; a minimal sketch of this transmit-measure-reconstruct loop follows below.
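The sketch below is a hypothetical toy version of that loop, not MIT's actual signal chain: it emits a pulse, builds a delayed and attenuated echo, and recovers the target range by cross-correlation. All names and parameter values are invented for illustration.

```python
import numpy as np

# Toy transmit/receive/reconstruct loop; parameters are hypothetical.
C = 3.0e8    # speed of light, m/s
FS = 1e11    # sample rate in samples/s (illustrative)

def transmit_pulse(n_samples=4096):
    """Emit a short Gaussian probe pulse."""
    t = np.arange(n_samples) / FS
    return np.exp(-((t - 5e-9) ** 2) / (2 * (0.5e-9) ** 2))

def echo(pulse, distance_m, reflectivity):
    """Delay the pulse by the round trip and scale it by surface reflectivity."""
    delay = int(round(2 * distance_m / C * FS))
    out = np.zeros_like(pulse)
    out[delay:] = reflectivity * pulse[: len(pulse) - delay]
    return out

def estimate_range(pulse, received):
    """Cross-correlate transmit and receive signals to recover the delay."""
    corr = np.correlate(received, pulse, mode="full")
    lag = np.argmax(corr) - (len(pulse) - 1)
    return lag / FS * C / 2   # round-trip delay -> one-way distance

tx = transmit_pulse()
rx = echo(tx, distance_m=3.0, reflectivity=0.3)
rx += np.random.normal(0, 0.01, rx.shape)   # receiver noise
print(f"estimated range: {estimate_range(tx, rx):.2f} m")   # prints ~3.00 m
```

A real system would repeat this measurement across many pixels and directions to assemble an image; the point here is only the measure-reflection, recover-geometry principle the article describes.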
Until now, implementing sub-terahertz sensors in driverless cars has been tricky. The system only works if the receiver delivers a strong output baseband signal to the processor: traditional systems capable of this have been too large and expensive to fit in autonomous vehicles, while smaller on-chip sensors have produced signals too weak to be useful.
The work from MIT has resulted in a two-dimensional, sub-terahertz receiving array on a chip that is far more sensitive than anything achieved before.
It can capture and interpret sub-terahertz wavelengths even in the presence of heavy signal noise. To achieve this breakthrough, the researchers used a scheme of independent signal-mixing pixels, called "heterodyne detectors", a method that is usually difficult to integrate into chips; a sketch of the mixing idea follows below.
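Heterodyne detection in general works by multiplying the incoming high-frequency signal with a local oscillator, which shifts the information down to a low difference frequency (the baseband) that ordinary electronics can process. The toy below demonstrates that principle with all frequencies scaled far beneath real sub-terahertz values, so it runs at ordinary sample rates; every number is invented for illustration.

```python
import numpy as np

# Toy heterodyne detector; all frequencies are illustrative stand-ins.
FS = 1e6           # sample rate, Hz
F_CARRIER = 2.0e5  # stand-in for the incoming sub-terahertz carrier, Hz
F_LO = 1.9e5       # local oscillator frequency, Hz
t = np.arange(0, 0.01, 1 / FS)

# Incoming signal: a carrier whose amplitude encodes the reflected scene.
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 1e3 * t)
received = envelope * np.cos(2 * np.pi * F_CARRIER * t)

# Mixing: multiplying by the LO creates tones at the sum and difference
# frequencies; the difference (10 kHz here) is the baseband signal.
mixed = received * np.cos(2 * np.pi * F_LO * t)

# Crude low-pass filter (moving average) removes the high sum frequency.
kernel = np.ones(32) / 32
baseband = np.convolve(mixed, kernel, mode="same")

# The dominant tone now sits at F_CARRIER - F_LO = 10 kHz, low enough
# for a processor to digest.
spectrum = np.abs(np.fft.rfft(baseband))
freqs = np.fft.rfftfreq(len(baseband), 1 / FS)
print(f"dominant baseband tone: {freqs[np.argmax(spectrum[1:]) + 1]:.0f} Hz")
```

The engineering challenge the article points to is doing this mixing independently in each pixel while keeping the whole array small and quiet enough to fit on a single chip.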
'Electric eyes' will give better vision to cars and robots
The heterodyne detectors were made at a tiny scale so they could fit on a chip. The project's prototype has a 32-pixel array integrated on a 1.2-square-millimeter device.
The pixels are around 4,300 times more sensitive than the pixels in today's best on-chip sub-terahertz array sensors. Further development of the chip could make it useful for integration into driverless cars and autonomous robots.
"A big motivation for this work is having better 'electric eyes' for autonomous vehicles and drones," says co-author Ruonan Han, an associate professor of electrical engineering and computer science, and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL).
"Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough."
The research radically rethinks the approach to designing this kind of technology. Its creators hope to continue their work and further improve the sensor's power and sensitivity. Accurate vision in all weather and visibility conditions is essential for future fully autonomous vehicles and robots, and lowering the cost while increasing accuracy is essential for the sub-terahertz-radiation system to be broadly adopted.
The full paper can be read online in the Feb. 9 edition of the IEEE Journal of Solid-State Circuits.