RADAR, LiDAR and Cameras Technologies for ADAS and Autonomous Vehicles

An autonomous vehicle, or a vehicle with driver-assistance systems, usually carries multiple sensors to monitor its surroundings: lane markings, pedestrians, distance to other objects, parking clearance and potential collisions. These sensors act as the eyes of the vehicle, and they are gaining importance in vehicular vision systems that combine sensing, signal processing and control.

Self-driving cars need plenty of eyes on the road; cameras, radar and even humans help keep autonomous vehicles safe – Daily Chart by www.economist.com

In autonomous-vehicle applications, two different approaches have emerged. One camp, led by Tesla, combines cameras with mm-wave RADAR (Radio Detection and Ranging). The other camp, which includes Google, relies on LiDAR (Light Detection and Ranging). These three types of sensors are based on different technologies, yet they have been developed to perform similar functions and compete for the same future market. As a result, their capabilities overlap to a large degree. At the same time, each has distinct strengths and limitations that can set it apart in particular applications.

RADAR – Radio Detection and Ranging

RADAR technology has been used on vehicles for a long time. In the 1960s, engineers at Mullard Research Laboratories in England developed a RADAR system for use on cars, operating at 10 GHz. RCA later used the same frequency in its 1972 RADAR system. Along the road to commercialization, manufacturers have shrunk the hardware by using small antenna arrays and have moved from 34 GHz to 50 GHz and then to 77 GHz. The first commercial automotive RADAR system appeared on the Toyota Celsior in 1997, and many manufacturers have since followed Toyota's lead, including BMW, Jaguar, Nissan and Mercedes-Benz.

Illustration of the relative antenna sizes for the 24 GHz and 77 GHz bands – e2e.ti.com

Mm-wave RADAR can detect objects from very short range (Short-Range RADAR, SRR, from 0.2 m) out to long range (Medium-Range RADAR, MRR, 30–80 m, and Long-Range RADAR, LRR, 80–200 m). Today two bands are commonly used: the 24 GHz band for short range and the 76–77 GHz band for long range. The lower band is shared with other users and is restricted in transmit power. According to ETSI (the European Telecommunications Standards Institute) and the FCC (Federal Communications Commission), use of the 24 GHz Ultra-Wide Band (UWB) will be phased out by 2022. The 77 GHz band (soon to extend to 81 GHz, providing a 4 GHz sweep bandwidth compared with the 200 MHz of the 24 GHz Narrow Band, NB) is therefore becoming more popular, as it allows larger bandwidth and higher power. The wider band means better resolution for both short and long range, and RADAR works in almost all environmental conditions.
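
As a quick illustration of why the wider sweep matters: for a frequency-modulated (FMCW) radar of the kind used in automotive systems, range resolution is c / (2 × B), where B is the sweep bandwidth. A minimal Python sketch using the band figures quoted above:

    # Range resolution of an FMCW radar: delta_R = c / (2 * B).
    # A wider sweep bandwidth B separates closely spaced targets better.
    C = 299_792_458.0  # speed of light, m/s

    def range_resolution(bandwidth_hz: float) -> float:
        """Smallest resolvable spacing between two targets, in metres."""
        return C / (2.0 * bandwidth_hz)

    print(range_resolution(200e6))  # 24 GHz NB, 200 MHz sweep -> ~0.75 m
    print(range_resolution(4e9))    # 77-81 GHz, 4 GHz sweep   -> ~0.037 m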

Automotive mmWave RADAR systems – by Brian Shaffer @ e2e.ti.com
LiDAR – Light Detection and Ranging

LiDAR is similar to RADAR in some ways, and this is evident in the term itself: LiDAR originated in the early 1960s as a combination of the words "light" and "radar". LiDAR scanning technology has long been used in surveying projects. Unlike the sonar and radar used in surveying long before it, LiDAR works with focused pulses of light: a laser. A LiDAR system uses a laser source to fire pulses, invisible to the human eye, at its targets at a very high rate (for example, 900 kHz) and waits for the pulses reflected back by each target. The time the light takes to travel out and back to the source is then measured, the method known as Time of Flight:

Distance = (c × t) / 2, where c is the speed of light (299,792,458 m/s) and t is the total round-trip travel time
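
In code, the Time-of-Flight calculation is a one-liner; a minimal sketch in Python:

    C = 299_792_458.0  # speed of light, m/s

    def tof_distance(round_trip_time_s: float) -> float:
        """Distance to the target from a round-trip pulse time, in metres."""
        return C * round_trip_time_s / 2.0

    # A pulse returning after 200 nanoseconds hit something about 30 m away.
    print(tof_distance(200e-9))  # -> 29.979... m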

LiDAR can thus locate a point along each direction in which it sends a laser pulse. This enables the system to map millions of points, collectively called a point cloud, in a single sweep, and by scanning 360° continuously it builds a full point-cloud map of its surrounding environment. In the resulting 3D image, every point carries depth information, and some systems also measure the speed at which each point is moving.
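
Each return is a range measured along a known beam direction; converting those polar measurements to Cartesian coordinates is what builds the point cloud. A sketch with NumPy (the angle conventions here are assumptions, since they differ between sensor vendors):

    import numpy as np

    def to_cartesian(r, azimuth, elevation):
        """Convert range/azimuth/elevation arrays (angles in radians)
        to an (N, 3) array of x, y, z points."""
        x = r * np.cos(elevation) * np.cos(azimuth)
        y = r * np.cos(elevation) * np.sin(azimuth)
        z = r * np.sin(elevation)
        return np.stack([x, y, z], axis=-1)

    # One simulated 360° sweep: 1800 beams at 20 m range, -2° elevation.
    az = np.linspace(0.0, 2.0 * np.pi, 1800, endpoint=False)
    points = to_cartesian(np.full(1800, 20.0), az, np.deg2rad(-2.0))
    print(points.shape)  # (1800, 3)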

A Velodyne LiDAR product produces a set of data points that show nearby cars – Evan Ackerman, IEEE Spectrum
High-resolution 3D LiDAR detects the surroundings with precise measurements of distance and classification of objects – velodynelidar.com
DV – Digital Video Camera

Unlike LiDAR and RADAR, automotive cameras are mostly passive systems: they need no modulated light source and simply rely on natural ambient light to capture images of the environment. The cameras on vehicles therefore work much like human eyes, and they use the same technology found in most smartphone digital cameras. Tesla has made cameras a core component of its vehicular vision system because they offer so many advantages. Cameras are the only technology that captures texture, color and contrast information simultaneously, and they capture the high level of detail that lets the system apply machine learning or AI (Artificial Intelligence) algorithms for classification. Combined with superior pixel resolution and low cost, these features have made cameras the leading candidate for ADAS (Advanced Driver-Assistance Systems) and autonomous vehicles. Cameras have enabled ADAS capabilities that would otherwise be impossible with other technologies.

Automotive Imaging Cameras make your car safer – by Peter Labaziewicz @ e2e.ti.com
  • ACC (Adaptive Cruise Control) – can detect cars and trucks; needs to identify motorcycles as well and keep a safe distance;
  • AHBC (Automatic High Beam Control) – can automatically switch between high and low beams; needs to shape the beam around oncoming vehicles;
  • TSR (Traffic Sign Recognition) – can detect speed limits and some other signs; needs to detect supplemental signs and understand their context, such as a speed limit that applies only during certain hours, and to read traffic signals so ACC can stop, start and slow down accordingly;
  • LKS (Lane Keep Systems) – can detect lane markings (see the sketch after this list); needs to detect the drivable surface and adapt to construction signs and multiple sets of lane markings.
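
As a concrete example of the LKS item above, lane markings are often found with classic computer vision before any deep learning is applied. A minimal sketch using OpenCV, with a placeholder input file and tunable thresholds:

    import cv2
    import numpy as np

    # Classic lane-marking detection: edge detection, then line fitting.
    frame = cv2.imread("road_frame.jpg")            # placeholder input image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                # thresholds are tunable

    # Keep only the lower half of the image, where the road surface is.
    mask = np.zeros_like(edges)
    mask[edges.shape[0] // 2:, :] = 255
    roi = cv2.bitwise_and(edges, mask)

    # The probabilistic Hough transform returns candidate line segments.
    lines = cv2.HoughLinesP(roi, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    if lines is not None:
        for x1, y1, x2, y2 in lines.reshape(-1, 4):
            cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imwrite("lanes_annotated.jpg", frame)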

Camera plus RADAR is the approach taken by Tesla, which believes this configuration is sufficient for autonomous driving.

The recent trend, however, is that most major autonomous-car makers and newly arriving startups use a combined approach that puts LiDAR, cameras and RADAR together to get the best out of all three. Cameras are much less expensive than LiDAR systems, and there are many off-the-shelf products to choose from. Cameras also hold up reasonably well in weather such as rain and snow. Because a camera sees the environment in much the same way as the human eye, its output is easier to interpret, which is useful for machine recognition of traffic signs designed for human vision.

The limitations of cameras are easy to understand, since they mirror the weaknesses of human eyes. For example, a camera can be confused by bright light from the sun or from oncoming headlights, which is one reason RADAR is needed to back cameras up. Also, images captured by cameras do not inherently carry depth or distance information the way 3D LiDAR returns do. To extract distance from camera images, we must rely on machine learning algorithms, such as deep neural networks, which require powerful hardware support.
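
To make the last point concrete, monocular depth can be estimated with a pretrained network. A sketch using the publicly available MiDaS model via torch.hub (the input file name is a placeholder, and note the output is relative, not metric, depth):

    import cv2
    import torch

    # Load a small pretrained monocular depth model and its preprocessing.
    midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    midas.eval()
    transforms = torch.hub.load("intel-isl/MiDaS", "transforms")

    img = cv2.cvtColor(cv2.imread("dashcam_frame.jpg"), cv2.COLOR_BGR2RGB)
    batch = transforms.small_transform(img)       # HWC image -> 1xCxHxW tensor

    with torch.no_grad():
        pred = midas(batch)                       # inverse relative depth map
        depth = torch.nn.functional.interpolate(
            pred.unsqueeze(1), size=img.shape[:2],
            mode="bicubic", align_corners=False,
        ).squeeze().numpy()

    # 'depth' is unitless; mapping it to metres requires extra calibration.
    print(depth.shape)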

Difficult surrounding scene classification for automotive cameras – by Peter Labaziewicz @ e2e.ti.com

No single technology outperforms all the others. Combining one primary technology with one or more others seems a viable way to gain safety redundancy, since none of them can guarantee safety on its own. According to Ali Ors of NXP Semiconductors, camera systems, with their clear advantage in color and texture information, will see the largest volume growth, up to 400 million units by 2030. As costs come down, LiDAR and RADAR systems will also see large growth, reaching 40–50 million units deployed by 2030.
