LIDAR – sometimes called time-of-flight (ToF) sensing, laser scanning or laser radar – is a sensing method that detects objects and maps their distances. The technology works by illuminating a target with an optical pulse and measuring the characteristics of the reflected return signal. The width of the optical pulse can range from a few nanoseconds to several microseconds.
Figure 1 shows the basic principle of LIDAR: light is emitted in defined patterns, and information is extracted from the reflections gathered at the receiving end. Pulse power, round-trip time, phase shift and pulse width are common parameters used to extract information from the return signal.
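To make the round-trip-time measurement concrete, the short sketch below (illustrative only; it is not part of this white paper, and the function name and example values are hypothetical) computes target distance from a measured round-trip time using distance = c · t / 2, where c is the speed of light:

```python
# Pulsed time-of-flight: distance = c * round_trip_time / 2.
# The function name and example values are hypothetical.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(round_trip_s: float) -> float:
    """Return target distance in meters for a measured round-trip time."""
    return C * round_trip_s / 2.0

print(distance_from_round_trip(1e-9))  # 1 ns of timing ~= 0.15 m of range
print(2 * 100.0 / C)                   # a 100 m target returns in ~667 ns
```

Note that a 1-ns error in the time measurement translates to roughly 15 cm of range error, which is why long-range LIDAR receivers require very precise timing.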
Why choose light? What differentiates LIDAR from other existing technologies such as radar, ultrasonic sensors or cameras? What’s driving the hype around LIDAR? This white paper addresses these questions in the context of long-range LIDAR, which will be an important sensor for autonomous driving. In addition to autonomous vehicles, LIDAR has applications in 3D aerial and geographic mapping, safety systems in factories, smart ammunition and gas analysis.
Manufacturers are outfitting modern cars with a wide array of advanced control and sensing functions. Collision warning and avoidance systems, blind-spot monitors, lane-keep assistance, lane-departure warning and adaptive cruise control are examples of established features that assist drivers and automate certain driving tasks, making driving a safer and easier experience.
LIDAR, radar, ultrasonic sensors and cameras each have their own niche sets of benefits and disadvantages. Sensor fusion is the concept of combining multiple sensor technologies to generate an accurate and reliable map of the environment around a vehicle; highly or fully autonomous vehicles typically rely on it to build an accurate long- and short-range map of the vehicle’s surroundings under a range of weather and lighting conditions. In addition to the technologies complementing each other, it is also important that their coverage overlaps sufficiently, since the added redundancy improves safety, as the simple example below illustrates.
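As a toy illustration of that redundancy (a minimal sketch with invented noise figures, not a method described in this white paper), two overlapping range measurements of the same object can be fused by inverse-variance weighting, which yields an estimate more precise than either sensor alone:

```python
# Hypothetical example: fuse two noisy range estimates of one object
# via inverse-variance weighting; the sensor sigmas are made-up values.
def fuse(range_a: float, sigma_a: float,
         range_b: float, sigma_b: float) -> tuple[float, float]:
    """Return the fused range estimate and its standard deviation."""
    w_a = 1.0 / sigma_a ** 2
    w_b = 1.0 / sigma_b ** 2
    fused = (w_a * range_a + w_b * range_b) / (w_a + w_b)
    return fused, (w_a + w_b) ** -0.5

# LIDAR reports 50.2 m (sigma 0.05 m); radar reports 49.8 m (sigma 0.5 m).
print(fuse(50.2, 0.05, 49.8, 0.5))  # ~ (50.20 m, 0.0498 m)
```

Either sensor failing still leaves a usable, if less precise, range estimate, which is the safety benefit of overlapping coverage.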
Ultrasonic waves suffer from strong attenuation in air beyond a few meters; therefore, ultrasonic sensors are primarily used for short-range object detection.
Cameras are cost-efficient, readily available sensors; however, they require significant processing to extract useful information and depend strongly on ambient light conditions. Cameras are unique in that they are the only one of these technologies that can “see color.” Cars with lane-keep assist, for example, use cameras to recognize lane markings.
LIDAR and radar share a broad set of common and complementary capabilities: both can map a vehicle’s surroundings as well as measure object velocity. Let’s compare the two technologies in several categories: