Anthony Vaughan
Humans have been driving vehicles for over a hundred years, with a less than stellar safety record. While vehicles are safer now than they have ever been, automotive accidents and fatalities have been increasing in many regions of the world. This trend can be attributed, in part, to the increasing number of distractions confronting drivers.
To continue increasing automotive safety, automobile engineers are incorporating advanced driver assistance systems (ADAS) capable of autonomously detecting and mitigating collisions before they occur. These systems can constantly and simultaneously monitor all objects in a 360-degree field of view around the vehicle and take evasive action if one of those objects presents an imminent threat. Figure 1 shows a graphical representation of detected objects around a vehicle.
At best, a human driver can monitor obstacles in only one direction at a time, while periodically glancing in the rearview and side-view mirrors to view objects around the vehicle. A human driver can also only roughly estimate the distance and speed of objects near their vehicle. Lidar-based ADAS use invisible laser light not only to simultaneously monitor all objects around a vehicle, but also to determine the distance and velocity of those objects with high accuracy, and they never become distracted. For more about lidar technology, see the white paper, An Introduction to Automotive Lidar.
Lidar systems have been available for over a decade; however, their size, complexity and cost have precluded their use in mainstream automobiles. Not too long ago, lidar systems cost more than $50,000 and were extremely bulky, consuming most of the roof area of a vehicle. With recent advances in component integration, lidar modules are now available for under $200, making them comparably priced with other sensor technologies such as camera and radar. Lidar modules such as the one shown in Figure 2 are now also small enough to be mounted discreetly in many areas, including behind the windshield and in headlights and taillights, enabling a sleeker design. An ADAS that integrates lidar in addition to camera and radar sensors can benefit from the strengths of all three sensor technologies.
Camera sensor technology can “see color,” which is vital in scenarios where a vehicle needs to differentiate between the colors of signal lights at traffic intersections. However, cameras struggle in environments with poor ambient lighting and have problems in inclement weather. Lidar modules, on the other hand, provide their own illumination and perform well when there is a lack of ambient light. Recent advances in lidar processing technology even enable some systems to distinguish differences in the color of objects.
Frequency-modulated continuous wave (FMCW) lidar architectures have very good immunity to adverse weather conditions such as fog, rain and snow. Radar sensors also perform well in adverse weather, but because of their much longer wavelength (about 4mm at 77GHz), they struggle to achieve the resolution necessary to resolve small features at long distances. Lidar systems use light with short 905nm to 1,550nm wavelengths and can sense small objects with high resolution at distances of 300m or more. FMCW lidar can also use the Doppler principle to simultaneously determine the distance and velocity of an object.
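As a rough illustration of that principle, the minimal Python sketch below recovers range and radial velocity from the up-ramp and down-ramp beat frequencies of a triangular FMCW chirp. It is not taken from the article or any product documentation; the chirp bandwidth, chirp duration, wavelength, sign convention and beat frequencies are all assumed example values.

```python
# Illustrative FMCW range/velocity recovery using a triangular chirp.
# All parameters are assumed example values, not specifications of any product.
C = 3.0e8              # speed of light, m/s
WAVELENGTH = 1550e-9   # laser wavelength, m (1,550nm, as mentioned above)
BANDWIDTH = 1.0e9      # optical chirp bandwidth, Hz (assumed)
CHIRP_TIME = 10e-6     # duration of one chirp ramp, s (assumed)

def range_and_velocity(f_beat_up, f_beat_down):
    """Split the two beat frequencies into a range term and a Doppler term.

    For a target approaching the sensor, the Doppler shift lowers the up-ramp
    beat frequency and raises the down-ramp beat frequency (sign convention
    assumed for this sketch).
    """
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-dependent component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler component
    distance = C * CHIRP_TIME * f_range / (2.0 * BANDWIDTH)
    velocity = WAVELENGTH * f_doppler / 2.0      # positive = approaching
    return distance, velocity

# Example beat frequencies of 10MHz (up-ramp) and 90MHz (down-ramp)
# correspond to a target about 75m away closing at roughly 31m/s.
d, v = range_and_velocity(10.0e6, 90.0e6)
print(f"range = {d:.1f} m, radial velocity = {v:.1f} m/s")
```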
One of the largest challenges for lidar developers is generating laser transmissions powerful enough to accurately detect objects at long distances while remaining eye safe. Even with a 1,550nm laser wavelength, it is possible to transmit enough optical power to cause eye damage.
A pulsed time-of-flight lidar system must transmit a powerful but short laser pulse to achieve long-distance measurements. Most laser drivers activate external gallium nitride (GaN) field-effect transistors (FETs) to produce current pulses several nanoseconds wide. TI’s LMH13000 integrated laser driver does not require an external GaN FET or large capacitor, and can drive lasers with rise and fall times of <800ps with less than 2% variation across temperature, as shown in Figure 3. Shorter laser pulses can achieve up to 30% longer distance measurements.
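For context, the short Python sketch below shows the basic time-of-flight arithmetic behind a pulsed measurement and why a narrower pulse helps resolve range; the round-trip time and pulse widths are assumed example values, not LMH13000 specifications.

```python
# Basic pulsed time-of-flight arithmetic (assumed example values).
C = 3.0e8  # speed of light, m/s

def distance_from_round_trip(t_round_trip):
    """Convert a measured round-trip time (seconds) into one-way distance (meters)."""
    return C * t_round_trip / 2.0

# A 2-microsecond round trip corresponds to a target roughly 300m away.
print(distance_from_round_trip(2.0e-6))  # -> 300.0

# A shorter pulse spans less range, making closely spaced returns easier to separate:
for pulse_width in (1.0e-9, 5.0e-9):
    print(f"{pulse_width * 1e9:.0f}ns pulse spans about {C * pulse_width / 2.0:.2f}m of range")
```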
The LMH13000 can accurately provide 50mA to 5A to a laser, while multiple LMH13000 devices used in parallel can drive lasers at currents above 5A. Since the laser driver requires neither an external FET nor large capacitors, the laser driver circuit can be one-fourth the size of a discrete solution. The LMH13000’s short pulse widths and precise current control enable the system to meet Class 1 FDA eye safety standards.
Discrete laser driver solutions can also exhibit pulse-duration variations of as much as 30% over temperature, which makes ensuring eye safety challenging as the temperature of the system changes. The output current of the LMH13000 varies by only 2% over the operating temperature of the device, improving the repeatability of measurements across temperature.
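As a back-of-the-envelope illustration of why this matters for eye safety, the sketch below assumes the optical pulse energy scales roughly with drive current multiplied by pulse width (a simplification of real laser diode behavior); only the 30% and 2% variation figures come from the text above.

```python
# Rough pulse-energy comparison (simplified model: energy ~ current x pulse width).
def worst_case_energy_ratio(current_variation, width_variation):
    """Worst-case pulse energy relative to nominal, given fractional variations."""
    return (1.0 + current_variation) * (1.0 + width_variation)

# Discrete solution: pulse duration can drift up to 30% over temperature.
print(worst_case_energy_ratio(0.0, 0.30))  # -> 1.30 (30% more energy than budgeted)

# Integrated driver: ~2% current variation with a stable pulse width.
print(worst_case_energy_ratio(0.02, 0.0))  # -> 1.02 (2% more energy than budgeted)
```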
A typical lidar module design includes several analog and digital subsystems. Figure 4 highlights the laser signal generation, light sensor, analog front-end and digital processing subsystems in the module. Eliminating external components such as GaN FETs and large capacitors from the laser driver circuit not only makes the system smaller and improves its performance, but also makes the implementation more cost-effective.
In an ADAS, lidar helps vehicles autonomously detect and mitigate collisions before they occur. As ADAS performance increases and cost and size decrease, it will become possible to build mainstream vehicles capable of driving without human intervention or oversight. Devices such as the LMH13000 make it easier for designers to create next-generation lidar systems that enable vehicles to sense in any environment.
All trademarks are the property of their respective owners.