Latest generation LiDAR sensors for automotive applications need robust climatic testing
With the proliferation of automotive safety systems and autonomous vehicle technology, the use of sensors is escalating, and the environments in which they are placed are significantly more hostile than static installations where climatic parameters are more stable.
For the current generation of externally mounted automotive sensors, there’s a need for continuous, consistent performance and an ability to cope with different operational climatic conditions. LiDAR components are a particularly vulnerable example. Similar in function to radar, LiDAR uses laser light for detection and ranging. It is often used for collision avoidance systems, lane-keeping assistance, dynamic cruise control and for object detection in fully autonomous vehicles. The technology has been refined significantly in recent years, and LiDAR equipment is now smaller and more accurate than previous generations.
Two Environmental Challenges
With smaller equipment being hidden in more obscure places so that vehicle aesthetics are not adversely affected, LiDAR sensors now find their home on the front valance, adjacent to fog lamps or embedded in the front grille, all locations that expose the device to road dirt, water ingress, temperature extremes and potential shock and vibration events. The equipment hardware therefore needs to be robust.
Functionally, the sensors need to detect objects and perform range-finding tasks at all light levels regardless of weather conditions, so they need to operate in rain, fog or falling snow, night and day. Such an operational challenge can’t be met without robust environmental testing.
Meeting the standard
Standards are being developed for LiDAR equipment in use on autonomous vehicles and working groups are examining such factors as range accuracy, precision and resolution, maximum and minimum range, detection probability, angular accuracy, precision and resolution, and reflectivity.
By way of establishing a benchmark and understanding the performance constraints of the current state of the art in LiDAR sensing, comparative testing has been performed as part of a study by the VTT Technical Research Centre in Finland that looked at comparative performance between the main manufacturers of LiDAR products. The study focused on the ability of the sensors to operate effectively and consistently in poor visibility.
The testing study, published in 2019 before the global pandemic struck, was split into two parts. The researchers at VTT decided to test the effects of two obscurants: fog and falling snow. Earlier tests on the effects of rainfall had shown consistent performance results with no significant reduction in the effectiveness of the sensors, so the new tests focused on more difficult environmental conditions.
[Ref: “Testing and Validation of Automotive Point-Cloud Sensors in Adverse Weather Conditions” by Maria Jokela, Matti Kutila and Pasi Pyykönen of VTT Technical Research Centre, Finland. Published in Applied Sciences 2019]
Fog testing was carried out in climatic chambers in the French city of Clermont-Ferrand, with repeatable fogging conditions created using natural agents rather than being chemical-based. The chambers were able to control fog particle size, visibility and rain intensity.
Calibrated target plates were set up with 90% reflectivity (white) and 5% reflectivity (black) and a benchmark performance established for each sensor at full visibility (without fog).
The sensors were tested at varying fog density levels over different target distances to gain an overall map of performance. Although each sensor had different specifications in terms of wavelength, field of view and resolution, they nonetheless performed similarly and predictably.
In most cases, once a target was detected through the fog, distance measurements were accurate, according to the report. However, target detection, particularly for the 5% reflectivity target over larger distances, was challenging for all the sensors.
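The dependence of detection on target reflectivity and fog density follows from the basic LiDAR range relationship: the returned signal falls with the square of distance and is attenuated exponentially on the two-way path through the obscurant (Beer-Lambert). The sketch below illustrates why the 5% target is so much harder to detect; the extinction coefficient is an assumed illustrative value, not a figure from the VTT study.

```python
import math

def received_power_fraction(distance_m, reflectivity, extinction_per_m):
    """Relative return signal: 1/R^2 geometric spreading loss times
    two-way exponential attenuation through the obscurant."""
    geometric = reflectivity / (distance_m ** 2)
    attenuation = math.exp(-2.0 * extinction_per_m * distance_m)
    return geometric * attenuation

# Compare the 90% (white) and 5% (black) calibrated targets at 30 m
# in moderate fog (extinction coefficient is an illustrative assumption).
white = received_power_fraction(30.0, 0.90, 0.05)
black = received_power_fraction(30.0, 0.05, 0.05)
ratio = white / black  # the white target returns ~18x more signal at any range
```

Because the range and fog terms are identical for both targets, the 18:1 reflectivity ratio carries through directly: the black target's return drops below the detection floor at a much shorter range than the white target's.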
Turbulent Snow Testing
The researchers at VTT resorted to outdoor testing with vehicle-mounted LiDAR for the turbulent snow tests, as they were unable to reproduce the vehicle dynamics and repeatable snow conditions in a chamber.
The tests at Sodankylä airport in the Lapland region of northern Finland involved the use of a test vehicle following another vehicle at a set distance, enveloping the front of the test car in turbulent snow. The test car itself also generated a powder snow cloud at its rear so worst case visibility was created around the car. The sensor output was analysed as a point cloud surrounding the sensor and reaching out to its expected range. In all cases, neither the powder snow cloud generated by the turbulence nor the leading vehicle was clearly detected, essentially rendering the car blind.
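Point-cloud analysis of this kind typically treats airborne snow as sparse, isolated returns scattered around the sensor, in contrast to the dense clusters produced by solid surfaces. A minimal sketch of one common clean-up step, a radius-based outlier filter, is shown below; this is an illustrative technique, not the pipeline used by VTT, and all numbers are synthetic.

```python
import numpy as np

def radius_outlier_filter(points, radius=0.5, min_neighbors=3):
    """Keep points that have at least `min_neighbors` other points within
    `radius` metres; isolated snowflake returns tend to fail this test.
    points: (N, 3) array of x, y, z coordinates in metres."""
    # Brute-force pairwise distances (fine for small clouds;
    # use a k-d tree for real sensor data).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    neighbor_counts = (dists < radius).sum(axis=1) - 1  # exclude self
    return points[neighbor_counts >= min_neighbors]

# Synthetic cloud: a dense cluster (a real surface about 10 m ahead)
# plus scattered "snow" returns close to the sensor.
surface = np.random.default_rng(0).normal([10, 0, 0], 0.1, size=(50, 3))
snow = np.random.default_rng(1).uniform(-5, 5, size=(10, 3))
cloud = np.vstack([surface, snow])
cleaned = radius_outlier_filter(cloud)  # the sparse snow returns are removed
```

In heavy turbulent snow the distinction breaks down: when the cloud of airborne returns becomes dense enough, the clutter no longer looks like isolated outliers, which is consistent with the sensors being rendered effectively blind in the VTT tests.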
Seeing through obscurants
The tests performed in France and Finland, using the technology available at the time, did not establish an effective visibility threshold for fog or snow beyond which the sensors became ineffective.
The conclusions drawn were that turbulent snow blinded all the sensors, and that increasing fog density and target distance reduced the effectiveness of the sensors, but no agreement could be reached on the conditions under which an autonomous vehicle could not be driven due to obscured sensor vision.
Further work was recommended on understanding the concept of sensor fusion, in which LiDAR would be used with other technologies (such as radar) to increase confidence in object detection and ranging in adverse weather.
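The intuition behind fusion is that sensors with independent failure modes rarely fail together: fog degrades LiDAR far more than radar. A toy sketch of the simplest fusion rule, combining independent per-sensor detection probabilities, is given below; the probabilities are illustrative assumptions, and production systems use far more sophisticated methods such as Kalman or Bayesian filtering over tracked objects.

```python
def fused_detection_confidence(p_lidar, p_radar):
    """Combine detection probabilities from two sensors, assuming their
    failures are independent: the object is missed only if both miss it."""
    return 1.0 - (1.0 - p_lidar) * (1.0 - p_radar)

# In heavy fog, LiDAR confidence drops sharply while radar is largely
# unaffected, so the fused estimate stays usable (illustrative numbers).
fused = fused_detection_confidence(0.30, 0.90)  # ~0.93
```

Even with LiDAR confidence reduced to 0.30, the fused detection probability stays above either sensor alone, which is the basic argument for pairing LiDAR with radar in adverse weather.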
LiDAR manufacturers have since pushed forward the state of the art, and a new generation of sensors is now available that overcomes some of the problems outlined in the tests. One supplier, Ouster, has recently launched its system-on-chip (SoC) version, the L2X, with a higher-sensitivity photodetector and an on-chip digital signal processor that is capable of counting over one trillion photons per second and delivers double the data rate of prior models.
The sensor can withstand high levels of shock, vibration, solar interference, water and dust while reliably providing high-quality data. Ouster says it can now provide even richer point cloud data that improves the sensor’s ability to detect objects through environmental obscurants such as rain, fog, dust, snow and even wire fences, enabling confident deployment of its systems for all-weather performance.
According to Mark Frichtl, Chief Technology Officer at Ouster, the company’s sensors must be able to perform in real-world operating conditions across hundreds of use cases.
“Whether it’s a robotaxi driving on a foggy morning, an excavator operating in a dusty construction zone or a last-mile delivery robot navigating through a steam vent on an NYC sidewalk, our sensors must not only be mechanically reliable and robust, but also reliably output high-quality data,” he says.
The new L2X chip was designed to perform better in all types of weather conditions using new-generation digital LiDAR, giving Ouster the ability to reliably detect objects behind obscurants.
LiDAR to Power Agnostic Autonomous Platform
The OS LiDAR sensor technology from Ouster will provide the vision for the TONY AV retrofit kit from autonomous vehicle provider, Perrone Robotics.
Ouster’s OS sensors will be integrated into a configuration of Perrone’s patented vehicle and hardware agnostic platform that can be embedded into many vehicle types for transit and transport applications. Perrone intends to use the Ouster configuration for a wide range of vehicles including the Local Motors Olli shuttle, GreenPower Motor Company AV Star, low speed electric passenger and utility vehicles, and a line of warehouse-based vehicles for logistics operations.
According to the company’s CEO, Paul Perrone, they aim to deploy the AV kit at fleet scale for shuttles, low-speed electric vehicles, cargo vans, and large logistics trucks for municipal, government and commercial customers in the United States, Europe and Asia.
“Ouster stood out for its performance characteristics and the ability to scale for fleet-level AV deployments. Using the technology, we expect our platform will increase operational efficiency, enhance safety and improve accessibility for end-users,” he says.