Jonathan Newell examines the role of sensors in the race to gain the upper hand in autonomous transport.
The Consumer Electronics Show (CES) is held annually to showcase the latest glamorous consumer technology in appropriately glitzy Las Vegas. At last year’s event, the consumer technology press noted with some surprise that it was becoming less of a shop window for tech bling and computing equipment and more of an alternative motor show.
With advances in infotainment, connected motoring, electronic automotive content and vehicle autonomy, the car may have been a means of getting from A to B in the past, but it is now the single biggest investment in consumer electronics that people are likely to make.
CES is therefore as important to the auto industry as other major motoring events, and the race was on this year to be at the forefront of autonomy, with all the automotive giants in attendance alongside suppliers of the associated enabling technology.
At home in Nevada
Hyundai is now in its third year of testing autonomous vehicles on public roads, having been granted a licence by the state of Nevada in 2015. Now, the Korean company is back in Las Vegas releasing its new Nexo model, which combines consumer-ready autonomous technology with advanced driver-assist systems that provide control functions in place of the driver where required.
There is still some way to go before Hyundai can market a fully autonomous SAE Level 4 vehicle, but the company is confident that it can do so by 2021 with the help of autonomous control specialist Aurora, with which it has recently partnered.
According to Hyundai, the fuel-cell-powered Nexo is well placed to deliver the consistent electrical energy needed to reliably power the sensors and autonomous control systems required for full autonomy.
“The fuel-cell powertrain will offer an ideal platform to implement autonomous driving technologies, which requires a massive amount of power to support the large amount of data communication as well as the operation of hardware such as sensors,” comments Dr Woong Chul Yang, Vice Chairman of Hyundai Motor.
Blending in with the crowd
Toyota has long been one of the automotive industry’s prime movers in driverless technology, and the company has already reached the stage of refining the design of its vehicles to blend in with other vehicles on the road rather than standing out with unattractive accoutrements such as protruding antennas, sensors and 360-degree camera pods.
The result is “Lexus Platform 3.0”, a prototype vehicle that integrates automated vehicle technology with harmonized styling. The Toyota Research Institute (TRI) has been using the Lexus brand in the form of the LS 600 model for intensive driverless testing in Tokyo but CES 2018 in January was the first time the company revealed its latest platform with body-contoured sensor technology.
The objectives of TRI were to improve the perception capabilities of the car, blend the sensing equipment into the vehicle design and package the technology in a way that could be manufactured in production volumes.
According to TRI, Platform 3.0 has a very sensor-rich package that makes it one of the most perceptive automated driving test cars on the road. The Luminar LiDAR system with 200-meter range, which had only tracked the forward direction on TRI’s previous test platform, now covers the vehicle’s complete 360-degree perimeter. This is enabled by four high-resolution LiDAR scanning heads, which precisely detect objects in the environment including notoriously difficult-to-see dark objects.
Shorter-range LiDAR sensors are positioned low on all four sides of the vehicle: one in each front quarter panel and one each on the front and rear bumpers. These can detect low-level and smaller objects near the car, such as children and debris in the roadway.
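The distances involved translate directly into timing requirements for the sensor electronics. As a back-of-envelope sketch only (based on the article’s quoted 200-metre range and basic physics, not on Luminar’s actual specifications), the round-trip time of a LiDAR pulse and the resulting ceiling on unambiguous pulse rate can be estimated as follows:

```python
# Back-of-envelope time-of-flight estimate for a LiDAR pulse.
# Only the 200 m range figure comes from the article; the rest is physics.

C = 299_792_458  # speed of light, m/s

def round_trip_time_us(distance_m: float) -> float:
    """Time in microseconds for a pulse to reach a target and return."""
    return 2 * distance_m / C * 1e6

def max_unambiguous_rate_hz(distance_m: float) -> float:
    """Highest pulse rate at which a return from maximum range still
    arrives before the next pulse fires (ignores processing dead time)."""
    return C / (2 * distance_m)

print(f"{round_trip_time_us(200):.2f} us round trip")
print(f"{max_unambiguous_rate_hz(200) / 1e3:.0f} kHz pulse-rate ceiling")
```

At 200 m the round trip is roughly 1.33 microseconds, which is why per-point timing in long-range LiDAR has to be resolved at nanosecond scales.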
Toyota has gone one step further by incorporating the LiDAR systems into the coachwork design, eliminating the “spinning bucket” appearance that has always been associated with the technology.
Autonomous World Tour
Daimler used CES 2018 to mark the end of its “Autonomous World Tour”, conducted across five continents over the last 12 months in a modified Mercedes S-Class. Daimler described the USA as a fitting finale, offering the kind of road conditions that pose a significant challenge to the sensors and control systems of driverless cars.
With the need to recognize school buses and their flashing light codes, as well as road signs and lane markings that vary from state to state and differing styles of high-occupancy-lane identifiers, driving automation features need the ability to adapt to different conditions within the boundaries of a single country.
As well as major automotive manufacturers, CES 2018 was attended by companies without which the car makers would struggle to meet the demands of autonomy.
Image sensing and miniaturization were key themes for technology giants such as Intel and ON Semiconductor, both of whom were at the show to demonstrate their ability to squeeze ever-increasing perceptive power into smaller packages.
The ON Semiconductor range of CMOS image sensors has been modified to generate greater pixel density to identify finer detail combined with low-level light performance and high dynamic range to enable the sensor to cope with rapidly changing light conditions, such as the strobe effect of flashes of sunlight through an avenue of trees.
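Dynamic range of this kind is conventionally quoted in decibels as the ratio between the largest and smallest signals a pixel can resolve. A minimal sketch of that arithmetic follows; the full-well and noise figures are illustrative assumptions, not ON Semiconductor specifications:

```python
import math

def dynamic_range_db(full_well_e: float, noise_floor_e: float) -> float:
    """Dynamic range in dB: 20*log10(largest / smallest detectable signal),
    both expressed in electrons."""
    return 20 * math.log10(full_well_e / noise_floor_e)

# Illustrative values only: a 10,000 e- full well and a 2 e- noise floor
# give about 74 dB for a single exposure.
single = dynamic_range_db(10_000, 2)

# Multi-exposure HDR schemes extend this by combining captures of different
# lengths; e.g. an additional 16x exposure ratio adds 20*log10(16) ~ 24 dB.
extended = single + 20 * math.log10(16)

print(f"single exposure: {single:.1f} dB, with HDR: {extended:.1f} dB")
```

This is why coping with abrupt transitions, such as sunlight strobing through trees, pushes automotive sensors toward multi-exposure techniques rather than single-shot capture.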
To meet the restrictive space requirements, ON Semiconductor has developed new wafer-stacking technology to reduce the package size.
The Mobileye wing of Intel has developed a System on Chip (SoC) package for use with CMOS sensors to handle the vast amounts of data travelling from the sensors to the vehicle control system. This task needs to be performed at extremely low latency and without packet loss to enable safe autonomous control: a massive challenge for the industry, and one that Intel is actively engaged in with partner motor manufacturers including BMW, Nissan and Volkswagen.
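The scale of the data problem is easy to see with rough arithmetic. The camera parameters below are assumptions for illustration only (not Mobileye- or Intel-specific figures): a handful of uncompressed high-resolution sensors quickly reaches multiple gigabits per second.

```python
def camera_bandwidth_gbps(width: int, height: int,
                          bits_per_px: int, fps: int) -> float:
    """Raw, uncompressed data rate of one camera in gigabits per second."""
    return width * height * bits_per_px * fps / 1e9

# Assumed example configuration: eight 1080p cameras, 12-bit raw, 30 fps.
per_camera = camera_bandwidth_gbps(1920, 1080, 12, 30)
total = 8 * per_camera

print(f"{per_camera:.2f} Gb/s per camera, {total:.1f} Gb/s for eight cameras")
```

Even before radar and LiDAR streams are added, sustaining this throughput at low latency is what drives the move from general-purpose processors to dedicated SoCs.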
AI-enabled LiDAR
One company, AEye, is taking LiDAR sensing technology a stage further by introducing artificial intelligence into the equation. iDAR (Intelligent Detection and Ranging) is a robotic perception system that allows sensors to mimic the visual cortex, bringing real-time intelligence to data collection. As a result, the system not only captures everything in a scene but also applies higher resolution to key objects, exceeding the speeds and distances the industry requires.
“By solving the limitations of first generation LiDAR-only products, AEye is enabling the safe, timely rollout of failsafe commercial autonomous vehicles,” says Luis Dussan, founder and CEO of AEye.