Where Does Radar Sit in the Centralization of the ADAS Architecture?

This market research report was originally published at Yole Group’s website. It is reprinted here with the permission of Yole Group.

The road to autonomy requires higher-performance sensors and a more centralized architecture

“The path toward car autonomy is clearly ongoing, with new autonomy functions added regularly. It started in the 2010s with basic functionalities such as ACC and AEB, it is still in progress today with functionalities such as Highway Pilot, and we expect it to continue in the coming years with functions such as City Pilot, where a car can be fully autonomous within a specific area.”
Adrien Sanchez
Technology and Market Analyst, Computing and Software, Yole Intelligence (part of Yole Group)

This continuous addition of new functions directly implies a need for more sensors of a wider diversity. More ADAS cameras are needed for accurate detection and classification of objects all around the car, as well as for lane and traffic-sign detection. Radars are key to reliably detecting any object around the car, and LiDARs are increasingly added to improve the positioning precision of detected objects and the accuracy of real-time mapping.

More sensors, more data, and more software have a direct consequence: more centralization

“On top of this growing number of sensors, which also tend to have higher and higher resolution, the software complexity is increasing sharply. Autonomous driving in an open world is a very difficult problem and reaching a level of security high enough to convince people to put their safety in the hands of a machine is incredibly complex.”
Pierrick Boulay
Senior Technology and Market Analyst, Photonics and Sensing Division, Yole Intelligence

As we have seen, this has led to a multiplication of sensors and a growing number of software layers to understand the environment more accurately, as well as to the introduction of redundancy to prevent crashes caused by system failure. Handling this growing volume of data and pipeline complexity has dramatically increased the computing power required. This has a direct impact on car architecture, driving a shift from a decentralized design with many small MCUs to a centralized design with a few powerful processors in an ADAS domain controller. Centralization is clearly the next step, given the need for sensor fusion, and with the growing number of sensors, nobody wants to keep a model with one processor per sensor. So, the only question remaining is,

What will be the pace of the transformation?
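As an aside, the centralized, object-level fusion that motivates this shift can be sketched in a few lines. This is a deliberately minimal illustration, not any OEM's actual pipeline; the sensor names and measurement variances are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # "camera", "radar", or "lidar" (illustrative labels)
    x: float         # longitudinal position estimate of one object (m)
    var: float       # assumed measurement variance (m^2)

def fuse(detections):
    """Inverse-variance weighted fusion of position estimates for one object.

    In a centralized architecture, detections from every sensor reach one
    domain controller, which fuses them, instead of each sensor's own ECU
    tracking objects independently.
    """
    weights = [1.0 / d.var for d in detections]
    x = sum(w * d.x for w, d in zip(weights, detections)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused estimate is tighter than any input
    return x, fused_var

# One object seen by three sensors; radar is most precise longitudinally.
obs = [
    Detection("camera", 52.0, 4.0),
    Detection("radar", 50.2, 0.25),
    Detection("lidar", 50.5, 0.5),
]
x, var = fuse(obs)
```

The fused position lands close to the radar estimate (the lowest-variance sensor), and the fused variance is smaller than any single sensor's — the basic payoff a domain controller gets from seeing all sensors at once.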

An evolving mission, big consequences

“At present, radars are used as intelligent sensors with onboard processing that outputs a classified object list, though one limited in the number of targets. This approach enables basic ADAS functionalities such as AEB and ACC to be deployed. As the use cases grow in complexity (think of ALC), and as the car-rating scenarios do too, the mission of radar sensors is evolving.”
CĂ©dric Malaquin
Team Lead Analyst, RF Activity, Power and Wireless Division, Yole Intelligence

This is no longer about providing range and velocity for a small number of objects. Radar sensors are evolving to perceive the full scene around the car. The goal is free-space mapping by radar alone, for obvious reasons of redundancy. With such a sensor, OEMs will have access to path planning at any time, in any driving scenario. Centralization seems the obvious choice to bridge the gap, as it resonates with resource optimization. But it is also a massive change in architecture, raising multiple questions such as the partitioning of radar signal modulation, data processing, data transport, and even data fusion. In the meantime, as these questions are clarified, edge processing has room to evolve beyond its current capabilities. In any case, the importance of software in radar sensing is growing, and multiple industry players are positioning themselves for one approach or the other. It will be interesting to track how this industry evolves over the next few years.
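The free-space mapping idea can be illustrated with a toy occupancy grid: each radar detection marks its cell as occupied, and the cells the beam traversed on the way there as free. This is a minimal single-sensor sketch with invented coordinates and no probabilistic updates, not a production algorithm:

```python
import math

def free_space_grid(detections, size=20, cell=1.0):
    """Tiny 2D occupancy sketch: 0 = unknown, 1 = free, 2 = occupied.

    Each detection is (range_m, azimuth_rad) relative to a sensor placed
    at the bottom-center of the grid, looking "up" (+y).
    """
    grid = [[0] * size for _ in range(size)]
    ox, oy = size // 2, 0  # sensor cell
    for rng, az in detections:
        # Target position in grid units.
        tx = ox + rng * math.sin(az) / cell
        ty = oy + rng * math.cos(az) / cell
        # Walk the ray from sensor to target, marking traversed cells free.
        steps = int(max(abs(tx - ox), abs(ty - oy)))
        for s in range(steps):
            fx = int(ox + (tx - ox) * s / steps)
            fy = int(oy + (ty - oy) * s / steps)
            if 0 <= fx < size and 0 <= fy < size:
                grid[fy][fx] = 1
        # Mark the detection cell occupied.
        ix, iy = int(tx), int(ty)
        if 0 <= ix < size and 0 <= iy < size:
            grid[iy][ix] = 2
    return grid

# Two detections: one dead ahead at 8 m, one at 12 m, 0.5 rad to the right.
grid = free_space_grid([(8.0, 0.0), (12.0, 0.5)])
```

A real radar free-space map would fuse many noisy detections per scan with probabilistic (e.g. log-odds) cell updates, but the structure — occupied cells plus ray-traced free space — is the same.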

Acronyms

  • ACC: Adaptive Cruise Control
  • AEB: Autonomous Emergency Braking
  • ADAS: Advanced Driver-Assistance System
  • ALC: Automatic Lane Change
