“Addressing Tomorrow’s Sensor Fusion and Processing Needs with Cadence’s Newest Processors,” a Presentation from Cadence

Amol Borkar, Product Marketing Director at Cadence, presents the “Addressing Tomorrow’s Sensor Fusion and Processing Needs with Cadence’s Newest Processors” tutorial at the May 2024 Embedded Vision Summit.

From ADAS to autonomous vehicles to smartphones, the number and variety of sensors used in edge devices are increasing: radar, LiDAR, time-of-flight sensors and multiple cameras are increasingly common. As sensors have improved, the data rates associated with them have also grown. Traditionally, a dedicated processor has been used to process data from each sensor independently. Today, however, there is a growing need for a single, unified processor capable of processing multimodal sensor data using both classical and AI algorithms and implementing sensor fusion for robust perception.

In this talk, Borkar introduces the new Vision 341 DSP and Vision 331 DSP from Cadence. These cores provide a versatile single-DSP solution for various workloads, including image sensing, radar, LiDAR and AI tasks. He explores the architecture of these new processors, highlights their performance and efficiency and outlines the associated developer tools and software building blocks.

See here for a PDF of the slides.

