Automotive

“Making Cars That See — Failure is Not an Option,” a Presentation from Synopsys

Burkhard Huhnke, Vice President of Automotive Strategy for Synopsys, presents the "Making Cars That See — Failure is Not an Option" tutorial at the May 2019 Embedded Vision Summit. Drivers are the biggest source of uncertainty in the operation of cars. Computer vision is helping to eliminate human error and make the roads safer. But 14 years […]

“Automotive Vision Systems — Seeing the Way Forward,” a Presentation from Strategy Analytics

Ian Riches, Executive Director for Global Automotive Practice at Strategy Analytics, presents the "Automotive Vision Systems — Seeing the Way Forward" tutorial at the May 2019 Embedded Vision Summit. It was not long ago that cameras were a rarity on all but luxury cars. In 2018, as many automotive cameras were shipped as were vehicles! Riches' […]

“Shifts in the Automated Driving Industry,” a Presentation from AImotive

László Kishonti, CEO of AImotive, presents the "Shifts in the Automated Driving Industry" tutorial at the May 2019 Embedded Vision Summit. 2018 will have a lasting effect on the self-driving industry, as key stakeholders have turned from the unattainable goal of full autonomy by 2021 to more realistic development and productization roadmaps. This will in […]

May 2019 Embedded Vision Summit Slides

The Embedded Vision Summit was held on May 20-23, 2019 in Santa Clara, California, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in […]

Multi-sensor Fusion for Robust Device Autonomy

While visible light image sensors may be the baseline "one sensor to rule them all" included in all autonomous system designs, they're not necessarily a sole panacea. By combining them with other sensor technologies: "situational awareness" sensors (standard and high-resolution radar, LiDAR, infrared and UV, ultrasound and sonar, etc.) and "positional awareness" sensors such as […]
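To make the fusion idea concrete, here is a minimal sketch (not from the article) of the simplest form of multi-sensor fusion: combining two noisy range estimates, say from a camera and a radar, by inverse-variance weighting, so the more certain sensor dominates the fused result. The sensor values and variances are illustrative assumptions.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two noisy estimates of the same quantity.

    Each estimate is weighted by the inverse of its variance, so the
    more certain sensor dominates. Returns (fused_estimate, fused_variance);
    the fused variance is always smaller than either input variance.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Hypothetical readings: a camera depth estimate (noisy in range) and a
# radar range estimate (precise in range) for the same object.
dist, var = fuse(10.0, 4.0, 10.6, 0.25)
```

The fused estimate lands much closer to the radar's reading (the lower-variance sensor), which is exactly the behavior a fusion stack relies on: each modality compensates for the others' weaknesses.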

Mercedes-Benz, NVIDIA to Create New AI Architecture for Mercedes Vehicles

NVIDIA’s Jensen Huang and Mercedes’ Sajjad Khan unveil a vision for software-defined AI cars integrating self-driving and intelligent cockpits. Mercedes-Benz announced today it has selected NVIDIA to help realize its vision for next-generation vehicles. Speaking to a packed crowd at the Mercedes-Benz booth on the first day of CES 2019, Mercedes-Benz Executive Vice President Sajjad Khan and NVIDIA […]

“Understanding Automotive Radar: Present and Future,” a Presentation from NXP Semiconductors

Arunesh Roy, Radar Algorithms Architect at NXP Semiconductors, presents the “Understanding Automotive Radar: Present and Future” tutorial at the May 2018 Embedded Vision Summit. Thanks to its proven, all-weather range detection capability, radar is increasingly used for driver assistance functions such as automatic emergency braking and adaptive cruise control. Radar is considered a crucial sensing […]

“The Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the “Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge” tutorial at the May 2018 Embedded Vision Summit. Regardless of the processing topology (distributed, centralized or hybrid), sensor processing in automotive is an edge compute problem. However, with […]

“Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020,” a Presentation from Algolux

Felix Heide, CTO and Co-founder of Algolux, presents the “Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020” tutorial at the May 2018 Embedded Vision Summit. ADAS and autonomous driving systems rely on sophisticated sensor, image processing and neural-network based perception technologies. This has resulted in effective driver assistance capabilities and […]

“Computer Vision Hardware Acceleration for Driver Assistance,” a Presentation from Bosch

Markus Tremmel, Chief Expert for ADAS at Bosch, presents the “Computer Vision Hardware Acceleration for Driver Assistance” tutorial at the May 2018 Embedded Vision Summit. With highly automated and fully automated driver assistance systems just around the corner, next-generation ADAS sensors and central ECUs will have much higher safety and functional requirements to cope […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411