Videos

“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-On Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]

Aldec Demonstration of Software/Hardware Co-Verification using Riviera-PRO

Henry Chan, Application Engineer at Aldec, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Chan demonstrates the company's latest software/hardware co-verification technology, showing how production-level software running in QEMU can drive RTL hardware models simulated in Riviera-PRO. This approach promotes agile development between […]

Aldec Demonstration of ADAS and Face/Eye Detection using SoC FPGAs

Farhad Fallahlalehzari, Application Engineer at Aldec, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Fallahlalehzari demonstrates the acceleration of ADAS and face/eye detection using Aldec's embedded development/prototyping boards, showing how Aldec's TySOM embedded development boards can improve the performance of embedded vision […]

“Computer Vision and Machine Learning at the Edge,” a Presentation from Qualcomm Technologies

Michael Mangan, a member of the Product Management staff at Qualcomm Technologies, presents the "Computer Vision and Machine Learning at the Edge" tutorial at the May 2017 Embedded Vision Summit. Computer vision and machine learning techniques are applied to myriad use cases in smartphones today. As mobile technology expands beyond the smartphone vertical, both technologies […]

“Deep Learning and CNN for Embedded Vision,” a Video from Synopsys

This video, one in a series published by Alliance member company Synopsys, explains how machines use deep learning for complex tasks in automotive ADAS, surveillance, augmented reality, and other applications. Deep learning is a mathematical way to model abstract data and, in Synopsys' opinion, is quickly becoming a requirement for vision processors.

“Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit. A diverse set of sensor technologies is available and emerging to provide vehicle autonomy or driver assistance. These sensor technologies often […]

“Implementing an Optimized CNN Traffic Sign Recognition Solution,” a Presentation from NXP Semiconductors and Au-Zone Technologies

Rafal Malewski, Head of the Graphics Technology Engineering Center at NXP Semiconductors, and Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, present the "Implementing an Optimized CNN Traffic Sign Recognition Solution" tutorial at the May 2017 Embedded Vision Summit. Now that the benefits of using deep neural networks for image classification are well known, the […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411