Optical Character Recognition

“Visual AI Enables Autonomous Security,” an Interview with Knightscope

William “Bill” Santana Li, Co-founder, Chairman and CEO of Knightscope, talks with Vin Ratford, Executive Director of the Embedded Vision Alliance, for the “Visual AI Enables Autonomous Security” interview at the May 2019 Embedded Vision Summit. Knightscope, a physical security technologies company based in Silicon Valley, develops and sells a line of autonomous robots that […]

May 2019 Embedded Vision Summit Slides

The Embedded Vision Summit was held on May 20-23, 2019 in Santa Clara, California, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in…

May 2018 Embedded Vision Summit Slides

The Embedded Vision Summit was held on May 21-24, 2018 in Santa Clara, California, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in…

“Vision Challenges in a Robotic Power Tool,” a Presentation from Shaper Tools

Alec Rivers, co-founder of Shaper Tools, presents the "Vision Challenges in a Robotic Power Tool" tutorial at the May 2017 Embedded Vision Summit. Shaper Tools has developed a first-of-its-kind robotic power tool enabled by embedded vision. Vision is used to track the tool's orientation in 3D at 100 Hz to an accuracy of 0.01 inches…
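
For readers curious how vision-based tool tracking of this kind can work in principle, here is a minimal, hypothetical sketch (not Shaper's actual pipeline) that recovers a pose from the detected corners of a known fiducial marker using OpenCV's solvePnP. The camera intrinsics, marker size, and detected corner coordinates are all assumptions made up for the example.

```python
# Illustrative only: marker-based pose estimation with OpenCV's solvePnP.
# This is NOT Shaper's implementation; camera intrinsics, marker geometry,
# and the detector that supplies image_points are placeholder assumptions.
import numpy as np
import cv2

# Assumed camera intrinsics (would come from a real calibration).
camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

# 3D corners of one known fiducial marker, 20 mm square, in its own frame.
marker_size = 0.020  # meters
object_points = np.array([[-marker_size / 2,  marker_size / 2, 0.0],
                          [ marker_size / 2,  marker_size / 2, 0.0],
                          [ marker_size / 2, -marker_size / 2, 0.0],
                          [-marker_size / 2, -marker_size / 2, 0.0]])

def estimate_pose(image_points: np.ndarray):
    """Return (rotation_matrix, translation_vector) of the marker relative to
    the camera, given its four corners detected in the image (pixels)."""
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)  # axis-angle -> 3x3 rotation matrix
    return rotation, tvec

# Example with made-up corner detections; in a real system these would come
# from a fiducial detector running on every camera frame (e.g. ~100 Hz).
detected_corners = np.array([[610.0, 340.0],
                             [670.0, 342.0],
                             [668.0, 402.0],
                             [608.0, 400.0]])
pose = estimate_pose(detected_corners)
if pose is not None:
    rotation, translation = pose
    print("translation (m):", translation.ravel())
```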

“How to Test and Validate an Automated Driving System,” a Presentation from MathWorks

Avinash Nehemiah, Product Marketing Manager for Computer Vision at MathWorks, presents the "How to Test and Validate an Automated Driving System" tutorial at the May 2017 Embedded Vision Summit. Have you ever wondered how ADAS and autonomous driving systems are tested? Automated driving systems combine a diverse set of technologies and engineering skill sets from…

“PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems,” a Presentation from XIMEA

Max Larin, CEO of XIMEA, presents the "PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Larin provides an overview of existing camera interfaces for embedded systems and explores their strengths and weaknesses. He also examines the differences between integration of a sensor…
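
As a rough, back-of-envelope illustration of why interface bandwidth matters for multi-camera systems, the sketch below compares the raw data rate of a single hypothetical camera against an approximate PCI Express link budget. The sensor parameters and per-lane throughput figures are assumptions for the example, not numbers from the presentation.

```python
# Back-of-envelope camera bandwidth check (illustrative numbers only; the
# sensor and link parameters below are assumptions, not figures from the talk).

def camera_data_rate_mb_s(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) sensor data rate in megabytes per second."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps / 1e6

# Example: a 12-megapixel sensor, 12-bit raw output, 60 frames per second.
rate = camera_data_rate_mb_s(4096, 3072, 12, 60)
print(f"one camera: ~{rate:.0f} MB/s")            # ~1132 MB/s

# Approximate usable PCIe throughput after encoding overhead:
# Gen2 ~500 MB/s per lane, Gen3 ~985 MB/s per lane.
pcie_gen2_x4 = 4 * 500
print(f"PCIe Gen2 x4 budget: ~{pcie_gen2_x4} MB/s, "
      f"cameras supported: {pcie_gen2_x4 // rate:.0f}")
```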

“Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library,” a Presentation from MVTec

Olaf Munkelt, Co-founder and Managing Director at MVTec Software GmbH, presents the "Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Munkelt demonstrates how easy it is to develop an embedded vision (identification) application based on the HALCON Embedded standard software…
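
HALCON Embedded is proprietary, so as a loose, non-HALCON illustration of what a basic identification (OCR) step can look like, here is a minimal sketch using the open-source OpenCV and Tesseract (pytesseract) libraries instead; the input file name and preprocessing choices are assumptions made up for the example.

```python
# Not HALCON: a minimal sketch of a comparable OCR/identification step using
# open-source tools (OpenCV + Tesseract via pytesseract).
import cv2
import pytesseract

image = cv2.imread("label.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
if image is None:
    raise FileNotFoundError("label.png not found")

# Simple preprocessing: binarize so the characters stand out from the background.
_, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Run the OCR engine on the cleaned-up image.
text = pytesseract.image_to_string(binary)
print("recognized text:", text.strip())
```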

“Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit. A diverse set of sensor technologies is available and emerging to provide vehicle autonomy or driver assistance. These sensor technologies often…

“Implementing an Optimized CNN Traffic Sign Recognition Solution,” a Presentation from NXP Semiconductors and Au-Zone Technologies

Rafal Malewski, Head of the Graphics Technology Engineering Center at NXP Semiconductors, and Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, present the "Implementing an Optimized CNN Traffic Sign Recognition Solution" tutorial at the May 2017 Embedded Vision Summit. Now that the benefits of using deep neural networks for image classification are well known, the…
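
To make the idea of a CNN-based traffic sign classifier concrete, here is a minimal, illustrative PyTorch sketch (not the optimized network described in the presentation); the layer sizes and the 43-class output, as in the public GTSRB dataset, are assumptions for the example.

```python
# Illustrative only: a small CNN classifier of the kind used for traffic sign
# recognition (43 classes as in the public GTSRB dataset). Layer sizes are
# assumptions for the sketch, not the architecture from the presentation.
import torch
import torch.nn as nn

class TrafficSignNet(nn.Module):
    def __init__(self, num_classes: int = 43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                  # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Sanity check with a dummy batch of 32x32 RGB sign crops.
model = TrafficSignNet()
logits = model(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 43])
```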

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
