Object Tracking

“Vision Challenges in a Robotic Power Tool,” a Presentation from Shaper Tools

Alec Rivers, co-founder of Shaper Tools, presents the "Vision Challenges in a Robotic Power Tool" tutorial at the May 2017 Embedded Vision Summit. Shaper Tools has developed a first-of-its-kind robotic power tool enabled by embedded vision. Vision is used to track the tool's orientation in 3D at 100 Hz to an accuracy of 0.01 inches […]


“Blending Cloud and Edge Machine Learning to Deliver Real-time Video Monitoring,” a Presentation from Camio

Carter Maslan, CEO of Camio, presents the "Blending Cloud and Edge Machine Learning to Deliver Real-time Video Monitoring" tutorial at the May 2017 Embedded Vision Summit. Network cameras and other edge devices are collecting ever-more video – far more than can be economically transported to the cloud. This argues for putting intelligence in edge devices. […]


“How to Test and Validate an Automated Driving System,” a Presentation from MathWorks

Avinash Nehemiah, Product Marketing Manager for Computer Vision at MathWorks, presents the "How to Test and Validate an Automated Driving System" tutorial at the May 2017 Embedded Vision Summit. Have you ever wondered how ADAS and autonomous driving systems are tested? Automated driving systems combine a diverse set of technologies and engineering skill sets from […]


“Adventures in DIY Embedded Vision: The Can’t-miss Dartboard,” a Presentation from Mark Rober

Engineer, inventor and YouTube personality Mark Rober presents the "Adventures in DIY Embedded Vision: The Can’t-miss Dartboard" tutorial at the May 2017 Embedded Vision Summit. Can a mechanical engineer with no background in computer vision build a complex, robust, real-time computer vision system? Yes, with a little help from his friends. Rober fulfilled a three-year […]


“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitization of the world is becoming increasingly important. This additional dimension of information allows more real-world perception challenges to be […]


“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-On Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]


“Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit. A diverse set of sensor technologies is available and emerging to provide vehicle autonomy or driver assistance. These sensor technologies often […]


“Video Cameras Without Video: Opportunities For Sensing With Embedded Vision,” a Presentation from Michael Tusch

Michael Tusch presents the "Video Cameras Without Video: Opportunities For Sensing With Embedded Vision" tutorial at the May 2017 Embedded Vision Summit. Within the next few years, network cameras will cease to be regarded primarily as image capture devices. They will instead transform into intelligent data capture nodes whose functionality will in many, but not […]


Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products.
