Object Identification

“Designing a Vision-based, Solar-powered Rear Collision Warning System,” a Presentation from Pearl Automation

Aman Sikka, Vision System Architect at Pearl Automation, presents the "Designing a Vision-based, Solar-powered Rear Collision Warning System" tutorial at the May 2017 Embedded Vision Summit. Bringing vision algorithms into mass production requires carefully balancing trade-offs between accuracy, performance, usability, and system resources. In this talk, Sikka describes the vision algorithms along with the system […]

“Designing a Stereo IP Camera From Scratch,” a Presentation from ELVEES

Anton Leontiev, Embedded Software Architect at ELVEES, JSC, presents the "Designing a Stereo IP Camera From Scratch" tutorial at the May 2017 Embedded Vision Summit. As the number of cameras in an intelligent video surveillance system increases, server processing of the video quickly becomes a bottleneck. On the other hand, when computer vision algorithms are […]

“Vision Challenges in a Robotic Power Tool,” a Presentation from Shaper Tools

Alec Rivers, co-founder of Shaper Tools, presents the "Vision Challenges in a Robotic Power Tool" tutorial at the May 2017 Embedded Vision Summit. Shaper Tools has developed a first-of-its-kind robotic power tool enabled by embedded vision. Vision is used to track the tool's orientation in 3D at 100 Hz to an accuracy of 0.01 inches […]

“Blending Cloud and Edge Machine Learning to Deliver Real-time Video Monitoring,” a Presentation from Camio

Carter Maslan, CEO of Camio, presents the "Blending Cloud and Edge Machine Learning to Deliver Real-time Video Monitoring" tutorial at the May 2017 Embedded Vision Summit. Network cameras and other edge devices are collecting ever-more video – far more than can be economically transported to the cloud. This argues for putting intelligence in edge devices.

“How to Test and Validate an Automated Driving System,” a Presentation from MathWorks

Avinash Nehemiah, Product Marketing Manager for Computer Vision at MathWorks, presents the "How to Test and Validate an Automated Driving System" tutorial at the May 2017 Embedded Vision Summit. Have you ever wondered how ADAS and autonomous driving systems are tested? Automated driving systems combine a diverse set of technologies and engineering skill sets from […]

“PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems,” a Presentation from XIMEA

Max Larin, CEO of XIMEA, presents the "PCI Express – A High-bandwidth Interface for Multi-camera Embedded Systems" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Larin provides an overview of existing camera interfaces for embedded systems and explores their strengths and weaknesses. He also examines the differences between integration of a sensor […]

“Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library,” a Presentation from MVTec

Olaf Munkelt, Co-founder and Managing Director at MVTec Software GmbH, presents the "Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Munkelt demonstrates how easy it is to develop an embedded vision (identification) application based on the HALCON Embedded standard software […]

“Using Satellites to Extract Insights on the Ground,” a Presentation from Orbital Insight

Boris Babenko, Senior Software Engineer at Orbital Insight, presents the "Using Satellites to Extract Insights on the Ground" tutorial at the May 2017 Embedded Vision Summit. Satellites are great for seeing the world at scale, but analyzing petabytes of images can be extremely time-consuming for humans alone. This is why machine vision is a perfect […]

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitalization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]

“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-on Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]
