Software

“Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library,” a Presentation from MVTec

Olaf Munkelt, Co-founder and Managing Director at MVTec Software GmbH, presents the "Embedded Vision Made Smart: Introduction to the HALCON Embedded Machine Vision Library" tutorial at the May 2017 Embedded Vision Summit. In this presentation, Munkelt demonstrates how easy it is to develop an embedded vision (identification) application based on the HALCON Embedded standard software […]

“Adventures in DIY Embedded Vision: The Can’t-miss Dartboard,” a Presentation from Mark Rober

Engineer, inventor and YouTube personality Mark Rober presents the "Adventures in DIY Embedded Vision: The Can’t-miss Dartboard" tutorial at the May 2017 Embedded Vision Summit. Can a mechanical engineer with no background in computer vision build a complex, robust, real-time computer vision system? Yes, with a little help from his friends. Rober fulfilled a three-year […]

“Performing Multiple Perceptual Tasks With a Single Deep Neural Network,” a Presentation from Magic Leap

Andrew Rabinovich, Director of Deep Learning at Magic Leap, presents the "Performing Multiple Perceptual Tasks With a Single Deep Neural Network" tutorial at the May 2017 Embedded Vision Summit. As more system developers consider incorporating visual perception into smart devices such as self-driving cars, drones and wearable computers, attention is shifting toward practical formulation and […]
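
The talk title points at a general architecture idea: a single network with a shared backbone feeding several lightweight task heads, so multiple perceptual outputs share most of the compute. The PyTorch sketch below is a minimal illustration of that pattern under assumed layer sizes and example tasks (segmentation and depth); it is not Magic Leap's network.

```python
# Minimal multi-task network sketch: one shared backbone feeds several
# task-specific heads, so perception tasks share most of the compute.
# Layer sizes and task choices are illustrative only.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Shared feature extractor (the expensive part, computed once per frame)
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Lightweight heads reuse the shared features for different tasks
        self.seg_head = nn.Conv2d(64, num_classes, kernel_size=1)  # class logits at reduced resolution
        self.depth_head = nn.Conv2d(64, 1, kernel_size=1)          # depth estimate at reduced resolution

    def forward(self, x):
        feats = self.backbone(x)
        return {"segmentation": self.seg_head(feats),
                "depth": self.depth_head(feats)}

model = MultiTaskNet()
out = model(torch.randn(1, 3, 224, 224))
print({k: v.shape for k, v in out.items()})
```

Training such a model typically combines weighted per-task losses over the shared parameters, so the heads improve together rather than competing for capacity.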

“Using Satellites to Extract Insights on the Ground,” a Presentation from Orbital Insight

Boris Babenko, Senior Software Engineer at Orbital Insight, presents the "Using Satellites to Extract Insights on the Ground" tutorial at the May 2017 Embedded Vision Summit. Satellites are great for seeing the world at scale, but analyzing petabytes of images can be extremely time-consuming for humans alone. This is why machine vision is a perfect […]

“Designing Vision Systems for Human Operators and Workflows: A Case Study,” a Presentation from 8tree

Arun Chhabra, CEO of 8tree, presents the "Designing Vision Systems for Human Operators and Workflows: A Case Study" tutorial at the May 2017 Embedded Vision Summit. During the past several decades, manual measurement methods – using rulers and dial gauges – have been the status quo for inspecting dents, bumps, lightning strikes and corrosion blend-out […]

“Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography,” a Presentation from FotoNation

Petronel Bigioi, General Manager at FotoNation, presents the "Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography" tutorial at the May 2017 Embedded Vision Summit. This talk focuses on bringing intelligence to the edge to enable local devices to see and hear. It explores the power-consumption-vs.-flexibility dilemma by examining hard-coded and programmable architectures. It […]

“Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry,” a Presentation from Lux Research

Mark Bünger, VP of Research at Lux Research, presents the "Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry" tutorial at the May 2017 Embedded Vision Summit. The auto and telecom industries have been dreaming of connected cars for twenty years, but their results have been mediocre and mixed. Now, just […]

“Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs,” a Presentation from videantis

Marco Jacobs, VP of Marketing at videantis, presents the "Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs" tutorial at the May 2017 Embedded Vision Summit. 360-degree video systems use multiple cameras to capture a complete view of their surroundings. These systems are being adopted in cars, drones, virtual reality, and online streaming systems. At first […]
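
As a rough, simplified illustration of combining overlapping camera views, the sketch below uses OpenCV's high-level Stitcher to merge several frames into one panorama. Production 360-degree systems instead rely on fixed rig calibration, fisheye or equirectangular projection, and real-time blending on embedded hardware; the image filenames here are placeholders.

```python
# Rough illustration of merging overlapping camera views with OpenCV's
# high-level Stitcher. Real 360-degree rigs use fixed calibration and
# real-time warping/blending rather than per-frame feature matching.
import cv2

# Placeholder filenames for overlapping views from adjacent cameras
paths = ["cam_front.jpg", "cam_left.jpg", "cam_rear.jpg", "cam_right.jpg"]
images = [cv2.imread(p) for p in paths]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```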

“The Rapid Evolution and Future of Machine Perception,” a Presentation from Google

Jay Yagnik, Head of Machine Perception Research at Google, presents the "Rapid Evolution and Future of Machine Perception" tutorial at the May 2017 Embedded Vision Summit. With the advent of deep learning, our ability to build systems that derive insights from perceptual data has increased dramatically. Perceptual data dwarfs almost all other data sources in […]

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]
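
One common integration step with a time-of-flight sensor is turning its per-pixel depth map into a 3D point cloud via pinhole back-projection: X = (u - cx) * Z / fx and Y = (v - cy) * Z / fy. The NumPy sketch below shows that step with made-up intrinsics and a synthetic depth map; real values come from the sensor's calibration data.

```python
# Back-project a time-of-flight depth map to a 3D point cloud using the
# pinhole camera model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
# Intrinsics and the depth map below are placeholder/synthetic values.
import numpy as np

fx, fy = 525.0, 525.0      # focal lengths in pixels (placeholder)
cx, cy = 320.0, 240.0      # principal point in pixels (placeholder)

depth = np.full((480, 640), 1.5, dtype=np.float32)  # depth in meters (synthetic)

v, u = np.indices(depth.shape)          # pixel row (v) and column (u) grids
z = depth
x = (u - cx) * z / fx
y = (v - cy) * z / fy

points = np.stack([x, y, z], axis=-1).reshape(-1, 3)  # N x 3 point cloud
print(points.shape)  # (307200, 3)
```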
