Algorithms

“How to Choose a 3D Vision Technology,” a Presentation from Carnegie Robotics

Chris Osterwood, Chief Technical Officer at Carnegie Robotics, presents the "How to Choose a 3D Vision Technology" tutorial at the May 2017 Embedded Vision Summit. Designers of autonomous vehicles, robots, and many other systems are faced with a critical challenge: Which 3D perception technology to use? There are a wide variety of sensors on the […]

“Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography,” a Presentation from FotoNation

Petronel Bigioi, General Manager at FotoNation, presents the "Ultra-Efficient VPU: Low-power Deep Learning, Computer Vision and Computational Photography" tutorial at the May 2017 Embedded Vision Summit. This talk focuses on bringing intelligence to the edge to enable local devices to see and hear. It explores the power-consumption-vs.-flexibility dilemma by examining hard-coded and programmable architectures. […]

“Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry,” a Presentation from Lux Research

Mark Bünger, VP of Research at Lux Research, presents the "Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry" tutorial at the May 2017 Embedded Vision Summit. The auto and telecom industries have been dreaming of connected cars for twenty years, but their results have been mediocre and mixed. […]

“Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs,” a Presentation from videantis

Marco Jacobs, VP of Marketing at videantis, presents the "Computer-vision-based 360-degree Video Systems: Architectures, Algorithms and Trade-offs" tutorial at the May 2017 Embedded Vision Summit. 360-degree video systems use multiple cameras to capture a complete view of their surroundings. These systems are being adopted in cars, drones, virtual reality, and online streaming systems. […]

“The Rapid Evolution and Future of Machine Perception,” a Presentation from Google

Jay Yagnik, Head of Machine Perception Research at Google, presents the "Rapid Evolution and Future of Machine Perception" tutorial at the May 2017 Embedded Vision Summit. With the advent of deep learning, our ability to build systems that derive insights from perceptual data has increased dramatically. Perceptual data dwarfs almost all other data sources in […]

“Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?,” a Presentation from Basler

Mark Hebbel, Head of New Business Development at Basler, presents the "Time of Flight Sensors: How Do I Choose Them and How Do I Integrate Them?" tutorial at the May 2017 Embedded Vision Summit. 3D digitalization of the world is becoming more important. This additional dimension of information allows more real-world perception challenges to be […]

“Always-on Vision Becomes a Reality,” a Presentation from Qualcomm Research

Evgeni Gousev, Senior Director at Qualcomm Research, presents the "Always-on Vision Becomes a Reality" tutorial at the May 2017 Embedded Vision Summit. Intelligent devices equipped with human-like senses such as always-on touch, audio and motion detection have enabled a variety of new use cases and applications, transforming the way we interact with each other and […]

“Computer Vision and Machine Learning at the Edge,” a Presentation from Qualcomm Technologies

Michael Mangan, Staff Product Manager at Qualcomm Technologies, presents the "Computer Vision and Machine Learning at the Edge" tutorial at the May 2017 Embedded Vision Summit. Computer vision and machine learning techniques are applied to myriad use cases in smartphones today. As mobile technology expands beyond the smartphone vertical, both technologies […]
