Processors

“A Physics-based Approach to Removing Shadows and Shading in Real Time,” a Presentation from Tandent Vision Science

Bruce Maxwell, Director of Research at Tandent Vision Science, presents the “A Physics-based Approach to Removing Shadows and Shading in Real Time” tutorial at the May 2018 Embedded Vision Summit. Shadows cast on ground surfaces can create false features and modify the color and appearance of real features, masking important information used by autonomous vehicles, […]
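
As background for the talk's premise, one classic physics-based idea (this is an illustrative sketch, not Tandent's actual algorithm) is that a shadow boundary moves a pixel along a roughly fixed "illumination direction" in log-RGB space; projecting onto the plane orthogonal to that direction yields a lighting-invariant quantity. The `ILLUM_DIR` constant and `lighting_invariant` function below are assumed names for illustration only:

```python
import math

# Assumed illumination direction in log-RGB space: for a pure intensity
# change (uniform darkening), all three channels shift equally, so the
# direction is (1, 1, 1) normalized. Real shadows also shift color
# temperature, so practical systems estimate this direction per camera.
ILLUM_DIR = tuple(1.0 / math.sqrt(3.0) for _ in range(3))

def lighting_invariant(rgb):
    """Project log-RGB onto the plane orthogonal to the illumination
    direction, removing the component that shadows modulate."""
    log_rgb = [math.log(max(c, 1e-6)) for c in rgb]
    along = sum(l * d for l, d in zip(log_rgb, ILLUM_DIR))
    return [l - along * d for l, d in zip(log_rgb, ILLUM_DIR)]

lit = [0.8, 0.6, 0.4]
shadow = [0.3 * c for c in lit]  # same surface, uniformly darker
# Both pixels map to (nearly) the same invariant value:
print(lighting_invariant(lit))
print(lighting_invariant(shadow))
```

Under this model, a uniform darkening adds the same constant to every log channel, and that constant lies entirely along `ILLUM_DIR`, so the projection cancels it exactly.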

“Generative Sensing: Reliable Recognition from Unreliable Sensor Data,” a Presentation from Arizona State University

Lina Karam, Professor and Computer Engineering Director at Arizona State University, presents the “Generative Sensing: Reliable Recognition from Unreliable Sensor Data” tutorial at the May 2018 Embedded Vision Summit. While deep neural networks (DNNs) perform on par with – or better than – humans on pristine high-resolution images, DNN performance is significantly worse than human […]

“Infusing Visual Understanding in Cloud and Edge Solutions Using State-of-the-Art Microsoft Algorithms,” a Presentation from Microsoft

Anirudh Koul, Senior Data Scientist, and Jin Yamamoto, Principal Program Manager, both from Microsoft, present the “Infusing Visual Understanding in Cloud and Edge Solutions Using State-of-the-Art Microsoft Algorithms” tutorial at the May 2018 Embedded Vision Summit. Microsoft offers its state-of-the-art computer vision algorithms, used internally in several products, through the Cognitive Services cloud APIs.

“A New Generation of Camera Modules: A Novel Approach and Its Benefits for Embedded Systems,” a Presentation from Allied Vision Technologies

Paul Maria Zalewski, Product Line Manager at Allied Vision Technologies, presents the “A New Generation of Camera Modules: A Novel Approach and Its Benefits for Embedded Systems” tutorial at the May 2018 Embedded Vision Summit. Embedded vision systems have typically relied on low-cost image sensor modules with a MIPI CSI-2 interface. Now, machine vision camera […]

“The OpenVX Computer Vision and Neural Network Inference Library Standard for Portable, Efficient Code,” a Presentation from AMD

Radhakrishna Giduthuri, Software Architect at Advanced Micro Devices (AMD), presents the “OpenVX Computer Vision and Neural Network Inference Library Standard for Portable, Efficient Code” tutorial at the May 2018 Embedded Vision Summit. OpenVX is an industry-standard computer vision and neural network inference API designed for efficient implementation on a variety of embedded platforms. The API […]

“High-end Multi-camera Technology, Applications and Examples,” a Presentation from XIMEA

Max Larin, CEO of XIMEA, presents the “High-end Multi-camera Technology, Applications and Examples” tutorial at the May 2018 Embedded Vision Summit. For OEMs and system integrators, many of today’s applications in VR/AR/MR, ADAS, measurement and automation require multiple coordinated high performance cameras. Current generic components are not optimized to achieve the desired traits in terms […]

“Programmable CNN Acceleration in Under 1 Watt,” a Presentation from Lattice Semiconductor

Gordon Hands, Director of Marketing for IP and Solutions at Lattice Semiconductor, presents the “Programmable CNN Acceleration in Under 1 Watt” tutorial at the May 2018 Embedded Vision Summit. Driven by factors such as privacy concerns, limited network bandwidth and the need for low latency, system designers are increasingly interested in implementing artificial intelligence (AI) […]

“Deploying CNN-based Vision Solutions on a $3 Microcontroller,” a Presentation from Au-Zone Technologies

Greg Lytle, VP of Engineering at Au-Zone Technologies, presents the “Deploying CNN-based Vision Solutions on a $3 Microcontroller” tutorial at the May 2018 Embedded Vision Summit. In this presentation, Lytle explains how his company designed, trained and deployed a CNN-based embedded vision solution on a low-cost, Cortex-M-based microcontroller (MCU). He describes the steps taken to […]
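
The teaser stops short of listing those steps, but one step common to virtually every CNN deployment on a Cortex-M-class MCU is quantizing float weights to 8-bit integers, the format consumed by MCU inference libraries such as Arm's CMSIS-NN. The sketch below is a hypothetical illustration of symmetric per-tensor int8 quantization, not Au-Zone's actual toolchain; `quantize_int8` and `dequantize` are illustrative names:

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization: map floats in [-max_abs, max_abs]
    onto signed 8-bit integers, returning the codes and the scale factor."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return [v * scale for v in q]

w = [0.5, -1.27, 0.01, 1.0]
q, s = quantize_int8(w)
print(q)                  # int8 codes, one per weight
print(dequantize(q, s))   # reconstruction, within one quantization step
```

The symmetric scheme keeps zero exactly representable, which matters because zero-valued weights and ReLU outputs are common in trained CNNs.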

“Machine Learning Inference In Under 5 mW with a Binarized Neural Network on an FPGA,” a Presentation from Lattice Semiconductor

Abdullah Raouf, Senior Marketing Manager at Lattice Semiconductor, presents the “Machine Learning Inference In Under 5 mW with a Binarized Neural Network on an FPGA” tutorial at the May 2018 Embedded Vision Summit. The demand for always-on intelligence is rapidly increasing in various applications. You can find cameras that are always watching for anomalies in […]
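
To see why binarized neural networks (BNNs) fit so well in milliwatt-class FPGAs, note that when weights and activations are constrained to {-1, +1}, a dot product collapses to an XNOR followed by a popcount, operations that map directly onto FPGA LUTs. The sketch below is an illustrative model of that core operation, not Lattice's implementation; `binarize` and `bnn_dot` are assumed names:

```python
def binarize(bits):
    """Pack a list of {-1, +1} values into an integer bitmask (+1 -> 1)."""
    mask = 0
    for i, v in enumerate(bits):
        if v == 1:
            mask |= 1 << i
    return mask

def bnn_dot(w_mask, x_mask, n):
    """Dot product of two {-1, +1} vectors stored as n-bit masks.

    XNOR sets a bit wherever the operands match; each match contributes
    +1 and each mismatch -1, so dot = 2 * matches - n.
    """
    xnor = ~(w_mask ^ x_mask) & ((1 << n) - 1)
    matches = bin(xnor).count("1")
    return 2 * matches - n

w = [1, -1, 1, 1]
x = [1, 1, 1, -1]
# plain dot product: 1*1 + (-1)*1 + 1*1 + 1*(-1) = 0
print(bnn_dot(binarize(w), binarize(x), len(w)))  # prints 0
```

Because each multiply-accumulate becomes a single-bit logic operation, a BNN layer needs no DSP multipliers at all, which is what makes sub-5 mW inference plausible on a small FPGA.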

May 2018 Embedded Vision Summit Introductory Presentation (Day 1)

Jeff Bier, Founder of the Embedded Vision Alliance, welcomes attendees to the May 2018 Embedded Vision Summit on May 22, 2018 (Day 1). Bier provides an overview of the embedded vision market opportunity, challenges, solutions and trends. He also introduces the Embedded Vision Alliance and the resources it offers for both product creators and potential […]

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411