Edge AI and Vision Alliance

“Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry,” a Presentation from Lux Research

Mark Bünger, VP of Research at Lux Research, presents the "Automakers at a Crossroads: How Embedded Vision and Autonomy Will Reshape the Industry" tutorial at the May 2017 Embedded Vision Summit. The auto and telecom industries have been dreaming of connected cars for twenty years, but their results have been mediocre and mixed. Now, just […]

Embedded Vision Insights: August 18, 2017 Edition

LETTER FROM THE EDITOR

Dear Colleague, TensorFlow has become a popular framework for creating machine learning-based computer vision applications, especially for the development of deep neural networks. If you’re planning to develop computer vision applications using deep learning and want to understand how to use TensorFlow to do it, then don’t miss an upcoming full-day …
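
As a rough illustration of the kind of workflow such a class covers, a minimal TensorFlow/Keras image classifier might look like the sketch below. The dataset, layer sizes, and training settings are illustrative assumptions, not material from the class.

```python
# Minimal illustrative sketch: a small CNN image classifier in TensorFlow/Keras.
# The MNIST dataset, layer sizes, and single training epoch are assumptions
# chosen only to keep the example small and self-contained.
import tensorflow as tf

# Built-in dataset so the example runs end to end.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add channel dimension, scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
```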

“Introduction to Optics for Embedded Vision,” a Presentation from Edmund Optics

Jessica Gehlhar, Vision Solutions Engineer at Edmund Optics, presents the “Introduction to Optics for Embedded Vision” tutorial at the May 2017 Embedded Vision Summit. This talk provides an introduction to optics for embedded vision system and algorithm developers. Gehlhar begins by presenting fundamental imaging lens specifications and quality metrics. She explains key parameters and concepts …
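
As one concrete example of the kind of lens parameter the talk introduces, the horizontal field of view of a rectilinear lens can be estimated from the sensor width and focal length. The sketch below uses illustrative values, not figures from the presentation.

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Approximate horizontal field of view of a rectilinear lens, in degrees."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Illustrative values (assumed): a sensor about 5.76 mm wide behind a 4 mm lens.
print(f"HFOV ≈ {horizontal_fov_deg(5.76, 4.0):.1f} degrees")  # roughly 71.5 degrees
```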

“What’s Hot? The M&A and Funding Landscape for Computer Vision Companies,” a Presentation from Woodside Capital Partners

Rudy Burger, Managing Partner at Woodside Capital Partners, presents the "What’s Hot? The M&A and Funding Landscape for Computer Vision Companies" tutorial at the May 2017 Embedded Vision Summit. The six primary markets driving computer vision are automotive, sports and entertainment, consumer and mobile, robotics and machine vision, medical, and security and surveillance. This presentation …

“The Rapid Evolution and Future of Machine Perception,” a Presentation from Google

Jay Yagnik, Head of Machine Perception Research at Google, presents the "Rapid Evolution and Future of Machine Perception" tutorial at the May 2017 Embedded Vision Summit. With the advent of deep learning, our ability to build systems that derive insights from perceptual data has increased dramatically. Perceptual data dwarfs almost all other data sources in …

Embedded Vision Insights: August 1, 2017 Edition

COMPUTER VISION FOR IMAGE UNDERSTANDING

Semantic Segmentation for Scene Understanding: Algorithms and Implementations

Recent research in deep learning provides powerful tools that begin to address the daunting problem of automated scene understanding. Modifying deep learning methods, such as CNNs, to classify pixels in a scene with the help of the neighboring pixels has provided very …
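
As a rough sketch of the idea described above, classifying each pixel from its neighborhood can be done with a small fully convolutional network that outputs one class distribution per pixel. The architecture and class count below are assumptions for illustration, not the methods covered in the article.

```python
# Illustrative sketch of per-pixel classification (semantic segmentation) with a
# tiny fully convolutional network. Architecture and class count are assumptions.
import tensorflow as tf

NUM_CLASSES = 21  # assumed label-set size, e.g. a Pascal VOC-style problem

def tiny_segmentation_model(input_shape=(128, 128, 3)):
    inputs = tf.keras.Input(shape=input_shape)
    # Encoder: convolutions aggregate information from neighboring pixels.
    x = tf.keras.layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(x)
    # Decoder: transposed convolutions restore the input resolution.
    x = tf.keras.layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    x = tf.keras.layers.Conv2DTranspose(NUM_CLASSES, 3, strides=2, padding="same")(x)
    # Softmax over channels yields a class distribution for every pixel.
    outputs = tf.keras.layers.Softmax(axis=-1)(x)
    return tf.keras.Model(inputs, outputs)

model = tiny_segmentation_model()
model.summary()  # output shape: (None, 128, 128, 21), one distribution per pixel
```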

“Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit. A diverse set of sensor technologies is available and emerging to provide vehicle autonomy or driver assistance. These sensor technologies often …

Building Mobile Apps with TensorFlow: An Interview with Google’s Pete Warden

Pete Warden, Google Research Engineer and technical lead on the company's mobile/embedded TensorFlow team, is a long-time advocate of the Embedded Vision Alliance. Warden has delivered presentations at both the 2016 ("TensorFlow: Enabling Mobile and Embedded Machine Intelligence") and 2017 ("Implementing the TensorFlow Deep Learning Framework on Qualcomm’s Low-power DSP") Embedded Vision Summits, along with …
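
For readers curious what deploying TensorFlow on mobile and embedded devices looks like in practice today, a minimal sketch of exporting a trained Keras model to TensorFlow Lite is shown below. The placeholder model and file name are assumptions; this is not the specific workflow Warden describes in the interview.

```python
# Illustrative sketch: converting a trained Keras model to TensorFlow Lite for
# on-device use. The tiny model and output file name are placeholders.
import tensorflow as tf

# Stand-in for a trained model; in practice, load your own.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # default size/latency optimizations
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```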

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596
Phone: +1 (925) 954-1411