Edge AI and Vision Alliance


Mobile Photography’s Developing Image

A version of this article was originally published at EE Times' Embedded.com Design Line. It is reprinted here with the permission of EE Times. Still photos and videos traditionally taken with standalone cameras are increasingly being captured by camera-inclusive smartphones and tablets instead. And the post-capture processing that traditionally required a high-end computer and took […]



Embedded Vision Insights: July 15, 2014 Edition

In this edition of Embedded Vision Insights: New Website Content; Android 4.4's Camera Subsystem; Market Growth Predictions; Embedded Vision in the News. LETTER FROM THE EDITOR: Dear Colleague, The steady stream of new videos on the Alliance website from the recent Embedded Vision Summit continues unabated. Newly published technical tutorials cover augmented reality for wearable…


“Using Inertial Sensors and Sensor Fusion to Enhance the Capabilities of Embedded Vision Systems,” a Presentation from Sensor Platforms

Kevin Shaw, Chief Technology Officer at Sensor Platforms, delivers the presentation "Using Inertial Sensors and Sensor Fusion to Enhance the Capabilities of Embedded Vision Systems" at the May 2014 Embedded Vision Alliance Member Meeting.


May 2014 Embedded Vision Alliance Member Meeting Presentation: “Designing a Consumer Panoramic Camcorder Using Embedded Vision,” Paul Alioshin, CENTR

Paul Alioshin, Chief Technology Officer at CENTR, delivers the presentation "Designing a Consumer Panoramic Camcorder Using Embedded Vision" at the May 2014 Embedded Vision Alliance Member Meeting. Please note that the audio was intentionally muted between ~6:09 and ~8:52, to address soundtrack copyright concerns.


“How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things,” a Presentation from AugmentedReality.org

Ori Inbar, co-founder and CEO of AugmentedReality.org, presents the "How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things" tutorial at the May 2014 Embedded Vision Summit. In this talk, Inbar explains how augmented reality, which relies heavily on embedded vision, is transitioning from a…



Embedded Vision Insights: July 1, 2014 Edition

In this edition of Embedded Vision Insights: Google's Project Tango; Heterogeneous Mobile Platforms; Surveillance Market Trends; Embedded Vision in the News. LETTER FROM THE EDITOR: Dear Colleague, Johnny Lee, Technical Program Lead at Google, was one of the invited speakers at the Embedded Vision Alliance's May 30 Member Meeting. The video of his presentation, "Project…


May 2014 Embedded Vision Summit Technical Presentation: “The OpenVX Hardware Acceleration API for Embedded Vision Applications and Libraries,” Neil Trevett, Khronos

Neil Trevett, President of Khronos and Vice President at NVIDIA, presents the "OpenVX Hardware Acceleration API for Embedded Vision Applications and Libraries" tutorial at the May 2014 Embedded Vision Summit. This presentation introduces OpenVX, a new application programming interface (API) from the Khronos Group. OpenVX enables performance and power optimized vision algorithms for use cases…
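
For readers unfamiliar with the graph-based programming model the talk describes, here is a minimal, hypothetical sketch (not drawn from the presentation) of how an OpenVX 1.0 graph might be assembled: a context and graph are created, standard kernels are chained through a virtual intermediate image, and the graph is verified once before being processed. The image dimensions and the choice of kernels here are placeholder assumptions.

```cpp
// Minimal OpenVX 1.0 graph sketch: Gaussian blur followed by Sobel gradients.
// Assumes an OpenVX implementation (headers + runtime) is installed;
// dimensions and kernel choices are illustrative only.
#include <VX/vx.h>
#include <cstdio>

int main() {
    vx_context context = vxCreateContext();
    vx_graph   graph   = vxCreateGraph(context);

    // Source and sink images owned by the application.
    vx_image input  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);
    vx_image grad_x = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);
    vx_image grad_y = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);

    // Virtual image: an intermediate the implementation is free to optimize away.
    vx_image blurred = vxCreateVirtualImage(graph, 640, 480, VX_DF_IMAGE_U8);

    // Build the processing graph: blur, then compute gradients.
    vxGaussian3x3Node(graph, input, blurred);
    vxSobel3x3Node(graph, blurred, grad_x, grad_y);

    // Verification lets the implementation validate and optimize the graph once;
    // processing can then be invoked repeatedly (e.g., per camera frame).
    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        vxProcessGraph(graph);
        std::printf("Graph executed.\n");
    }

    vxReleaseImage(&input);
    vxReleaseImage(&grad_x);
    vxReleaseImage(&grad_y);
    vxReleaseImage(&blurred);
    vxReleaseGraph(&graph);
    vxReleaseContext(&context);
    return 0;
}
```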


May 2014 Embedded Vision Summit Technical Presentation: “Multiple Uses of Pipelined Video Pre-Processor Hardware in Vision Applications,” Rajesh Mahapatra, Analog Devices

Rajesh Mahapatra, Engineering Manager at Analog Devices, presents the "Multiple Uses of Pipelined Video Pre-Processor Hardware in Vision Applications" tutorial at the May 2014 Embedded Vision Summit. Significant resemblance and overlap exist among the pre-processing blocks of different vision applications. For instance, image gradients and edges have proven beneficial for a variety of applications, such…
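
As an illustration of that kind of shared pre-processing (this sketch is not from the talk and uses OpenCV in software rather than the pipelined hardware discussed), the following C++ fragment computes gradients and an edge map once so that several downstream stages could consume the same buffers; the input file name and Canny thresholds are arbitrary placeholders.

```cpp
// Illustrative only: compute image gradients and an edge map once so that
// multiple downstream vision stages can share the same pre-processed data.
// Requires OpenCV; "frame.png" and the Canny thresholds are placeholders.
#include <opencv2/opencv.hpp>

int main() {
    cv::Mat gray = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);
    if (gray.empty()) return 1;

    // Horizontal and vertical gradients (signed 16-bit to avoid clipping).
    cv::Mat grad_x, grad_y;
    cv::Sobel(gray, grad_x, CV_16S, 1, 0, 3);
    cv::Sobel(gray, grad_y, CV_16S, 0, 1, 3);

    // Approximate gradient magnitude, reusable by feature detection,
    // stabilization, and similar stages.
    cv::Mat abs_x, abs_y, magnitude;
    cv::convertScaleAbs(grad_x, abs_x);
    cv::convertScaleAbs(grad_y, abs_y);
    cv::addWeighted(abs_x, 0.5, abs_y, 0.5, 0, magnitude);

    // Edge map, shareable by, e.g., lane detection and object segmentation.
    cv::Mat edges;
    cv::Canny(gray, edges, 50, 150);

    cv::imwrite("magnitude.png", magnitude);
    cv::imwrite("edges.png", edges);
    return 0;
}
```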


“Project Tango: Integrating 3D Vision Into Smartphones,” a Presentation From Google

Johnny Lee, Technical Program Lead at Google, delivers the presentation "Google Project Tango: Integrating 3D Vision Into Smartphones," at the May 2014 Embedded Vision Alliance Member Meeting. Project Tango is an effort to harvest research in computer vision and robotics and concentrate that technology into a mobile platform. It uses vision and sensor fusion to…


May 2014 Embedded Vision Summit Technical Presentation: “Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It,” Goksel Dedeoglu, PercepTonic

Goksel Dedeoglu, Ph.D., Founder and Lab Director of PercepTonic, presents the "Embedded Lucas-Kanade Tracking: How It Works, How to Implement It, and How to Use It" tutorial at the May 2014 Embedded Vision Summit. This tutorial is intended for technical audiences interested in learning about the Lucas-Kanade (LK) tracker, also known as the Kanade-Lucas-Tomasi (KLT)…
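
For readers who want to experiment with LK/KLT tracking before watching the talk, here is a small, hedged C++ sketch using OpenCV's pyramidal Lucas-Kanade implementation (not the implementation covered in the presentation); the frame file names and tracker parameters are placeholder assumptions.

```cpp
// Illustrative pyramidal Lucas-Kanade (KLT) tracking sketch using OpenCV;
// frame file names and parameters below are placeholders.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::Mat prev = cv::imread("frame0.png", cv::IMREAD_GRAYSCALE);
    cv::Mat next = cv::imread("frame1.png", cv::IMREAD_GRAYSCALE);
    if (prev.empty() || next.empty()) return 1;

    // Select corner-like features worth tracking (Shi-Tomasi).
    std::vector<cv::Point2f> prev_pts;
    cv::goodFeaturesToTrack(prev, prev_pts, 200, 0.01, 10);

    // Track each feature from the previous frame into the next one,
    // using an image pyramid to handle larger displacements.
    std::vector<cv::Point2f> next_pts;
    std::vector<unsigned char> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(prev, next, prev_pts, next_pts, status, err,
                             cv::Size(21, 21), 3);

    // Count the features that were tracked successfully.
    int tracked = 0;
    for (size_t i = 0; i < status.size(); ++i)
        if (status[i]) ++tracked;

    std::printf("Tracked %d of %zu features.\n", tracked, prev_pts.size());
    return 0;
}
```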


Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411