Tools

“Using Inertial Sensors and Sensor Fusion to Enhance the Capabilities of Embedded Vision Systems,” a Presentation from Sensor Platforms

Kevin Shaw, Chief Technology Officer at Sensor Platforms, delivers the presentation "Using Inertial Sensors and Sensor Fusion to Enhance the Capabilities of Embedded Vision Systems" at the May 2014 Embedded Vision Alliance Member Meeting.
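
The kind of inertial sensor fusion the talk covers can be illustrated with a minimal complementary filter, which blends a drifting-but-smooth gyroscope integral with a noisy-but-unbiased accelerometer angle. The sketch below is illustrative only and is not taken from the presentation; the function name, sample data, and the `alpha` and `dt` values are all invented for the example.

```python
import numpy as np

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Fuse gyroscope rate (deg/s) and accelerometer angle (deg) estimates.

    The gyro integrates smoothly but drifts over time; the accelerometer is
    drift-free but noisy. Blending the two is the simplest sensor fusion.
    """
    angle = accel_angles[0]          # initialize from the absolute sensor
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        # Trust the integrated gyro short-term, the accelerometer long-term.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        out.append(angle)
    return np.array(out)

# Synthetic stationary device tilted 10 degrees: the gyro reads near-zero
# rates, the accelerometer reads a noisy but unbiased 10 degrees.
rng = np.random.default_rng(0)
accel = 10.0 + rng.normal(0.0, 2.0, 500)   # noisy absolute angle
gyro = rng.normal(0.0, 0.05, 500)          # small rate noise, no motion
est = complementary_filter(gyro, accel)
```

The fused estimate settles near the true 10-degree tilt with far less jitter than the raw accelerometer signal, which is the core benefit such fusion brings to stabilizing a vision pipeline.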

“How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things,” a Presentation from AugmentedReality.org

Ori Inbar, co-founder and CEO of AugmentedReality.org, presents the "How to Make the World More Interactive: Augmented Reality as the Interface Between Wearable Tech and the Internet of Things" tutorial at the May 2014 Embedded Vision Summit. In this talk, Inbar explains how augmented reality, which relies heavily on embedded vision, is transitioning from a…

May 2014 Embedded Vision Summit Technical Presentation: “Implementing Histogram of Oriented Gradients on a Parallel Vision Processor,” Marco Jacobs, videantis

Marco Jacobs, Vice President of Marketing at videantis, presents the "Implementing Histogram of Oriented Gradients on a Parallel Vision Processor" tutorial at the May 2014 Embedded Vision Summit. Object detection in images is one of the core problems in computer vision. The Histogram of Oriented Gradients method (Dalal and Triggs 2005) is a key algorithm…
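
The core HOG idea referenced here, binning gradient magnitudes by orientation within small cells, can be sketched in a few lines of numpy. This is a toy, illustrative version only (no block normalization or SVM classification stage, which the full Dalal-Triggs method includes); the cell size and bin count match the paper's common defaults but the function names are invented.

```python
import numpy as np

def hog_cell_histogram(cell_mag, cell_ang, n_bins=9):
    """Accumulate gradient magnitudes into unsigned orientation bins (0-180)."""
    bins = np.zeros(n_bins)
    bin_width = 180.0 / n_bins
    idx = np.minimum((cell_ang // bin_width).astype(int), n_bins - 1)
    np.add.at(bins, idx.ravel(), cell_mag.ravel())
    return bins

def hog_descriptor(image, cell=8, n_bins=9):
    """Toy HOG: per-cell orientation histograms, concatenated and L2-normalized."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    h, w = image.shape
    feats = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            feats.append(hog_cell_histogram(mag[r:r+cell, c:c+cell],
                                            ang[r:r+cell, c:c+cell], n_bins))
    desc = np.concatenate(feats)
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc

img = np.zeros((32, 32))
img[:, 16:] = 255.0            # vertical edge -> strong horizontal gradient
d = hog_descriptor(img)
print(d.shape)                 # 4x4 cells * 9 bins = 144 values
```

Each cell's histogram is independent of its neighbors, which is exactly what makes the per-cell stage map well onto a parallel vision processor of the kind the talk describes.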

May 2014 Embedded Vision Summit Technical Presentation: “Combining Flexibility and Low-Power in Embedded Vision Subsystems: An Application to Pedestrian Detection,” Bruno Lavigueur, Synopsys

Bruno Lavigueur, Embedded Vision Subsystem Project Leader at Synopsys, presents the "Combining Flexibility and Low-Power in Embedded Vision Subsystems: An Application to Pedestrian Detection" tutorial at the May 2014 Embedded Vision Summit. Lavigueur presents an embedded-mapping and refinement case study of a pedestrian detection application. Starting from a high-level functional description in OpenCV, he decomposes…

May 2014 Embedded Vision Summit Technical Presentation: “Computer Vision Powered by Heterogeneous System Architecture (HSA),” Harris Gasparakis, AMD

Harris Gasparakis, Ph.D., OpenCV manager at AMD, presents the "Computer Vision Powered by Heterogeneous System Architecture (HSA)" tutorial at the May 2014 Embedded Vision Summit. Gasparakis reviews the HSA vision and its current incarnation through OpenCL 2.0, and discusses its relevance and advantages for computer vision applications. HSA unifies CPU cores, GPU compute units, and…

May 2014 Embedded Vision Summit Technical Presentation: “The OpenVX Hardware Acceleration API for Embedded Vision Applications and Libraries,” Neil Trevett, Khronos

Neil Trevett, President of Khronos and Vice President at NVIDIA, presents the "OpenVX Hardware Acceleration API for Embedded Vision Applications and Libraries" tutorial at the May 2014 Embedded Vision Summit. This presentation introduces OpenVX, a new application programming interface (API) from the Khronos Group. OpenVX enables performance- and power-optimized vision algorithms for use cases…

May 2014 Embedded Vision Summit Technical Presentation: “What’s New in Tools for Vision Application Design and Development?,” Jeff Bier, BDTI

Jeff Bier, President and co-founder of BDTI and founder of the Embedded Vision Alliance, presents the "What's New in Tools for Vision Application Design and Development?" tutorial at the May 2014 Embedded Vision Summit. Today, there's an unprecedented diversity of tools, APIs, and libraries available for product creators who are designing and implementing vision applications…

May 2014 Embedded Vision Summit Technical Presentation: “Multiple Uses of Pipelined Video Pre-Processor Hardware in Vision Applications,” Rajesh Mahapatra, Analog Devices

Rajesh Mahapatra, Engineering Manager at Analog Devices, presents the "Multiple Uses of Pipelined Video Pre-Processor Hardware in Vision Applications" tutorial at the May 2014 Embedded Vision Summit. Significant resemblance and overlap exist among the pre-processing blocks of different vision applications. For instance, image gradients and edges have proven beneficial for a variety of applications, such…
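
The gradient-and-edge pre-processing this summary refers to is commonly a Sobel filter pass. As an illustrative numpy sketch (not code from the presentation, and deliberately naive where real pre-processor hardware would pipeline the window arithmetic), computing horizontal and vertical gradients looks like:

```python
import numpy as np

# Standard 3x3 Sobel kernels for horizontal (x) and vertical (y) gradients
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def filter2d_valid(img, k):
    """Naive 'valid'-mode 2D correlation for a small kernel."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(img[r:r+kh, c:c+kw] * k)
    return out

img = np.zeros((8, 8))
img[:, 4:] = 1.0                       # vertical step edge
gx = filter2d_valid(img, SOBEL_X)      # responds strongly at the edge
gy = filter2d_valid(img, SOBEL_Y)      # zero: no horizontal edges present
mag = np.hypot(gx, gy)                 # gradient magnitude map
```

The same `gx`/`gy`/`mag` outputs feed many downstream algorithms (edge maps, HOG, corner detectors), which is why a single shared pre-processing block is attractive in hardware.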

May 2014 Embedded Vision Summit Technical Presentation: “Evolving Algorithmic Requirements for Recognition and Classification in Augmented Reality,” Simon Morris, CogniVue

Simon Morris, CEO of CogniVue, presents the "Evolving Algorithmic Requirements for Recognition and Classification in Augmented Reality" tutorial at the May 2014 Embedded Vision Summit. Augmented reality (AR) applications are based on accurately computing a camera's 6 degrees of freedom (6DOF) position in 3-dimensional space, also known as its "pose". In vision-based approaches to AR…
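
The 6DOF pose mentioned above is three rotation angles plus three translations; once known, it lets the renderer project virtual 3D content into the camera image. The numpy sketch below illustrates that geometry with a pinhole projection of an AR-marker's corners. It is an invented example, not from the presentation; the focal length, principal point, and marker size are arbitrary, and real AR systems solve the inverse problem (pose from observed pixels).

```python
import numpy as np

def pose_matrix(rx, ry, rz, tx, ty, tz):
    """Build a 6DOF camera pose: three rotations (radians), three translations."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx, np.array([tx, ty, tz])

def project(points, R, t, f=500.0, c=(320.0, 240.0)):
    """Pinhole projection of world points into pixels given the camera pose."""
    cam = points @ R.T + t                 # world -> camera coordinates
    u = f * cam[:, 0] / cam[:, 2] + c[0]
    v = f * cam[:, 1] / cam[:, 2] + c[1]
    return np.stack([u, v], axis=1)

# Camera looking straight down the z-axis, 2 m from a 20 cm square marker
R, t = pose_matrix(0, 0, 0, 0, 0, 2.0)
marker = np.array([[-0.1, -0.1, 0], [0.1, -0.1, 0],
                   [0.1, 0.1, 0], [-0.1, 0.1, 0]])
px = project(marker, R, t)
print(px[0])   # first corner lands left of and above the principal point
```

Estimating `R` and `t` from observed corner pixels (rather than the forward projection shown here) is exactly the recognition-and-pose problem the talk's algorithms must solve in real time.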

May 2014 Embedded Vision Summit Technical Presentation: “Vision-Based Gesture User Interfaces,” Francis MacDougall, Qualcomm

Francis MacDougall, Senior Director of Technology at Qualcomm, presents the "Vision-Based Gesture User Interfaces" tutorial at the May 2014 Embedded Vision Summit. The means by which we interact with the machines around us is undergoing a fundamental transformation. While we may still sometimes need to push buttons, touch displays and trackpads, and raise our voices…

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411