Algorithms

October 2013 Embedded Vision Summit Technical Presentation: “Implementing Real-Time Hyperspectral Imaging,” Kalyanramu Vemishetty, National Instruments

Kalyanramu Vemishetty, Senior Systems Engineer at National Instruments, presents the "Implementing Real-Time Hyperspectral Imaging" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. Hyperspectral imaging enables vision systems to use many spectral bands rather than just the typical red, green, and blue bands. This can be […]
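
The key difference from conventional imaging is that every pixel carries a full spectrum rather than three color samples, so materials can be distinguished by their spectral signatures. As a hedged illustration of that idea (not taken from the presentation, and using synthetic data), the sketch below scores each pixel of a hyperspectral cube against a reference spectrum with a generic spectral-angle measure:

```python
# Illustrative only: per-pixel spectral-angle matching on a synthetic
# hyperspectral cube (rows x cols x bands), not the talk's implementation.
import numpy as np

def spectral_angle_map(cube, reference):
    """Return the per-pixel angle (radians) between each spectrum and `reference`.
    Smaller angles mean a closer spectral match."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(np.float64)
    dots = flat @ reference
    norms = np.linalg.norm(flat, axis=1) * np.linalg.norm(reference) + 1e-12
    return np.arccos(np.clip(dots / norms, -1.0, 1.0)).reshape(cube.shape[:2])

# Synthetic 64x64 cube with 30 bands, just to show the call pattern.
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 30))
reference = rng.random(30)
mask = spectral_angle_map(cube, reference) < 0.2   # pixels spectrally similar to the target
print(mask.sum(), "matching pixels")
```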

Stereo Vision for 3D Depth Perception

Jeff Bier, founder of the Embedded Vision Alliance, interviews Goksel Dedeoglu, Manager of Embedded Vision R&D at Texas Instruments. Beginning with a hands-on demonstration of TI's real-time stereo vision prototype on the C6678 Keystone DSP, Jeff and Goksel touch upon various trade-offs in designing a stereo depth camera: the separation between the sensors, image resolution, field-of-view, and finally, […]
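
These parameters trade off against each other through the standard pinhole stereo relation, depth Z = f·B/d (focal length in pixels times baseline, divided by disparity). The sketch below, with purely illustrative numbers rather than figures from TI's prototype, shows how depth error grows quadratically with range and shrinks with a wider baseline or higher resolution:

```python
# Textbook stereo geometry, not TI's implementation: Z = f * B / d,
# with f the focal length in pixels, B the baseline (sensor separation)
# in meters, and d the disparity in pixels.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    return focal_px * baseline_m / disparity_px

def depth_error(depth_m, focal_px, baseline_m, disparity_error_px=1.0):
    # Differentiating Z = f*B/d gives dZ ~= Z**2 / (f*B) * dd, so error grows
    # with the square of the distance and drops with a wider baseline / more pixels.
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_error_px

# Illustrative numbers only: 800-pixel focal length, 10 cm baseline.
for d in (80, 40, 8):
    z = depth_from_disparity(d, focal_px=800, baseline_m=0.10)
    err = depth_error(z, focal_px=800, baseline_m=0.10)
    print(f"disparity {d:3d} px -> depth {z:5.2f} m, "
          f"+/- {err:.2f} m per pixel of disparity error")
```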

Lucas-Kanade Feature Tracking

Jeff Bier, founder of the Embedded Vision Alliance, interviews Goksel Dedeoglu, Manager of Embedded Vision R&D at Texas Instruments. They begin with a hands-on demonstration of real-time Lucas-Kanade tracking using TI's Vision Library VLIB on the C6678 Keystone DSP, wherein thousands of Harris corner features are detected and tracked in 1080p HD resolution images at 15 frames per second […]
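
For readers who want to experiment with the same Harris-corner detection plus pyramidal Lucas-Kanade tracking pipeline, here is a minimal sketch using OpenCV rather than TI's VLIB (the library shown in the demo); the frame file names are placeholders:

```python
# Harris corners + pyramidal Lucas-Kanade with OpenCV (not TI VLIB).
# "frame0.png" and "frame1.png" are placeholder names for two consecutive frames.
import cv2

prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Detect Harris corner features in the first frame.
corners = cv2.goodFeaturesToTrack(prev, maxCorners=1000, qualityLevel=0.01,
                                  minDistance=7, useHarrisDetector=True, k=0.04)

# Track them into the next frame with pyramidal Lucas-Kanade optical flow.
next_pts, status, err = cv2.calcOpticalFlowPyrLK(
    prev, curr, corners, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

tracked = next_pts[status.ravel() == 1]
print(f"tracked {len(tracked)} of {len(corners)} corners")
```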

October 2013 Embedded Vision Summit Technical Presentation: “Efficient Super-Resolution Algorithms and Implementation Techniques for Constrained Applications,” Ilan Yona, CEVA

Ilan Yona, Director of Imaging and Computer Vision at CEVA, presents the "Efficient Super-Resolution Algorithms and Implementation Techniques for Constrained Applications" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. Image quality is a critical challenge in many applications, including smart phones, especially when using […]
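
To make the underlying idea concrete, the sketch below shows a very simplified multi-frame "shift and add" scheme: estimate sub-pixel shifts between low-resolution frames, then accumulate them on an upsampled grid. This is a generic illustration only, not CEVA's algorithm, and it assumes same-size grayscale input frames:

```python
# Simplified multi-frame "shift and add" super-resolution sketch (illustrative only).
import cv2
import numpy as np

def shift_and_add(frames, scale=2):
    """frames: list of same-size grayscale arrays; returns a scale-x larger estimate."""
    ref = frames[0].astype(np.float32)
    h, w = ref.shape
    acc = np.zeros((h * scale, w * scale), np.float32)
    for frame in frames:
        f = frame.astype(np.float32)
        # Sub-pixel shift of this frame relative to the reference frame.
        (dx, dy), _ = cv2.phaseCorrelate(ref, f)
        up = cv2.resize(f, (w * scale, h * scale), interpolation=cv2.INTER_LINEAR)
        # Translate the upsampled frame back onto the reference grid before adding.
        M = np.float32([[1, 0, -dx * scale], [0, 1, -dy * scale]])
        acc += cv2.warpAffine(up, M, (w * scale, h * scale))
    return acc / len(frames)

# Usage (placeholder file names for a burst of low-resolution frames):
# hi_res = shift_and_add([cv2.imread(f"lr{i}.png", cv2.IMREAD_GRAYSCALE) for i in range(4)])
```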

October 2013 Embedded Vision Summit Technical Presentation: “Efficiently Computing Disparity Maps for Low-Cost 3D Stereo Vision,” Tom Wilson, CogniVue

Tom Wilson, Vice President of Business Development at CogniVue, presents the "Efficiently Computing Disparity Maps for Low-Cost 3D Stereo Vision" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. The ability to detect and determine the position of objects in 3D is important for many […]
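
As a hedged sketch of what computing a disparity map involves (using OpenCV's block matcher, not CogniVue's method), the example below matches an already-rectified stereo pair; the file names are placeholders, and the resulting disparities relate to depth through the Z = f·B/d relation noted in the stereo vision entry above:

```python
# Block-matching disparity with OpenCV (not CogniVue's method).
# "left.png"/"right.png" are placeholder names for an already-rectified pair.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# numDisparities must be a multiple of 16; blockSize is the matching window size.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

valid = disparity > 0
print(f"{valid.mean():.0%} of pixels received a valid disparity")
```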

October 2013 Embedded Vision Summit Technical Presentation: “Using Heterogeneous Computing for Mobile and Embedded Vision,” Rick Maule, Qualcomm

Rick Maule, Senior Director of Product Management at Qualcomm, presents the "Using Heterogeneous Computing for Mobile and Embedded Vision" tutorial within the "Implementing Vision Systems" technical session at the October 2013 Embedded Vision Summit East. A single vision application typically incorporates multiple algorithms requiring many different types of computation. This diversity makes it difficult for […]
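
One small, non-Qualcomm-specific illustration of how heterogeneous execution shows up in practice is OpenCV's transparent API: the same pipeline runs on an OpenCL device (GPU or DSP) when one is available and falls back to the CPU otherwise. The sketch below assumes an OpenCV build with OpenCL support; "input.png" is a placeholder file name:

```python
# Heterogeneous dispatch via OpenCV's transparent API (illustrative, not from the talk).
import cv2

cv2.ocl.setUseOpenCL(True)
print("OpenCL device available:", cv2.ocl.haveOpenCL())

frame = cv2.imread("input.png")
u = cv2.UMat(frame)                    # data may live in device memory from here on

gray  = cv2.cvtColor(u, cv2.COLOR_BGR2GRAY)   # color conversion
blur  = cv2.GaussianBlur(gray, (5, 5), 1.5)   # filtering
edges = cv2.Canny(blur, 50, 150)              # edge detection

result = edges.get()                   # copy back to host memory only at the end
print("edge pixels:", int((result > 0).sum()))
```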

October 2013 Embedded Vision Summit Technical Presentation: “Vision-Based Gesture User Interfaces,” Francis MacDougall, Qualcomm

Francis MacDougall, Senior Director of Technology at Qualcomm, presents the "Vision-Based Gesture User Interfaces" tutorial within the "Vision Applications" technical session at the October 2013 Embedded Vision Summit East. MacDougall explains how gestures fit into the spectrum of advanced user interface options, compares and contrasts the various 2-D and 3-D technologies (vision and other) available […]

April 2013 Embedded Vision Summit Technical Presentation: “Heterogeneous Mobile Processing Platforms for Computer Vision Applications,” Ning Bi, Qualcomm

Ning Bi, Senior Director of Technology in the Computer Vision System team at Qualcomm Technologies, presents the “Heterogeneous Mobile Processing Platforms for Computer Vision Applications” tutorial within the “Developing Vision Software, Accelerators and Systems” technical session at the April 2013 Embedded Vision Summit. For more information about Qualcomm, please send the company an email or […]

Visual Intelligence Gives Robotic Systems Spatial Sense

This article is an expanded version of one originally published at EE Times' Embedded.com Design Line. It is reprinted here with the permission of EE Times. In order for robots to meaningfully interact with objects around them as well as move about their environments, they must be able to see and discern their surroundings. Cost-effective […]

October 2013 Embedded Vision Summit Technical Presentation: “Better Image Understanding Through Better Sensor Understanding,” Michael Tusch, Apical

Michael Tusch, Founder and CEO of Apical Imaging, presents the "Better Image Understanding Through Better Sensor Understanding" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. One of the main barriers to widespread use of embedded vision is its reliability. For example, systems which detect […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411