Retail

Gaze Tracking Using CogniMem Technologies’ CM1K and a Freescale i.MX53

This demonstration, which pairs a Freescale i.MX53 Quick Start board with a CogniMem Technologies CM1K evaluation module, showcases how to use your eyes (specifically, where you are looking at any given moment) as a mouse: translating where a customer is looking into actions on a screen, and using gaze tracking to electronically control objects […]
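As a rough illustration of just the final mapping step in such a system (this is not the CM1K/i.MX53 pipeline from the demo; the calibration points, screen resolution, and the upstream eye tracker supplying gaze_x/gaze_y are all assumptions), the Python sketch below fits an affine map from gaze estimates to on-screen cursor coordinates:

    # Illustrative only: map an estimated gaze point to screen ("mouse") coordinates.
    # The upstream eye tracker, calibration samples, and 1920x1080 screen are assumptions.
    import numpy as np

    # Four calibration samples: gaze-space estimates vs. known on-screen targets.
    gaze_pts = np.array([[0.21, 0.18], [0.78, 0.20], [0.76, 0.81], [0.23, 0.79]])
    screen_pts = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], dtype=float)

    # Fit an affine map (least squares) from gaze space to screen space.
    A = np.hstack([gaze_pts, np.ones((4, 1))])
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)

    def gaze_to_cursor(gaze_x, gaze_y):
        """Return the pixel coordinates of the cursor for one gaze estimate."""
        x, y = np.array([gaze_x, gaze_y, 1.0]) @ coeffs
        return int(round(x)), int(round(y))

    print(gaze_to_cursor(0.5, 0.5))  # roughly the middle of the assumed screen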

“Keeping Brick and Mortar Relevant, A Look Inside Retail Analytics,” A Presentation from Prism Skylabs

Doug Johnston, Founder and Vice President of Technology at Prism Skylabs, delivers the presentation "Keeping Brick and Mortar Relevant: A Look Inside Prism Skylabs and Retail Analytics" at the December 2014 Embedded Vision Alliance Member Meeting. Doug explains how his firm is using vision to provide retailers with actionable intelligence based on consumer behavior.

Practical Computer Vision Enables Digital Signage with Audience Perception

This article was originally published in Information Display Magazine. It is reprinted here with the permission of the Society for Information Display. Signs that see and understand the actions and characteristics of individuals in front of them can deliver numerous benefits to advertisers and viewers alike. Such capabilities were once only practical in research labs […]
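For a feel of the simplest building block behind such signs, the toy sketch below counts faces currently turned toward a camera using OpenCV's bundled frontal-face Haar cascade; it is an illustration only, not the audience-measurement pipeline the article describes, and the camera index is an assumption:

    # Illustrative only: count viewers facing a sign with OpenCV's stock Haar cascade.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # assumed camera facing the audience
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        print(f"Viewers currently facing the sign: {len(faces)}")
    cap.release()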

October 2013 Embedded Vision Summit Technical Presentation: “Vision-Based Gesture User Interfaces,” Francis MacDougall, Qualcomm

Francis MacDougall, Senior Director of Technology at Qualcomm, presents the "Vision-Based Gesture User Interfaces" tutorial within the "Vision Applications" technical session at the October 2013 Embedded Vision Summit East. MacDougall explains how gestures fit into the spectrum of advanced user interface options, compares and contrasts the various 2-D and 3-D technologies (vision and other) available […]

October 2013 Embedded Vision Summit Technical Presentation: “Better Image Understanding Through Better Sensor Understanding,” Michael Tusch, Apical

Michael Tusch, Founder and CEO of Apical Imaging, presents the "Better Image Understanding Through Better Sensor Understanding" tutorial within the "Front-End Image Processing for Vision Applications" technical session at the October 2013 Embedded Vision Summit East. One of the main barriers to widespread use of embedded vision is its reliability. For example, systems which detect […]

October 2013 Embedded Vision Summit Technical Presentation: “Designing a Multi-Core Architecture Tailored for Pedestrian Detection Algorithms,” Tom Michiels, Synopsys

Tom Michiels, R&D Manager at Synopsys, presents the "Designing a Multi-Core Architecture Tailored for Pedestrian Detection Algorithms" tutorial within the "Algorithms and Implementations" technical session at the October 2013 Embedded Vision Summit East. Pedestrian detection is an important function in a wide range of applications, including automotive safety systems, mobile applications, and industrial automation. […]
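As a point of reference for the kind of workload being mapped onto a multi-core design, the sketch below runs a plain single-threaded software baseline using OpenCV's pretrained HOG-plus-linear-SVM people detector; this is not Synopsys's implementation, and the image path is a placeholder:

    # Illustrative only: pedestrian detection with OpenCV's default HOG + SVM detector.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    frame = cv2.imread("street.jpg")  # placeholder input image
    assert frame is not None, "could not read input image"

    # Multi-scale sliding-window detection; each box is one pedestrian candidate.
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("detections.jpg", frame)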

October 2013 Embedded Vision Summit Technical Presentation: “Embedded Lucas-Kanade Tracking: How it Works, How to Implement It, and How to Use It,” Goksel Dedeoglu, Texas Instruments

Goksel Dedeoglu, Embedded Vision R&D Manager at Texas Instruments, presents the "Embedded Lucas-Kanade Tracking: How it Works, How to Implement It, and How to Use It" tutorial within the "Algorithms and Implementations" technical session at the October 2013 Embedded Vision Summit East. This tutorial is intended for technical audiences interested in learning about the Lucas-Kanade […]
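For readers who want to experiment before watching the talk, the minimal sketch below runs a pyramidal Lucas-Kanade tracking loop with OpenCV's calcOpticalFlowPyrLK; the video file name and parameter values are placeholders, not settings from the presentation:

    # Illustrative only: track Shi-Tomasi corners across a clip with pyramidal LK.
    import cv2

    cap = cv2.VideoCapture("input.mp4")  # placeholder clip
    ok, frame = cap.read()
    assert ok, "could not read first frame"
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pick corner-like features to track (Shi-Tomasi "good features").
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)

    while True:
        ok, frame = cap.read()
        if not ok or pts is None or len(pts) == 0:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pyramidal Lucas-Kanade: estimate where each feature moved between frames.
        new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
        pts = new_pts[status.flatten() == 1].reshape(-1, 1, 2)
        prev_gray = gray

    cap.release()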

October 2013 Embedded Vision Summit Technical Presentation: “Using FPGAs to Accelerate 3D Vision Processing: A System Developer’s View,” Ken Lee, VanGogh Imaging

Ken Lee, CEO of VanGogh Imaging, presents the "Using FPGAs to Accelerate 3D Vision Processing: A System Developer's View" tutorial within the "Implementing Vision Systems" technical session at the October 2013 Embedded Vision Summit East. Embedded vision system designers must consider many factors in choosing a processor. This is especially true for 3D vision systems […]

October 2013 Embedded Vision Summit Technical Presentation: “Feature Detection: How It Works, When to Use It, and a Sample Implementation,” Marco Jacobs, videantis

Marco Jacobs, Technical Marketing Director at videantis, presents the "Feature Detection: How It Works, When to Use It, and a Sample Implementation" tutorial within the "Object and Feature Detection" technical session at the October 2013 Embedded Vision Summit East. Feature detection and tracking are key components of many computer vision applications. In this talk, Jacobs […]
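As a quick hands-on companion to the topic, the sketch below detects and matches ORB features between two frames with OpenCV; the image paths are placeholders, and ORB is used here only as a convenient example, not necessarily the detector discussed in the talk:

    # Illustrative only: detect and match ORB features between two frames.
    import cv2

    img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
    img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)
    assert img1 is not None and img2 is not None, "could not read input images"

    orb = cv2.ORB_create(nfeatures=500)       # detector plus binary descriptor
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Brute-force Hamming matching with cross-check to discard one-way matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    print(f"{len(matches)} feature matches found")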

“Embedding Computer Vision in Everyday Life,” a Keynote Presentation from iRobot

Mario E. Munich, Vice President of Advanced Development at iRobot, presents the "Embedding Computer Vision in Everyday Life" keynote at the October 2013 Embedded Vision Summit East. Munich speaks about adapting highly complex computer vision technologies to cost-effective consumer robotics applications. Munich currently manages iRobot's research and advanced development efforts. He was formerly the CTO […]
