Sensors and Cameras

“Introduction to Optics for Embedded Vision,” a Presentation from Jessica Gehlhar

Jessica Gehlhar, formerly an imaging engineer at Edmund Optics, presents the “Introduction to Optics for Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. This talk provides an introduction to optics for embedded vision system and algorithm developers. Gehlhar begins by presenting fundamental imaging lens specifications and quality metrics such as MTF (modulation transfer function). She explains […]

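As background on the MTF metric mentioned above (a standard textbook definition, not material from the talk itself), MTF expresses how much of a target's contrast a lens preserves at a given spatial frequency:

```latex
% Michelson modulation of a sinusoidal target at spatial frequency \nu
M(\nu) = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}},
\qquad
\mathrm{MTF}(\nu) = \frac{M_{\mathrm{image}}(\nu)}{M_{\mathrm{object}}(\nu)}
```

An MTF near 1 at a given frequency means detail at that scale passes through the lens nearly unattenuated; an MTF approaching 0 means that detail is blurred away.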

“Improving the Safety and Performance of Automated Vehicles Through Precision Localization,” a Presentation from VSI Labs

Phil Magney, founder of VSI Labs, presents the “Improving the Safety and Performance of Automated Vehicles Through Precision Localization” tutorial at the May 2019 Embedded Vision Summit. How does a self-driving car know where it is? Magney explains how autonomous vehicles localize themselves against their surroundings through the use of a variety of sensors along

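As a rough illustration of the kind of sensor fusion localization relies on (a minimal sketch under simplifying assumptions, not VSI Labs' method; all noise figures below are made-up placeholders), a scalar Kalman filter can blend dead-reckoned odometry with noisy GNSS fixes:

```python
# Minimal 1-D illustration: fuse wheel-odometry motion updates with noisy
# GNSS position fixes using a scalar Kalman filter. Values are placeholders.

def predict(x, p, dx, q):
    """Propagate position estimate x (variance p) by odometry step dx (noise q)."""
    return x + dx, p + q

def correct(x, p, z, r):
    """Blend in a GNSS fix z (noise r); return updated estimate and variance."""
    k = p / (p + r)            # Kalman gain: how much to trust the measurement
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0                # initial position estimate and its variance
for dx, gnss in [(1.0, 1.2), (1.0, 2.1), (1.0, 2.9)]:
    x, p = predict(x, p, dx, q=0.05)
    x, p = correct(x, p, gnss, r=0.5)
    print(f"fused position: {x:.2f} m (variance {p:.3f})")
```

Production localizers extend this idea to full vehicle pose and fold in additional sources such as lidar, cameras and HD maps.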

“Distance Estimation Solutions for ADAS and Automated Driving,” a Presentation from AImotive

Gergely Debreczeni, Chief Scientist at AImotive, presents the “Distance Estimation Solutions for ADAS and Automated Driving” tutorial at the May 2019 Embedded Vision Summit. Distance estimation is at the heart of advanced driver assistance systems (ADAS) and automated driving (AD). Simply stated, safe operation of vehicles requires robust distance estimation. Many different types of sensors

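One common way such systems estimate distance is stereo triangulation; for a rectified stereo pair the standard relation (not specific to AImotive's approach) is:

```latex
Z = \frac{f\,B}{d}
% Z: distance to the scene point, f: focal length in pixels,
% B: baseline between the two cameras, d: measured disparity in pixels
```

Because depth varies inversely with disparity, a fixed disparity error produces rapidly growing range error at long distances, which is one reason multiple sensor types are typically combined.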

“MediaTek’s Approach for Edge Intelligence,” a Presentation from MediaTek

Bing Yu, Senior Technical Manager and Architect at MediaTek, presents the “MediaTek’s Approach for Edge Intelligence” tutorial at the May 2019 Embedded Vision Summit. MediaTek has incorporated an AI processing unit (APU) alongside the traditional CPU and GPU in its SoC designs for the next wave of smart client devices (smartphones, cameras, appliances, cars, etc.).

“DNN Challenges and Approaches for L4/L5 Autonomous Vehicles,” a Presentation from Graphcore

Tom Wilson, Vice President of Automotive at Graphcore, presents the “DNN Challenges and Approaches for L4/L5 Autonomous Vehicles” tutorial at the May 2019 Embedded Vision Summit. The industry has made great strides in development of L4/L5 autonomous vehicles, but what’s available today falls far short of expectations set as recently as two to three years

“Eye Tracking for the Future: The Eyes Have It,” a Presentation from Parallel Rules

Peter Milford, President of Parallel Rules, presents the “Eye Tracking for the Future: The Eyes Have It” tutorial at the May 2019 Embedded Vision Summit. Eye interaction technologies complement augmented and virtual reality head-mounted displays. In this presentation, Milford reviews eye tracking technology, concentrating mainly on camera-based solutions and associated system requirements. Wearable eye tracking

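To give a flavor of the lowest-level step in a camera-based eye tracker (a toy sketch, not the method described in the talk; the input filename is hypothetical), the dark pupil can be located by thresholding a close-up eye image and taking the centroid of the largest dark blob:

```python
import cv2

# Toy pupil localization (illustrative only): the pupil is usually the darkest
# region of a close-up eye image, so threshold it and take the largest blob.
eye = cv2.imread("eye_crop.png", cv2.IMREAD_GRAYSCALE)        # hypothetical input
eye = cv2.GaussianBlur(eye, (7, 7), 0)                         # suppress sensor noise
_, mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)   # keep dark pixels
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

if contours:
    pupil = max(contours, key=cv2.contourArea)                 # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"estimated pupil center: ({cx:.1f}, {cy:.1f}) px")
```

Real trackers add robustness (glint detection, ellipse fitting, 3D eye models) on top of this kind of primitive.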

“Designing Your Next Vision Product Using a Systems Approach,” a Presentation from Teknique

Ben Bodley, CEO of Teknique, presents the “Designing Your Next Vision Product Using a Systems Approach” tutorial at the May 2019 Embedded Vision Summit. Today it’s easier than ever to create a credible demo of a new smart camera product for a specific application. But the distance from a demo to a robust product is

“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores

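The underlying idea of a tamper-evident record chain can be sketched with plain hashing (a generic illustration, not Basler's actual design; field names are made up): each record's hash covers both the frame digest and the previous record's hash, so altering any earlier frame breaks every later link:

```python
import hashlib, json, time

def chain_record(frame_bytes, prev_hash):
    """Build a tamper-evident record whose hash covers the frame AND the previous hash."""
    record = {
        "timestamp": time.time(),
        "frame_sha256": hashlib.sha256(frame_bytes).hexdigest(),
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

prev = "0" * 64                          # genesis value for the first record
ledger = []
for frame in [b"frame-0", b"frame-1"]:   # placeholder frame payloads
    rec = chain_record(frame, prev)
    ledger.append(rec)
    prev = rec["hash"]                   # altering any earlier frame breaks this link
```

A blockchain distributes and replicates such a chain so that no single "trusted third party" holds the only copy.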

“REAL3 Time of Flight: A New Differentiator for Mobile Phones,” a Presentation from Infineon Technologies

Walter Bell, 3D Imaging Application Engineer at Infineon Technologies, presents the “REAL3 Time of Flight: A New Differentiator for Mobile Phones” tutorial at the May 2019 Embedded Vision Summit. In 2019, 3D imaging has become mainstream in mobile phone cameras. What started in 2016 with the first two smartphones using an Infineon 3D time-of-flight (ToF)

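For background (standard time-of-flight relations, not REAL3-specific figures): direct ToF converts the measured round-trip time into range, while indirect, phase-based ToF recovers range from the phase shift of modulated illumination:

```latex
d = \frac{c\,\Delta t}{2}
\quad\text{(direct ToF)}
\qquad
d = \frac{c\,\varphi}{4\pi f_{\mathrm{mod}}}
\quad\text{(indirect ToF, modulation frequency } f_{\mathrm{mod}}\text{)}
```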

“Sensory Fusion for Scalable Indoor Navigation,” a Presentation from Brain Corp

Oleg Sinyavskiy, Director of Research and Development at Brain Corp, presents the “Sensory Fusion for Scalable Indoor Navigation” tutorial at the May 2019 Embedded Vision Summit. Indoor autonomous navigation requires using a variety of sensors in different modalities. Merging RGB, depth, lidar and odometry data streams to achieve autonomous operation requires a fusion of

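As a deliberately naive illustration of merging range data from two modalities (a sketch under simplifying assumptions, not Brain Corp's fusion pipeline), one could take the more conservative reading per bearing when, say, a lidar and a depth camera observe the same direction:

```python
# Illustrative only: fuse two range scans (e.g. lidar and depth camera) by
# keeping the closer, more conservative reading per bearing; None = no return.

def fuse_ranges(lidar_m, depth_m):
    """Per-bearing conservative fusion of two range scans in metres."""
    fused = []
    for a, b in zip(lidar_m, depth_m):
        readings = [r for r in (a, b) if r is not None]
        fused.append(min(readings) if readings else None)
    return fused

print(fuse_ranges([2.4, None, 5.0], [2.6, 1.1, None]))   # -> [2.4, 1.1, 5.0]
```

Real fusion stacks instead weight each source by its noise model and account for differing fields of view, latencies and failure modes.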
