Sensors and Cameras

“Machine Learning at the Edge in Smart Factories Using TI Sitara Processors,” a Presentation from Texas Instruments

Manisha Agrawal, Software Applications Engineer at Texas Instruments, presents the “Machine Learning at the Edge in Smart Factories Using TI Sitara Processors” tutorial at the May 2019 Embedded Vision Summit. Whether it’s called “Industry 4.0,” “industrial internet of things” (IIOT) or “smart factories,” a fundamental shift is underway in manufacturing: factories are becoming smarter. This […]


“Fundamental Security Challenges of Embedded Vision,” a Presentation from Synopsys

Mike Borza, Principal Security Technologist at Synopsys, presents the “Fundamental Security Challenges of Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. As facial recognition, surveillance and smart vehicles become an accepted part of our daily lives, product and chip designers are coming to grips with the business need to secure the data that […]


“Introduction to Optics for Embedded Vision,” a Presentation from Jessica Gehlhar

Jessica Gehlhar, formerly an imaging engineer at Edmund Optics, presents the “Introduction to Optics for Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. This talk provides an introduction to optics for embedded vision system and algorithm developers. Gehlhar begins by presenting fundamental imaging lens specifications and quality metrics such as MTF. She explains […]
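As a rough illustration of the kind of lens math an optics introduction covers, here is a minimal sketch (an assumption for illustration, not taken from the talk) that computes angular field of view from focal length and sensor width using the thin-lens/pinhole approximation:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angular field of view under the pinhole approximation:
    FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: an 8 mm lens on a ~5.76 mm wide sensor gives roughly a 40-degree
# horizontal field of view.
print(round(horizontal_fov_deg(8.0, 5.76), 1))  # -> 39.6
```

The same relation read in reverse is how a designer picks a focal length to hit a required field of view for a given sensor format.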


“Improving the Safety and Performance of Automated Vehicles Through Precision Localization,” a Presentation from VSI Labs

Phil Magney, founder of VSI Labs, presents the “Improving the Safety and Performance of Automated Vehicles Through Precision Localization” tutorial at the May 2019 Embedded Vision Summit. How does a self-driving car know where it is? Magney explains how autonomous vehicles localize themselves against their surroundings through the use of a variety of sensors along […]


“Distance Estimation Solutions for ADAS and Automated Driving,” a Presentation from AImotive

Gergely Debreczeni, Chief Scientist at AImotive, presents the “Distance Estimation Solutions for ADAS and Automated Driving” tutorial at the May 2019 Embedded Vision Summit. Distance estimation is at the heart of advanced driver assistance systems (ADAS) and automated driving (AD). Simply stated, safe operation of vehicles requires robust distance estimation. Many different types of sensors […]
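One of the classic camera-based approaches to distance estimation is stereo triangulation. As a hypothetical sketch (not AImotive's method), the standard rectified-stereo relation Z = f·B/d converts pixel disparity into metric depth:

```python
def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d,
    where f is the focal length in pixels, B the camera baseline in
    meters, and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 12 cm baseline, 20 px disparity -> ~4.2 m
print(stereo_depth_m(700.0, 0.12, 20.0))
```

The inverse relationship between disparity and depth is why stereo range error grows quadratically with distance, which is one reason production systems fuse multiple sensor types.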


“MediaTek’s Approach for Edge Intelligence,” a Presentation from MediaTek

Bing Yu, Senior Technical Manager and Architect at MediaTek, presents the “MediaTek’s Approach for Edge Intelligence” tutorial at the May 2019 Embedded Vision Summit. MediaTek has incorporated an AI processing unit (APU) alongside the traditional CPU and GPU in its SoC designs for the next wave of smart client devices (smartphones, cameras, appliances, cars, etc.).


“DNN Challenges and Approaches for L4/L5 Autonomous Vehicles,” a Presentation from Graphcore

Tom Wilson, Vice President of Automotive at Graphcore, presents the “DNN Challenges and Approaches for L4/L5 Autonomous Vehicles” tutorial at the May 2019 Embedded Vision Summit. The industry has made great strides in development of L4/L5 autonomous vehicles, but what’s available today falls far short of expectations set as recently as two to three years […]


“Eye Tracking for the Future: The Eyes Have It,” a Presentation from Parallel Rules

Peter Milford, President of Parallel Rules, presents the “Eye Tracking for the Future: The Eyes Have It” tutorial at the May 2019 Embedded Vision Summit. Eye interaction technologies complement augmented and virtual reality head-mounted displays. In this presentation, Milford reviews eye tracking technology, concentrating mainly on camera-based solutions and associated system requirements. Wearable eye tracking […]


“Designing Your Next Vision Product Using a Systems Approach,” a Presentation from Teknique

Ben Bodley, CEO of Teknique, presents the “Designing Your Next Vision Product Using a Systems Approach” tutorial at the May 2019 Embedded Vision Summit. Today it’s easier than ever to create a credible demo of a new smart camera product for a specific application. But the distance from a demo to a robust product is […]


“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores […]
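The core primitive behind blockchain-style data integrity is a hash chain: each record embeds the digest of its predecessor, so modifying any earlier entry invalidates every digest after it. As a minimal sketch of that idea (a generic illustration, not Basler's implementation):

```python
import hashlib
import json

def chain_append(chain: list, reading: dict) -> list:
    """Append a sensor reading to a hash chain. Each entry stores the
    previous entry's SHA-256 digest, so tampering with any earlier
    reading breaks verification of all subsequent entries."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"reading": reading, "prev": prev_hash}
    # Canonical JSON (sorted keys) so the digest is reproducible.
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    chain.append({**payload, "hash": digest})
    return chain

chain: list = []
chain_append(chain, {"sensor": "cam0", "frame": 1})
chain_append(chain, {"sensor": "cam0", "frame": 2})
print(chain[1]["prev"] == chain[0]["hash"])  # True: entries are linked
```

A verifier can recompute each digest from the stored payloads; distributing those digests (for example, anchoring them on a ledger) is what removes the need for a single trusted third party.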


