Summit 2019

“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores […]
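The abstract is truncated here, but its core idea, securing sensor data without relying on a trusted third party, can be illustrated with a minimal hash-chain sketch. This is a simplified, hypothetical illustration of blockchain-style tamper evidence, not Basler's actual design; the field names are assumptions:

```python
import hashlib
import json

def make_block(payload: dict, prev_hash: str) -> dict:
    """Link a sensor reading to the previous block via its hash."""
    block = {"payload": payload, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampered payload breaks a link."""
    for i, block in enumerate(chain):
        body = {"payload": block["payload"], "prev_hash": block["prev_hash"]}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if digest != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Chain two (hypothetical) camera readings together.
genesis = make_block({"frame_id": 1, "mean_intensity": 0.42}, prev_hash="0" * 64)
second = make_block({"frame_id": 2, "mean_intensity": 0.44}, prev_hash=genesis["hash"])
chain = [genesis, second]
```

Because each block's hash covers the previous block's hash, altering any earlier reading invalidates every later link, which is the property that lets a verifier in the cloud trust data from the edge device.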


“Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit,” a Presentation from Au-Zone Technologies

Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, presents the “Tools and Techniques for Optimizing DNNs on Arm-based Processors with Au-Zone’s DeepView ML Toolkit” tutorial at the May 2019 Embedded Vision Summit. In this presentation, Taylor describes methods and tools for developing, profiling and optimizing neural network solutions for deployment on Arm MCUs, CPUs and […]


“REAL3 Time of Flight: A New Differentiator for Mobile Phones,” a Presentation from Infineon Technologies

Walter Bell, 3D Imaging Application Engineer at Infineon Technologies, presents the “REAL3 Time of Flight: A New Differentiator for Mobile Phones” tutorial at the May 2019 Embedded Vision Summit. In 2019, 3D imaging has become mainstream in mobile phone cameras. What started in 2016 with the first two smartphones using an Infineon 3D time-of-flight (ToF) […]
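As background to the talk: continuous-wave ToF sensors such as Infineon's REAL3 family estimate distance from the phase shift of modulated light, d = c·Δφ / (4π·f_mod), with an unambiguous range of c / (2·f_mod) before the phase wraps. A small numeric sketch (the 80 MHz modulation frequency is an assumption chosen for illustration, not a REAL3 specification):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad: float, f_mod_hz: float) -> float:
    """Distance implied by the measured phase shift of a CW-modulated signal."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum distance before the 2*pi phase ambiguity kicks in."""
    return C / (2 * f_mod_hz)

# At an assumed 80 MHz modulation, a pi/2 phase shift corresponds to:
d = tof_distance(math.pi / 2, 80e6)   # ~0.47 m
r = unambiguous_range(80e6)           # ~1.87 m
```

The trade-off this exposes is why real sensors often combine modulation frequencies: higher f_mod gives finer depth resolution but a shorter unambiguous range.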


“Object Detection for Embedded Markets,” a Presentation from Imagination Technologies

Paul Brasnett, PowerVR Business Development Director for Vision and AI at Imagination Technologies, presents the “Object Detection for Embedded Markets” tutorial at the May 2019 Embedded Vision Summit. While image classification was the breakthrough use case for deep learning-based computer vision, today it has a limited number of real-world applications. In contrast, object detection is […]


“Sensory Fusion for Scalable Indoor Navigation,” a Presentation from Brain Corp

Oleg Sinyavskiy, Director of Research and Development at Brain Corp, presents the “Sensory Fusion for Scalable Indoor Navigation” tutorial at the May 2019 Embedded Vision Summit. Indoor autonomous navigation requires using a variety of sensors in different modalities. Merging together RGB, depth, lidar and odometry data streams to achieve autonomous operation requires a fusion of […]
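The simplest form of the fusion idea mentioned above is a complementary filter: trust the smooth but drifting odometry estimate in the short term, and pull it toward an absolute correction (e.g. from lidar scan matching) over time. A minimal sketch, not Brain Corp's actual pipeline; the poses and weighting are illustrative assumptions:

```python
def fuse_pose(odom_pose, lidar_pose, alpha=0.9):
    """Complementary filter over (x, y) position estimates.

    alpha close to 1 weights the high-rate odometry estimate heavily;
    the (1 - alpha) share pulls the result toward the lidar correction.
    """
    return tuple(alpha * o + (1 - alpha) * l
                 for o, l in zip(odom_pose, lidar_pose))

# Odometry says (x, y) = (2.00, 1.00); lidar scan matching says (2.10, 0.95).
fused = fuse_pose((2.00, 1.00), (2.10, 0.95))  # ~(2.01, 0.995)
```

Production systems typically replace the fixed alpha with a Kalman-style filter whose weights track each sensor's estimated uncertainty, but the blending structure is the same.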


“Enabling the Next Kitchen Experience Through Embedded Vision,” a Presentation from Whirlpool

Sugosh Venkataraman, Vice President of Technology at Whirlpool, presents the “Enabling the Next Kitchen Experience Through Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. Our kitchens are the hubs where we spend quality time with family and friends, preparing and eating meals. Today, instructions for cooking a particular meal are just a few […]


“Applied Depth Sensing with Intel RealSense,” a Presentation from Intel

Sergey Dorodnicov, Software Architect at Intel, presents the “Applied Depth Sensing with Intel RealSense” tutorial at the May 2019 Embedded Vision Summit. As robust depth cameras become more affordable, many new products will benefit from true 3D vision. This presentation highlights the benefits of depth sensing for tasks such as autonomous navigation, collision avoidance and […]


“A Self-service Platform to Deploy State-of-the-art Deep Learning Models in Under 30 Minutes,” a Presentation from Xnor.ai

Peter Zatloukal, VP of Engineering at Xnor.ai, presents the “A Self-service Platform to Deploy State-of-the-art Deep Learning Models in Under 30 Minutes” tutorial at the May 2019 Embedded Vision Summit. The first-of-its-kind, self-service platform described in this presentation makes it possible for software and hardware developers—even those who aren’t skilled in artificial intelligence—to deploy hyper-efficient, […]


“Teaching Machines to See, Understand, Describe and Predict Sports Games in Real Time,” a Presentation from Sportlogiq

Mehrsan Javan, CTO of Sportlogiq, presents the “Teaching Machines to See, Understand, Describe and Predict Sports Games in Real Time” tutorial at the May 2019 Embedded Vision Summit. Sports analytics is about observing, understanding and describing the game in an intelligent manner. In practice, this means designing a fully automated, robust, end-to-end pipeline, from visual input […]


“Addressing Corner Cases in Embedded Computer Vision Applications,” a Presentation from Netradyne

David Julian, CTO and Founder of Netradyne, presents the “Addressing Corner Cases in Embedded Computer Vision Applications” tutorial at the May 2019 Embedded Vision Summit. Many embedded vision applications require solutions that are robust in the face of very diverse real-world inputs. For example, in automotive applications, vision-based safety systems may encounter unusual configurations of […]

