Algorithms

“Harnessing the Edge and the Cloud Together for Visual AI,” a Presentation from Au-Zone Technologies

Sébastien Taylor, Vision Technology Architect at Au-Zone Technologies, presents the “Harnessing the Edge and the Cloud Together for Visual AI” tutorial at the May 2018 Embedded Vision Summit. Embedded developers are increasingly comfortable deploying trained neural networks as static elements in edge devices, as well as using cloud-based vision services to implement visual intelligence remotely. […]
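
(Not from the talk itself: as a rough sketch of one common way to combine the two approaches, the Python snippet below runs a compact model on the device and only escalates low-confidence frames to a cloud vision service. The local model, the confidence threshold and the endpoint URL are hypothetical placeholders.)

```python
# Hypothetical edge-first inference with cloud fallback. The local model,
# the 0.6 confidence threshold and the endpoint URL are illustrative only.
import requests  # real library; the service behind the URL is made up


def classify_locally(frame_bytes):
    """Placeholder for an on-device model (e.g., a quantized CNN).

    Returns (label, confidence); here we fake a low-confidence result.
    """
    return "unknown", 0.42


def classify_in_cloud(frame_bytes, url="https://vision.example.com/classify"):
    """Send the frame to a (hypothetical) cloud vision endpoint."""
    resp = requests.post(url, data=frame_bytes, timeout=2.0)
    resp.raise_for_status()
    result = resp.json()
    return result["label"], result["confidence"]


def classify(frame_bytes, threshold=0.6):
    """Edge-first: only escalate to the cloud when the device is unsure."""
    label, conf = classify_locally(frame_bytes)
    if conf >= threshold:
        return label, conf, "edge"
    label, conf = classify_in_cloud(frame_bytes)
    return label, conf, "cloud"
```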

“Improving and Implementing Traditional Computer Vision Algorithms Using DNN Techniques,” a Presentation from Imagination Technologies

Paul Brasnett, Senior Research Manager for Vision and AI in the PowerVR Division at Imagination Technologies, presents the “Improving and Implementing Traditional Computer Vision Algorithms Using DNN Techniques” tutorial at the May 2018 Embedded Vision Summit. There has been a very significant shift in the computer vision industry over the past few years, from traditional […]
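
(As a small illustration of the underlying idea rather than material from the talk: the sketch below expresses a classical Sobel edge filter as a fixed-weight convolution, the same primitive that DNN frameworks and accelerators are built around.)

```python
# A classical Sobel-x filter written as a plain 2D sliding-window operation,
# the same primitive a DNN framework would run as a fixed-weight conv layer.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float32)


def conv2d_valid(image, kernel):
    """Naive 'valid' cross-correlation, as DNN 'conv' layers compute it."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out


if __name__ == "__main__":
    img = np.random.rand(8, 8).astype(np.float32)
    edges_x = conv2d_valid(img, SOBEL_X)
    print(edges_x.shape)  # (6, 6)
```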

“Architecting a Smart Home Monitoring System with Millions of Cameras,” a Presentation from Comcast

Hongcheng Wang, Senior Manager of Technical R&D at Comcast, presents the “Architecting a Smart Home Monitoring System with Millions of Cameras” tutorial at the May 2018 Embedded Vision Summit. Video monitoring is a critical capability for the smart home. With millions of cameras streaming to the cloud, efficient and scalable video analytics becomes essential. To […]
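
(General background rather than Comcast's specific design: one common way to keep cloud-side analytics tractable at this scale is to gate uploads on cheap motion detection at the edge, so that only frames with activity reach the heavier cloud pipeline. A minimal frame-differencing sketch:)

```python
# Minimal motion gating via frame differencing: only frames whose mean
# absolute change exceeds a threshold would be sent on for cloud analytics.
# The 12.0 threshold is arbitrary and would be tuned per camera in practice.
import numpy as np


def has_motion(prev_gray, curr_gray, threshold=12.0):
    """Return True when the average pixel change suggests activity."""
    diff = np.abs(curr_gray.astype(np.float32) - prev_gray.astype(np.float32))
    return float(diff.mean()) > threshold


if __name__ == "__main__":
    prev = np.zeros((120, 160), dtype=np.uint8)
    curr = prev.copy()
    curr[40:80, 60:100] = 200            # simulate an object entering the scene
    print(has_motion(prev, curr))        # True: this frame would be uploaded
```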

“The Perspective Transform in Embedded Vision,” a Presentation from Cadence

Shrinivas Gadkari, Design Engineering Director, and Aditya Joshi, Lead Design Engineer, both of Cadence, present the “Perspective Transform in Embedded Vision” tutorial at the May 2018 Embedded Vision Summit. This presentation focuses on the perspective transform and its role in many state-of-the-art embedded vision applications like video stabilization, high dynamic range (HDR) imaging and super […]
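
(As a quick refresher on the operation itself, not material from the slides: a perspective transform maps image coordinates through a 3x3 homography, which OpenCV exposes directly. The point correspondences below are arbitrary example values.)

```python
# Applying a perspective transform (3x3 homography) with OpenCV.
import cv2
import numpy as np

# Source quadrilateral (e.g., a tilted sign) and the rectangle to map it onto.
src = np.float32([[56, 65], [368, 52], [28, 387], [389, 390]])
dst = np.float32([[0, 0], [300, 0], [0, 300], [300, 300]])

H = cv2.getPerspectiveTransform(src, dst)   # 3x3 homography matrix

# Stand-in image; each output pixel is sampled from the input at the
# projectively mapped location (x', y') = ((h11*x+h12*y+h13)/w, (h21*x+h22*y+h23)/w).
img = np.random.randint(0, 256, (400, 400, 3), dtype=np.uint8)
warped = cv2.warpPerspective(img, H, (300, 300))
print(H.shape, warped.shape)                 # (3, 3) (300, 300, 3)
```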

“Utilizing Neural Networks to Validate Display Content in Mission Critical Systems,” a Presentation from VeriSilicon

Shang-Hung Lin, Vice President of Vision and Imaging Products at VeriSilicon, presents the “Utilizing Neural Networks to Validate Display Content in Mission Critical Systems” tutorial at the May 2018 Embedded Vision Summit. Mission critical display systems in aerospace, automotive and industrial markets require validation of the content presented to the user, in order to enable […]

“The Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the “Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge” tutorial at the May 2018 Embedded Vision Summit. Regardless of the processing topology (distributed, centralized or hybrid), sensor processing in automotive is an edge compute problem. However, with […]

“Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020,” a Presentation from Algolux

Felix Heide, CTO and Co-founder of Algolux, presents the “Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020” tutorial at the May 2018 Embedded Vision Summit. ADAS and autonomous driving systems rely on sophisticated sensor, image processing and neural-network based perception technologies. This has resulted in effective driver assistance capabilities and […]

“Rapid Development of Efficient Vision Applications Using the Halide Language and CEVA Processors,” a Presentation from CEVA and mPerpetuo

Yair Siegel, Director of Business Development at CEVA, and Gary Gitelson, VP of Engineering at mPerpetuo, present the “Rapid Development of Efficient Vision Applications Using the Halide Language and CEVA Processors” tutorial at the May 2018 Embedded Vision Summit. Halide is a domain-specific programming language for imaging and vision applications that has been adopted by […]
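
(For readers unfamiliar with Halide, and not taken from the talk: the sketch below, written against the official Halide Python bindings, shows the language's central idea of declaring the algorithm once and stating scheduling choices such as vectorization and parallelism separately. Exact binding details, e.g. the realize() signature, can vary by version.)

```python
# A tiny Halide pipeline using the official Python bindings (pip install halide).
# Note: API details such as realize() taking a list of sizes may vary by version.
import halide as hl

x, y = hl.Var("x"), hl.Var("y")

# Algorithm: a synthetic float image and a 3-tap horizontal box blur over it.
inp = hl.Func("inp")
inp[x, y] = hl.cast(hl.Float(32), x + y)

blur = hl.Func("blur")
blur[x, y] = (inp[x - 1, y] + inp[x, y] + inp[x + 1, y]) / 3.0

# Schedule: stated separately from the algorithm above.
blur.vectorize(x, 8)
blur.parallel(y)

out = blur.realize([640, 480])   # compile and run; returns a Halide buffer
print(out[10, 10])
```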

“Enabling Cross-platform Deep Learning Applications with the Intel CV SDK,” a Presentation from Intel

Yury Gorbachev, Principal Engineer and the Lead Architect for the Computer Vision SDK at Intel, presents the “Enabling Cross-platform Deep Learning Applications with the Intel CV SDK” tutorial at the May 2018 Embedded Vision Summit. Intel offers a wide array of processors for computer vision and deep learning at the edge, including CPUs, GPUs, VPUs […]

“Computer Vision Hardware Acceleration for Driver Assistance,” a Presentation from Bosch

Markus Tremmel, Chief Expert for ADAS at Bosch, presents the “Computer Vision Hardware Acceleration for Driver Assistance” tutorial at the May 2018 Embedded Vision Summit. With highly automated and fully automated driver assistance systems just around the corner, next-generation ADAS sensors and central ECUs will have much higher safety and functional requirements to cope […]
