Tools

“Architecting a Smart Home Monitoring System with Millions of Cameras,” a Presentation from Comcast

Hongcheng Wang, Senior Manager of Technical R&D at Comcast, presents the “Architecting a Smart Home Monitoring System with Millions of Cameras” tutorial at the May 2018 Embedded Vision Summit. Video monitoring is a critical capability for the smart home. With millions of cameras streaming to the cloud, efficient and scalable video analytics becomes essential. To […]


“The Perspective Transform in Embedded Vision,” a Presentation from Cadence

Shrinivas Gadkari, Design Engineering Director, and Aditya Joshi, Lead Design Engineer, both of Cadence, present the “Perspective Transform in Embedded Vision” tutorial at the May 2018 Embedded Vision Summit. This presentation focuses on the perspective transform and its role in many state-of-the-art embedded vision applications like video stabilization, high dynamic range (HDR) imaging and super resolution […]
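For readers unfamiliar with the operation itself: a perspective transform maps image coordinates through a 3x3 homography matrix, with a divide by the third homogeneous coordinate at every pixel. The sketch below is not taken from the presentation; it simply assumes a standard OpenCV build and illustrates the common case of rectifying a tilted quadrilateral onto an axis-aligned rectangle.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

int main() {
    // Placeholder 640x480 frame standing in for a camera image.
    cv::Mat src(480, 640, CV_8UC3, cv::Scalar::all(0));

    // Four corners of a tilted quadrilateral in the source image, and the
    // axis-aligned rectangle we want them mapped onto.
    cv::Point2f srcQuad[4] = { {100, 80}, {540, 60}, {600, 420}, {60, 440} };
    cv::Point2f dstQuad[4] = { {0, 0},    {639, 0},  {639, 479}, {0, 479} };

    // Solve for the 3x3 homography H such that dst ~ H * src in homogeneous
    // coordinates. The per-pixel divide by the third homogeneous coordinate
    // is what distinguishes a perspective warp from an affine one.
    cv::Mat H = cv::getPerspectiveTransform(srcQuad, dstQuad);

    // Warp the whole frame through H.
    cv::Mat warped;
    cv::warpPerspective(src, warped, H, src.size());
    return 0;
}
```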


“Utilizing Neural Networks to Validate Display Content in Mission Critical Systems,” a Presentation from VeriSilicon

Shang-Hung Lin, Vice President of Vision and Imaging Products at VeriSilicon, presents the “Utilizing Neural Networks to Validate Display Content in Mission Critical Systems” tutorial at the May 2018 Embedded Vision Summit. Mission critical display systems in aerospace, automotive and industrial markets require validation of the content presented to the user, in order to enable […]


“The Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge,” a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the “Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge” tutorial at the May 2018 Embedded Vision Summit. Regardless of the processing topology (distributed, centralized or hybrid), sensor processing in automotive is an edge compute problem. However, with […]


“Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020,” a Presentation from Algolux

Felix Heide, CTO and Co-founder of Algolux, presents the “Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020” tutorial at the May 2018 Embedded Vision Summit. ADAS and autonomous driving systems rely on sophisticated sensor, image processing and neural-network-based perception technologies. This has resulted in effective driver assistance capabilities and […]


“Rapid Development of Efficient Vision Applications Using the Halide Language and CEVA Processors,” a Presentation from CEVA and mPerpetuo

Yair Siegel, Director of Business Development at CEVA, and Gary Gitelson, VP of Engineering at mPerpetuo, present the “Rapid Development of Efficient Vision Applications Using the Halide Language and CEVA Processors” tutorial at the May 2018 Embedded Vision Summit. Halide is a domain-specific programming language for imaging and vision applications that has been adopted by […]
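To give a flavor of what that looks like in practice, here is a minimal sketch (not taken from the talk, and not CEVA-specific; it assumes only a stock Halide installation) showing Halide’s defining feature: the algorithm and the schedule that maps it onto a target processor are written separately.

```cpp
#include "Halide.h"
using namespace Halide;

int main() {
    // Toy 640x480 8-bit input filled with a gradient so the pipeline has data.
    Buffer<uint8_t> input(640, 480);
    for (int yy = 0; yy < input.height(); yy++)
        for (int xx = 0; xx < input.width(); xx++)
            input(xx, yy) = (uint8_t)((xx + yy) & 0xff);

    Var x("x"), y("y");

    // Algorithm: a clamped 3-tap horizontal box blur (what to compute).
    Func blur_x("blur_x");
    Expr xm = clamp(x - 1, 0, input.width() - 1);
    Expr xp = clamp(x + 1, 0, input.width() - 1);
    blur_x(x, y) = cast<uint8_t>((cast<uint16_t>(input(xm, y)) +
                                  cast<uint16_t>(input(x, y)) +
                                  cast<uint16_t>(input(xp, y))) / 3);

    // Schedule: how to compute it. Vectorize along rows and run rows in
    // parallel; retargeting a different processor means changing only this
    // part, not the algorithm above.
    blur_x.vectorize(x, 16).parallel(y);

    Buffer<uint8_t> output = blur_x.realize({input.width(), input.height()});
    return 0;
}
```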


“Enabling Cross-platform Deep Learning Applications with the Intel CV SDK,” a Presentation from Intel

Yury Gorbachev, Principal Engineer and the Lead Architect for the Computer Vision SDK at Intel, presents the “Enabling Cross-platform Deep Learning Applications with the Intel CV SDK” tutorial at the May 2018 Embedded Vision Summit. Intel offers a wide array of processors for computer vision and deep learning at the edge, including CPUs, GPUs, VPUs […]


“Computer Vision Hardware Acceleration for Driver Assistance,” a Presentation from Bosch

Markus Tremmel, Chief Expert for ADAS at Bosch, presents the “Computer Vision Hardware Acceleration for Driver Assistance” tutorial at the May 2018 Embedded Vision Summit. With highly automated and fully automated driver assistance systems just around the corner, next-generation ADAS sensors and central ECUs will have much higher safety and functional requirements to cope […]


Computer Vision for Augmented Reality in Embedded Designs

Augmented reality (AR) and related technologies and products are becoming increasingly popular and prevalent, led by their adoption in smartphones, tablets and other mobile computing and communications devices. While developers of more deeply embedded platforms are also motivated to incorporate AR capabilities in their products, the comparative scarcity of processing, memory, storage, and networking resources […]


“The Roomba 980: Computer Vision Meets Consumer Robotics,” a Presentation from iRobot

Mario Munich, Senior Vice President of Technology at iRobot, presents the “Roomba 980: Computer Vision Meets Consumer Robotics” tutorial at the May 2018 Embedded Vision Summit. In 2015, iRobot launched the Roomba 980, introducing intelligent visual navigation to its successful line of vacuum cleaning robots. The availability of affordable electro-mechanical components, powerful embedded microprocessors and […]


