Software

“Can We Have Both Safety and Performance in AI for Autonomous Vehicles?,” a Presentation from Codeplay Software

Andrew Richards, CEO and Co-founder of Codeplay Software, presents the “Can We Have Both Safety and Performance in AI for Autonomous Vehicles?” tutorial at the May 2019 Embedded Vision Summit. The need for ensuring safety in AI subsystems within autonomous vehicles is obvious. How to achieve it is not. Standard safety engineering tools are designed […]

“Memory-centric Hardware Acceleration for Machine Intelligence,” a Presentation from Crossbar

Sylvain Dubois, Vice President of Business Development and Marketing at Crossbar, presents the “Memory-centric Hardware Acceleration for Machine Intelligence” tutorial at the May 2019 Embedded Vision Summit. Even the most advanced AI chip architectures suffer from performance and energy efficiency limitations caused by the memory bottleneck between computing cores and data. Most state-of-the-art CPUs, GPUs, […]

“DNN Challenges and Approaches for L4/L5 Autonomous Vehicles,” a Presentation from Graphcore

Tom Wilson, Vice President of Automotive at Graphcore, presents the “DNN Challenges and Approaches for L4/L5 Autonomous Vehicles” tutorial at the May 2019 Embedded Vision Summit. The industry has made great strides in the development of L4/L5 autonomous vehicles, but what’s available today falls far short of expectations set as recently as two to three years […]

“Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications,” a Presentation from Qualcomm

Robert Lay, Computer Vision and Camera Product Manager at Qualcomm, presents the “Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications” tutorial at the May 2019 Embedded Vision Summit. Advances in imaging quality and features are accelerating, thanks to hybrid approaches that combine classical computer vision and deep learning algorithms and that take advantage of […]

“Dynamically Reconfigurable Processor Technology for Vision Processing,” a Presentation from Renesas

Yoshio Sato, Senior Product Marketing Manager in the Industrial Business Unit at Renesas, presents the “Dynamically Reconfigurable Processor Technology for Vision Processing” tutorial at the May 2019 Embedded Vision Summit. The Dynamically Reconfigurable Processing (DRP) block in the Arm Cortex-A9-based RZ/A2M MPU accelerates image processing algorithms with spatially pipelined, time-multiplexed, reconfigurable-hardware compute resources.

“Eye Tracking for the Future: The Eyes Have It,” a Presentation from Parallel Rules

Peter Milford, President of Parallel Rules, presents the “Eye Tracking for the Future: The Eyes Have It” tutorial at the May 2019 Embedded Vision Summit. Eye interaction technologies complement augmented and virtual reality head-mounted displays. In this presentation, Milford reviews eye tracking technology, concentrating mainly on camera-based solutions and associated system requirements. Wearable eye tracking […]

“Fundamentals of Monocular SLAM,” a Presentation from Cadence

Shrinivas Gadkari, Design Engineering Director at Cadence, presents the “Fundamentals of Monocular SLAM” tutorial at the May 2019 Embedded Vision Summit. Simultaneous Localization and Mapping (SLAM) refers to a class of algorithms that enables a device with one or more cameras and/or other sensors to create an accurate map of its surroundings, to determine the […]

“Fast and Accurate RMNet: A New Neural Network for Embedded Vision,” a Presentation from Intel

Ilya Krylov, Software Engineering Manager at Intel, presents the “Fast and Accurate RMNet: A New Neural Network for Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. Usually, the top places in deep learning challenges are won by huge neural networks that require massive amounts of data and computation, making them impractical for use […]

“Hardware-aware Deep Neural Network Design,” a Presentation from Facebook

Peter Vajda, Research Manager at Facebook, presents the “Hardware-aware Deep Neural Network Design” tutorial at the May 2019 Embedded Vision Summit. A central problem in the deployment of deep neural networks is maximizing accuracy within the compute performance constraints of embedded devices. In this talk, Vajda discusses approaches to addressing this challenge based on automated […]

“Training Data for Your CNN: What You Need and How to Get It,” a Presentation from Aquifi

Carlo Dal Mutto, CTO of Aquifi, presents the “Training Data for Your CNN: What You Need and How to Get It” tutorial at the May 2019 Embedded Vision Summit. A fundamental building block for AI development is the creation of a proper training set to allow effective training of neural nets. Developing such a training […]

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone

+1 (925) 954-1411