Enabling Technologies

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

“Edge Inferencing—Scalability with Intel Vision Accelerator Design Cards,” a Presentation from Intel

Rama Karamsetty, Global Marketing Manager at Intel, presents the “Edge Inferencing—Scalability with Intel Vision Accelerator Design Cards” tutorial at the September 2020 Embedded Vision Summit. Are you trying to deploy AI solutions at the edge, but running into scalability challenges that are making it difficult to meet your performance, power and price targets without creating […]

“Getting Efficient DNN Inference Performance: Is It Really About the TOPS?,” a Presentation from Intel

Gary Brown, Director of AI Marketing at Intel, presents the “Getting Efficient DNN Inference Performance: Is It Really About the TOPS?” tutorial at the September 2020 Embedded Vision Summit. This presentation looks at how performance is measured among deep learning inference platforms, starting with the simple peak TOPS metric, why it’s used and why it […]
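
Peak TOPS is usually just arithmetic: the number of multiply-accumulate (MAC) units, times two operations per MAC, times the clock rate; what a network actually achieves also depends on how well it keeps those MACs busy. The short Python sketch below illustrates that gap. All of the hardware and model numbers (MAC count, clock, utilization, operations per frame) are made-up placeholders, not figures from the presentation.

```python
# Back-of-the-envelope comparison of peak TOPS vs. sustained throughput.
# Every number below is a hypothetical placeholder, for illustration only.

NUM_MACS = 4096        # multiply-accumulate units in the accelerator
CLOCK_GHZ = 1.0        # clock frequency, GHz
UTILIZATION = 0.35     # fraction of cycles the MAC array is actually busy

# Each MAC counts as 2 ops (one multiply + one add), so:
peak_tops = NUM_MACS * 2 * CLOCK_GHZ * 1e9 / 1e12
effective_tops = peak_tops * UTILIZATION

# Frames per second for a model needing a given number of ops per frame.
OPS_PER_FRAME = 8e9    # ~8 GOPs/frame, a ballpark figure for a mid-size CNN
fps = effective_tops * 1e12 / OPS_PER_FRAME

print(f"peak:       {peak_tops:.1f} TOPS")
print(f"sustained:  {effective_tops:.2f} TOPS at {UTILIZATION:.0%} utilization")
print(f"throughput: {fps:.0f} frames/s for an {OPS_PER_FRAME / 1e9:.0f} GOP model")
```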

“Game Changing Depth Sensing Technique Enables Simpler, More Flexible 3D Solutions,” a Presentation from Magik Eye

Takeo Miyazawa, Founder and CEO of Magik Eye, presents the “Game Changing Depth Sensing Technique Enables Simpler, More Flexible 3D Solutions” tutorial at the May 2019 Embedded Vision Summit. Magik Eye is a global team of computer vision veterans who have developed a new method to determine depth from light directly without the need to […]

“Machine Learning at the Edge in Smart Factories Using TI Sitara Processors,” a Presentation from Texas Instruments

Manisha Agrawal, Software Applications Engineer at Texas Instruments, presents the “Machine Learning at the Edge in Smart Factories Using TI Sitara Processors” tutorial at the May 2019 Embedded Vision Summit. Whether it’s called “Industry 4.0,” “industrial internet of things” (IIoT) or “smart factories,” a fundamental shift is underway in manufacturing: factories are becoming smarter. This […]

“Using High-level Synthesis to Bridge the Gap Between Deep Learning Frameworks and Custom Hardware Accelerators,” a Presentation from Mentor

Michael Fingeroff, HLS Technologist at Mentor, presents the “Using High-level Synthesis to Bridge the Gap Between Deep Learning Frameworks and Custom Hardware Accelerators” tutorial at the May 2019 Embedded Vision Summit. Recent years have seen an explosion in machine learning/AI algorithms with a corresponding need to use custom hardware for best performance and power efficiency.

“Accelerate Adoption of AI at the Edge with Easy to Use, Low-power Programmable Solutions,” a Presentation from Lattice Semiconductor

Hussein Osman, Consumer Segment Manager at Lattice Semiconductor, presents the “Accelerate Adoption of AI at the Edge with Easy to Use, Low-power Programmable Solutions” tutorial at the May 2019 Embedded Vision Summit. In this talk, Osman shows why Lattice’s low-power FPGA devices, coupled with the sensAI software stack, are a compelling solution for implementation of […]

“MediaTek’s Approach for Edge Intelligence,” a Presentation from MediaTek

Bing Yu, Senior Technical Manager and Architect at MediaTek, presents the “MediaTek’s Approach for Edge Intelligence” tutorial at the May 2019 Embedded Vision Summit. MediaTek has incorporated an AI processing unit (APU) alongside the traditional CPU and GPU in its SoC designs for the next wave of smart client devices (smartphones, cameras, appliances, cars, etc.).

“Memory-centric Hardware Acceleration for Machine Intelligence,” a Presentation from Crossbar

Sylvain Dubois, Vice President of Business Development and Marketing at Crossbar, presents the “Memory-centric Hardware Acceleration for Machine Intelligence” tutorial at the May 2019 Embedded Vision Summit. Even the most advanced AI chip architectures suffer from performance and energy efficiency limitations caused by the memory bottleneck between computing cores and data. Most state-of-the-art CPUs, GPUs, […]
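
One common way to reason about that bottleneck is a roofline-style estimate: attainable throughput is the lesser of peak compute and memory bandwidth times arithmetic intensity (operations per byte moved). The Python sketch below walks through that calculation; the bandwidth, peak-TOPS, and layer figures are hypothetical, not taken from the talk.

```python
# Roofline-style sketch: is a layer limited by compute or by memory bandwidth?
# The hardware figures are hypothetical, chosen only to illustrate the idea.

PEAK_TOPS = 10.0    # peak compute of the accelerator, TOPS
DRAM_GBPS = 25.0    # external memory bandwidth, GB/s

def attainable_tops(ops, bytes_moved):
    """Attainable throughput = min(peak compute, bandwidth * arithmetic intensity)."""
    intensity = ops / bytes_moved                              # ops per DRAM byte
    return min(PEAK_TOPS, DRAM_GBPS * 1e9 * intensity / 1e12), intensity

# Two made-up layers: one that streams 8-bit weights from DRAM with little reuse,
# and one that reuses weights heavily from on-chip memory.
layers = [
    ("fully connected, weights streamed", 2e9, 1e9),
    ("convolution, heavy weight reuse",   2e9, 4e6),
]
for name, ops, nbytes in layers:
    tops, ai = attainable_tops(ops, nbytes)
    bound = "memory-bound" if tops < PEAK_TOPS else "compute-bound"
    print(f"{name}: {ai:.0f} ops/byte -> {tops:.2f} TOPS attainable ({bound})")
```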

“Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications,” a Presentation from Qualcomm

Robert Lay, Computer Vision and Camera Product Manager at Qualcomm, presents the “Snapdragon Hybrid Computer Vision/Deep Learning Architecture for Imaging Applications” tutorial at the May 2019 Embedded Vision Summit. Advances in imaging quality and features are accelerating, thanks to hybrid approaches that combine classical computer vision and deep learning algorithms and that take advantage of […]
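
As a loose illustration of such a hybrid pipeline (cheap classical computer vision to find candidate regions, with a deep network run only on those regions), here is a short Python/OpenCV sketch. The background-subtraction stage, the thresholds, and the classify_roi placeholder are assumptions made for this example, not Qualcomm’s actual architecture.

```python
# Hybrid pipeline sketch: classical CV finds candidate regions cheaply,
# and a (placeholder) deep network is run only on those regions.
# Function and parameter choices here are hypothetical, for illustration only.
import cv2
import numpy as np

def classify_roi(roi_bgr):
    """Placeholder for a DNN classifier (e.g. an accelerator-offloaded model)."""
    return "object", 0.9  # hypothetical label and confidence

def detect_and_classify(frame_bgr, background_bgr, min_area=500):
    # --- Classical CV stage: cheap background subtraction + contour extraction ---
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    bg = cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(cv2.GaussianBlur(gray, (5, 5), 0),
                       cv2.GaussianBlur(bg, (5, 5), 0))
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]

    # --- Deep learning stage: run the expensive model only on candidate regions ---
    results = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        x, y, w, h = cv2.boundingRect(c)
        label, score = classify_roi(frame_bgr[y:y + h, x:x + w])
        results.append(((x, y, w, h), label, score))
    return results

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    frame[200:300, 200:320] = 255          # synthetic "moving object"
    background = np.zeros_like(frame)
    print(detect_and_classify(frame, background))
```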

“Dynamically Reconfigurable Processor Technology for Vision Processing,” a Presentation from Renesas

Yoshio Sato, Senior Product Marketing Manager in the Industrial Business Unit at Renesas, presents the “Dynamically Reconfigurable Processor Technology for Vision Processing” tutorial at the May 2019 Embedded Vision Summit. The Dynamically Reconfigurable Processing (DRP) block in the Arm Cortex-A9 based RZ/A2M MPU accelerates image processing algorithms with spatially pipelined, time-multiplexed, reconfigurable-hardware compute resources.
