TECHNOLOGIES

Analog Devices Demonstration of the MAX78000 AI Microcontroller Performing Action Recognition

Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates the MAX78000 AI microcontroller performing action recognition using a temporal convolutional network (TCN). Using a TCN-based model, the MAX78000 accurately recognizes a […]
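The core building block of a TCN is a causal, dilated 1-D convolution. The sketch below is illustrative only (plain Python, hypothetical weights), not ADI's model, but it shows the property that makes TCNs attractive for action recognition: stacking layers with growing dilation widens the receptive field exponentially without looking at future samples.

```python
# Minimal sketch of a TCN's core block: a causal, dilated 1-D convolution.
# Weights and signal below are illustrative placeholders.

def causal_dilated_conv1d(x, weights, dilation=1):
    """Convolve sequence x with `weights`, looking only at past samples."""
    k = len(weights)
    pad = (k - 1) * dilation              # left-pad so no future leakage
    padded = [0.0] * pad + list(x)
    out = []
    for t in range(len(x)):
        # out[t] = w0*x[t] + w1*x[t - dilation] + ...
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w * padded[t + pad - i * dilation]
        out.append(acc)
    return out

# Dilations of 1, 2, 4, ... double the temporal context per layer,
# letting a shallow network model long action sequences.
signal = [0, 0, 1, 0, 0, 0, 0, 0]         # a single-impulse test input
layer1 = causal_dilated_conv1d(signal, [0.5, 0.5], dilation=1)
layer2 = causal_dilated_conv1d(layer1, [0.5, 0.5], dilation=2)
```

Note that the impulse response spreads only forward in time, which is what keeps the network usable for streaming inference on a microcontroller.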


Free Webinar Explores Delivering High Performance and Low Power Edge AI Applications

On October 17, 2024 at 9 am PT (noon ET), SiMa.ai’s Carlos Davila, Director of Software Product Management, and Vidhyananth Venkatasamy, Principal Solutions Architect, will present the free hour-long webinar “Delivering High Performance, Low Power Complete Edge-AI Applications with the SiMa.ai One Platform MLSoC and Toolset,” organized by the Edge AI and Vision Alliance. Here’s […]


Exploring the Present and Future of AI: Insights from Qualcomm’s AI Analyst and Media Workshop

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. A day with Qualcomm revealed the innovations empowering new and exciting AI experiences running within devices. In the rapidly evolving world of artificial intelligence (AI), staying ahead of the curve is crucial. As a leader in on-device […]


Analog Devices Demonstration of the MAX78000 Microcontroller Enabling Edge AI in a Robotic Arm

Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates visual servoing in a robotic arm enabled by the MAX78000 AI microcontroller. The MAX78000 is an Arm Cortex-M4F microcontroller with a hardware-based convolutional […]
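Visual servoing closes a control loop around an image measurement: the error between where a tracked feature appears and where it should appear drives the arm's motion. The toy sketch below (a simple proportional controller with illustrative gains, not ADI's implementation) shows the shape of that loop.

```python
# Hedged sketch of image-based visual servoing: a proportional controller
# in image space. Gains, coordinates, and the loop are illustrative only.

def servo_step(target_px, current_px, gain=0.5):
    """Return a correction proportional to the pixel error (target - current)."""
    return tuple(gain * (t - c) for t, c in zip(target_px, current_px))

# Toy loop: the tracked feature converges toward the target pixel as
# each correction is applied (in a real arm, corrections become joint
# velocities via the robot's inverse kinematics).
target = (320.0, 240.0)
feature = (100.0, 400.0)
for _ in range(20):
    dx, dy = servo_step(target, feature)
    feature = (feature[0] + dx, feature[1] + dy)
```

In a real system the "current" pixel comes from a neural network detector each frame, which is exactly the workload a hardware CNN accelerator offloads.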


Inuitive Demonstration of an RGBD Sensor Using a Synopsys ARC-based NU4100 AI and Vision Processor

Dor Zepeniuk, CTO at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Zepeniuk demonstrates his company’s latest RGBD sensor, which integrates an RGB color sensor and a depth sensor into a single device. The Inuitive NU4100 is an all-in-one vision processor that supports simultaneous AI-powered […]


Accelerating Transformer Neural Networks for Autonomous Driving

This blog post was originally published at Ambarella’s website. It is reprinted here with the permission of Ambarella. Autonomous driving (AD) and advanced driver assistance system (ADAS) providers are deploying more and more AI neural networks (NNs) to offer a human-like driving experience. Several of the leading AD innovators have either deployed, or have a roadmap […]
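The operation at the heart of the transformer networks discussed here is scaled dot-product attention. The sketch below uses plain Python lists for clarity; it is a generic illustration of the math, not Ambarella's accelerated implementation, which runs this on dedicated NN hardware.

```python
import math

# Generic sketch of scaled dot-product attention:
#   out[i] = sum_j softmax(q_i . k_j / sqrt(d))_j * v_j

def softmax(xs):
    m = max(xs)                             # subtract max for stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    """Each query attends over all keys and mixes the matching values."""
    d = len(queries[0])
    out = []
    for q in queries:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, values))
                    for j in range(len(values[0]))])
    return out
```

The all-pairs query-key products are what make attention memory-bandwidth-hungry at automotive resolutions, and hence the target of hardware acceleration.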


Sensor Cortek Demonstration of SmarterRoad Running on Synopsys ARC NPX6 NPU IP

Fahed Hassanhat, head of engineering at Sensor Cortek, demonstrates the company’s latest edge AI and vision technologies and products in Synopsys’ booth at the 2024 Embedded Vision Summit. Specifically, Hassanhat demonstrates his company’s latest ADAS neural network (NN) model, SmarterRoad, combining lane detection and open space detection. SmarterRoad is a lightweight integrated convolutional network that […]


STMicroelectronics Demonstration of RGB and Depth Fusion with a 0.5 Mpixel Indirect ToF Sensor

Phillippe Legeard, Imaging Applications and Customer Support Engineer at STMicroelectronics, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Legeard demonstrates his company’s first 0.5 Mpixel indirect 3D time-of-flight (ToF) sensor, the VD55H1. The low-noise, low-power VD55H1 die is manufactured on advanced backside-illuminated, stacked […]
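The geometric step underlying RGB-depth fusion is back-projecting each depth pixel to a 3-D point, which can then be re-projected into the color camera's frame. The sketch below shows the pinhole back-projection; the intrinsics are illustrative placeholders, not VD55H1 calibration data.

```python
# Sketch of the geometry behind RGB-depth fusion: back-project a ToF
# depth pixel to a 3-D camera-frame point via a pinhole model.
# fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
# All values below are hypothetical, not real sensor calibration.

def depth_pixel_to_point(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth z (meters) to camera coords."""
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)

# A pixel at the principal point maps straight down the optical axis:
pt = depth_pixel_to_point(u=320, v=240, z=1.5, fx=500.0, fy=500.0,
                          cx=320, cy=240)
```

Fusing with RGB then applies the depth-to-color extrinsic transform to each point and projects it through the color camera's intrinsics, giving a depth value per color pixel.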


Annual Computer Vision and Perceptual AI Developer Survey Now Open

Every year we survey developers to understand their requirements and pain points around computer vision and perceptual AI. This survey is now in its 11th year because of people like you, who contribute their real-world insights. We share the results from the survey at Alliance events and in white papers and presentations made available throughout […]


Build VLM-powered Visual AI Agents Using NVIDIA NIM and NVIDIA VIA Microservices

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Traditional video analytics applications and their development workflow are typically built on fixed-function, limited models that are designed to detect and identify only a select set of predefined objects. With generative AI, NVIDIA NIM microservices, and foundation […]



Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411