Robotics

Using Generative AI to Enable Robots to Reason and Act with ReMEmbR

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Vision-language models (VLMs) combine the powerful language understanding of foundational LLMs with the vision capabilities of vision transformers (ViTs) by projecting text and images into the same embedding space. They can take unstructured multimodal data, reason over […]
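The key idea above, projecting text and images into one embedding space so they can be compared directly, can be illustrated with a toy sketch. This is not NVIDIA's ReMEmbR code; the embeddings below are random stand-ins, and a real VLM would produce them with learned text and image encoders.

```python
import numpy as np

# Toy illustration of a shared text/image embedding space: once both
# modalities live in the same space, cross-modal retrieval reduces to a
# dot product of L2-normalized vectors (cosine similarity).
rng = np.random.default_rng(0)

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Three stand-in "image" embeddings in an 8-dimensional space
image_embeddings = normalize(rng.standard_normal((3, 8)))

# A stand-in "text query" embedding close to image 1 (plus small noise)
text_embedding = normalize(image_embeddings[1] + 0.1 * rng.standard_normal(8))

# Cosine similarity between the text query and every image embedding
scores = image_embeddings @ text_embedding
best = int(np.argmax(scores))  # index of the best-matching image
```

With real encoders, `scores` would rank stored observations against a natural-language query, which is the retrieval step a memory-augmented robot agent builds on.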

“Building Meaningful Products Using Complex Sensor Systems,” a Presentation from DEKA Research & Development

Dirk van der Merwe, Autonomous Robotics Lead at DEKA Research & Development, presents the “Building Meaningful Products Using Complex Sensor Systems” tutorial at the May 2024 Embedded Vision Summit. Most complex sensor systems begin with a simple goal: ensuring safety and efficiency. Whether it’s avoiding collisions between vehicles or predicting future…

“Better Farming through Embedded AI,” a Presentation from Blue River Technology

Chris Padwick, Director of Computer Vision Machine Learning at Blue River Technology, presents the “Better Farming through Embedded AI” tutorial at the May 2024 Embedded Vision Summit. Blue River Technology, a subsidiary of John Deere, uses computer vision and deep learning to build intelligent machines that help farmers grow more…

Inuitive Demonstration of the M4.51 Depth and AI Sensor Module Based on the NU4100 Vision Processor

Shay Harel, field application engineer at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Harel demonstrates the capabilities of his company’s M4.51 sensor module using a simple Python script that leverages Inuitive’s API for real-time object detection. The M4.51 sensor module, based on the…

Analog Devices Demonstration of the MAX78000 Microcontroller Enabling Edge AI in a Robotic Arm

Navdeep Dhanjal, Executive Business and Product Manager for AI microcontrollers at Analog Devices, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Dhanjal demonstrates visual servoing in a robotic arm enabled by the MAX78000 AI microcontroller. The MAX78000 is an Arm Cortex-M4F microcontroller with a hardware-based convolutional…
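Visual servoing closes a control loop around what the camera sees: the detected object's pixel position feeds a controller that steers the arm until the object sits where it should in the image. The sketch below is a hypothetical, minimal proportional controller, not Analog Devices' demo code; the image size and gain are made-up illustration values.

```python
# Hypothetical image-based visual servoing step: a proportional controller
# drives the detected object's pixel error toward the image center.
# cx, cy is the center of an assumed 320x240 image; gain is illustrative.

def servo_step(px, py, cx=160, cy=120, gain=0.01):
    """Return (pan_rate, tilt_rate) commands from a pixel detection (px, py)."""
    ex, ey = px - cx, py - cy          # pixel error from the image center
    return -gain * ex, -gain * ey      # simple P control toward the center

# Object detected 40 px right of center: command pans left, no tilt needed
pan, tilt = servo_step(200, 120)
```

On a microcontroller-class device, the detection itself would come from the on-chip CNN accelerator, and the loop would run at the camera's frame rate.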

Inuitive Demonstration of an RGBD Sensor Using a Synopsys ARC-based NU4100 AI and Vision Processor

Dor Zepeniuk, CTO at Inuitive, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Zepeniuk demonstrates his company’s latest RGBD sensor, which integrates an RGB color sensor and a depth sensor in a single device. The Inuitive NU4100 is an all-in-one vision processor that supports simultaneous AI-powered…

Why Ethernet Cameras are Increasingly Used in Medical and Life Sciences Applications

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. In this blog, we will uncover the current medical and life sciences use cases in which Ethernet cameras are integral. The pace of technological transformations in medicine and life sciences is rapid. Imaging technologies used…

Upcoming Webinar Explores the Role of GMSL2 Cameras in Autonomous Systems

On Thursday, July 11, 2024 at 10:00 am PT (1:00 pm ET), Alliance Member companies e-con Systems and Analog Devices will co-deliver the free webinar “The Role of GMSL2 Cameras in Autonomous Systems – How It Enables Functional Safety.” From the event page: Get expert insights on GMSL2 technology, its functional safety benefits, applications in…

Basler AG Acquires Stake in Roboception GmbH

Computer vision expert Basler and 3D vision specialist Roboception strengthen their existing cooperation through the acquisition of a 25.1% stake, and plan to expand the 3D solutions business for factory automation, robotics and logistics. Ahrensburg, June 12, 2024 – Basler AG, a leading supplier of image processing components for computer vision applications, has acquired 25.1% of the…

Using D3 mmWave Radar Sensors with Robot Operating System (ROS)

This blog post was originally published by D3. It is reprinted here with the permission of D3. (Note: This article was written by a human being without the aid of AI; however, at D3 we use AI heavily for robotics and other applications!) Introduction Robot Operating System (ROS) is a common open source starting point for…
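The reason ROS is such a common starting point is its topic model: a radar driver node publishes typed messages on named topics, and any number of downstream nodes subscribe without knowing about each other. The sketch below mimics that pattern in plain Python; it is not the actual `rospy`/`rclpy` API, and the topic name and message fields are made-up examples.

```python
from collections import defaultdict

# Plain-Python stand-in for ROS's publish/subscribe topic model: publishers
# and subscribers are decoupled and meet only at a named topic.
class Bus:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.subs[topic]:
            cb(msg)

bus = Bus()
detections = []

# A downstream node (e.g. an obstacle tracker) subscribes to radar data
bus.subscribe("/radar/points", detections.append)

# A radar driver node publishes a detection (hypothetical message fields)
bus.publish("/radar/points", {"range_m": 4.2, "azimuth_deg": -10.0})
```

In real ROS the same decoupling lets you swap the radar driver, record topics to a bag file, or visualize them in RViz without touching the consumers.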
