Algorithms

2024 Embedded Vision Summit Showcase: Expert Panel Discussion

Check out the expert panel discussion “Multimodal LLMs at the Edge: Are We There Yet?” at the upcoming 2024 Embedded Vision Summit, taking place May 21-23 in Santa Clara, California! The Summit is the premier conference for innovators incorporating computer vision and edge AI in products. It attracts a global audience of technology professionals from […]

2024 Embedded Vision Summit Showcase: Qualcomm General Session Presentation

Check out the general session presentation “What’s Next in On-Device Generative AI” at the upcoming 2024 Embedded Vision Summit, taking place May 21-23 in Santa Clara, California! The generative AI era has begun! Large multimodal models are bringing the power of language understanding to machine perception, and transformer models are expanding to allow machines to

2024 Embedded Vision Summit Showcase: Network Optix General Session Presentation

Check out the general session presentation “Scaling Vision-Based Edge AI Solutions: From Prototype to Global Deployment” at the upcoming 2024 Embedded Vision Summit, taking place May 21-23 in Santa Clara, California! The Embedded Vision Summit brings together innovators in silicon, devices, software and applications and empowers them to bring computer vision and perceptual AI into

Navigating the Future: How Avnet is Addressing Challenges in AMR Design

This blog post was originally published at Avnet’s website. It is reprinted here with the permission of Avnet. Autonomous mobile robots (AMRs) are revolutionizing industries such as manufacturing, logistics, agriculture, and healthcare by performing tasks that are too dangerous, tedious, or costly for humans. AMRs can navigate complex and dynamic environments, communicate with other devices

Embedded Vision Summit® Announces Full Conference Program for Edge AI and Computer Vision Innovators, May 21-23 in Santa Clara, California

The premier event for product creators incorporating computer vision and edge AI in products and applications SANTA CLARA, Calif., April 29, 2024 /PR Newswire/ — The Edge AI and Vision Alliance, a worldwide industry partnership, today announced the full program for the 2024 Embedded Vision Summit, taking place May 21-23 at the Santa Clara Convention

Unleashing the Potential for Assisted and Automated Driving Experiences Through Scalability

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Working within an ecosystem of innovators and suppliers is paramount to addressing the challenge of building a scalable ADAS solution. While the recent sentiment around fully autonomous vehicles is not overly positive, more and more vehicles on

The Building Blocks of AI: Decoding the Role and Significance of Foundation Models

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. These neural networks, trained on large volumes of data, power the applications driving the generative AI revolution. Editor’s note: This post is part of the AI Decoded series, which demystifies AI by making the technology more accessible,

Oriented FAST and Rotated BRIEF (ORB) Feature Detection Speeds Up Visual SLAM

This blog post was originally published at Ceva’s website. It is reprinted here with the permission of Ceva. In the realm of smart edge devices, signal processing and AI inferencing are intertwined. Sensing can require intense computation to filter out the most significant data for inferencing. Algorithms for simultaneous localization and mapping (SLAM), a type
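
For readers unfamiliar with ORB, the sketch below is a minimal Python illustration using OpenCV (not Ceva’s implementation) of the technique the post’s title refers to: detecting oriented FAST keypoints and computing rotated BRIEF descriptors on a single frame. The filename and feature count are placeholder assumptions.

import cv2

# Load one grayscale frame (placeholder filename; any image works).
img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
assert img is not None, "frame.png not found"

# ORB = oriented FAST keypoint detection + rotated BRIEF binary descriptors.
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(img, None)

print(f"Detected {len(keypoints)} keypoints; descriptor array shape: {descriptors.shape}")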

Achieving a Zero-incident Vision In Your Warehouse with Dragonfly

This blog post was originally published by Onit. It is reprinted here with the permission of Onit. At Onit, we’re revolutionizing the efficiency and safety standards in warehouse environments through edge AI and computer vision. Leveraging our state-of-the-art Dragonfly and RTLS (real-time locating system) applications, we address the complex challenges inherent in chaotic and labor-intensive
