Edge AI and Vision Insights: January 15, 2025

LETTER FROM THE EDITOR

Dear Colleague,

This is your last chance to submit a product for consideration in the 2025 Product of the Year Awards from the Edge AI and Vision Alliance. The deadline is this Friday, January 17, so act now to ensure you don’t miss this once-a-year opportunity.

Award winners receive:

  • Year-round promotion: Visibility on the Edge AI and Vision Alliance website and in our newsletters.
  • Marketing assets: Receive a package of digital and physical assets, including an award ceremony video, a badge for your website, a trophy and social media materials.
  • Free Summit passes: Get two free passes to the May 2025 Embedded Vision Summit.
  • Exclusive reception: Gain access to the invitation-only Executive Networking Reception at the Summit—a great opportunity to connect directly with senior leaders!

Learn more and enter today!

Brian Dipert
Editor-In-Chief, Edge AI and Vision Alliance

RADAR FUNDAMENTALS AND TRENDS

Introduction to Modern Radar for Machine Perception

In this presentation, Robert Laganière, Professor at the University of Ottawa and CEO of Sensor Cortek, provides an introduction to radar for machine perception. Radar is a proven technology with a long history of successful development, and it plays an increasingly important role in the deployment of robust perception systems. Laganière explains how radar sensors work—in particular, how radio waves are used to accomplish detection and ranging. He introduces key concepts behind this technology, including the Doppler effect, time of flight and frequency-modulated continuous waves (FMCW). Finally, he explores the main advantages and disadvantages of radar for machine perception.
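The time-of-flight, FMCW and Doppler ideas mentioned above can be sketched numerically. The snippet below uses the standard textbook relations for a linear FMCW chirp; it is an illustrative sketch, not material from the talk, and all parameter values (77 GHz carrier, 4 GHz chirp bandwidth, 40 µs chirp) are hypothetical examples.

```python
# Illustrative sketch: recovering target range and radial velocity from an
# FMCW radar's beat frequency and Doppler shift. Parameter values are
# hypothetical, chosen to resemble a 77 GHz automotive radar.

C = 3.0e8  # speed of light, m/s

def range_from_beat(f_beat_hz, chirp_bandwidth_hz, chirp_duration_s):
    """Range from the beat frequency of a linear FMCW chirp:
    R = c * f_beat * T_chirp / (2 * B)."""
    return C * f_beat_hz * chirp_duration_s / (2.0 * chirp_bandwidth_hz)

def velocity_from_doppler(f_doppler_hz, carrier_freq_hz):
    """Radial velocity from the Doppler shift: v = c * f_d / (2 * f_c)."""
    return C * f_doppler_hz / (2.0 * carrier_freq_hz)

# Example: 4 GHz bandwidth, 40 µs chirp, 20 MHz measured beat frequency.
r = range_from_beat(f_beat_hz=2.0e7, chirp_bandwidth_hz=4.0e9,
                    chirp_duration_s=40e-6)
# Example: 5.13 kHz Doppler shift on a 77 GHz carrier.
v = velocity_from_doppler(f_doppler_hz=5130.0, carrier_freq_hz=77e9)
print(f"range ≈ {r:.1f} m, radial velocity ≈ {v:.1f} m/s")
# → range ≈ 30.0 m, radial velocity ≈ 10.0 m/s
```

The same relations explain the trade-offs Laganière discusses: wider chirp bandwidth improves range resolution, while the carrier frequency sets how a given Doppler shift maps to velocity.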

Future Radar Technologies and Applications

Radar has value in a wide range of industries that are embracing automation, from delivery drones to agriculture, each requiring different performance attributes. Autonomous vehicles are among the most demanding applications for radar, requiring long range and high precision. In this talk, James Jeffs, Senior Technology Analyst at IDTechEx, explores the emerging demands on radar and discusses radar technologies being developed to meet these demands. He introduces four-dimensional detection, high resolution, long range and high dynamic range approaches, and explores their impact on cost, package size and power consumption. Finally, he highlights the performance improvements that arise from more powerful radars and the potential applications that they enable.

SYSTEM DEVELOPMENT TRADE-OFFS

Making Alexa More Ambiently Intelligent with Computer Vision

This presentation takes a behind-the-scenes look at the development and launch of adaptive content on Alexa Devices, which uses computer vision to adjust the on-screen display based on how close you are to the device. When you’re close to the device, the content on the home screen changes to provide more detail. When you’re farther away, it changes to be more easily viewed from a distance. Michael Giannangeli, Senior Manager of Product Management for Alexa Devices at Amazon, goes into detail about the “working backwards” process used for product development at Amazon. He discusses the computer vision algorithms his company used; key trade-offs between latency, accuracy and privacy; and how Amazon came up with the right state logic and UI patterns to deliver a delightful customer experience.

Building Meaningful Products Using Complex Sensor Systems

Most complex sensor systems begin with a simple goal—ensuring safety and efficiency. Whether it’s avoiding collisions between vehicles or predicting future actions, the essence remains the same: can we control, predict and mitigate risks effectively? While stating this goal is straightforward, achieving it is usually quite challenging. Often, challenges stem from the complexity of the system, the inherent limitations of sensors and algorithms, and mismatched timescales between sensors and algorithms. In this talk, Dirk van der Merwe, Autonomous Robotics Lead at DEKA Research & Development, explores the key challenges in engineering complex sensor-based systems and outlines effective strategies for overcoming them, based on decades of real-world experience. He focuses on four critical areas: system engineering, team organization, data management and simulation. In each of these areas, he shares principles and techniques to help you navigate the challenges of complex sensor systems and deliver impactful solutions to real-world problems.

UPCOMING INDUSTRY EVENTS

Sensing in ADAS and Autonomous Vehicles: What’s Winning, and Why? – TechInsights Webinar: January 28, 2025, 9:00 am PT

Embedded Vision Summit: May 20-22, 2025, Santa Clara, California

More Events

FEATURED NEWS

EDGE AI AND VISION PRODUCT OF THE YEAR WINNER SHOWCASE

Tenyks Data-Centric CoPilot for Vision (Best Edge AI Developer Tool)

Tenyks’ Data-Centric CoPilot for Vision is the 2024 Edge AI and Vision Product of the Year Award Winner in the Edge AI Developer Tools category. The Data-Centric CoPilot for Vision platform helps computer vision teams develop production-ready models 8x faster. The platform enables machine learning (ML) teams to mine edge cases, failure patterns and annotation quality issues for more accurate, capable and robust models. In addition, it helps ML teams intelligently sub-sample datasets to increase model quality and cost efficiency. The platform supports the use of multimodal prompts to quickly compare model performance on customized training scenarios, such as pedestrians jaywalking at dusk, in order to discover blind spots and enhance reliability. ML teams can also leverage powerful search functionality to conduct data curation in hours vs. weeks. One notable feature of the platform is its multimodal Embeddings-as-a-Service (EaaS) to expertly organize, curate and manage datasets. Another key platform feature is the streamlined cloud integration, supporting a multitude of cloud storage services and facilitating effortless access and management of large-scale datasets.

Please see here for more information on Tenyks’ Data-Centric CoPilot for Vision. The Edge AI and Vision Product of the Year Awards celebrate the innovation of the industry’s leading companies that are developing and enabling the next generation of edge AI and computer vision products. Winning a Product of the Year award recognizes a company’s leadership in edge AI and computer vision as evaluated by independent industry experts.


Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411