Development Tools for Embedded Vision
ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS
The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, while adding specialized vision libraries and, possibly, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used for other types of video system design.
Both general-purpose and vendor-specific tools
Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals with unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to ensure that the CPU components are broadly supported.
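In practice, this means that application code written for the common instruction set can be shared across vendors, while peripheral access is isolated behind a thin abstraction layer supplied by each vendor's SDK. A minimal sketch of that pattern follows; all class names, bus types, and configuration fields here are hypothetical, for illustration only, and do not reflect any specific vendor's API.

```python
# Sketch of a hardware-abstraction pattern: shared application code on a
# common instruction set, with vendor-specific peripheral access isolated
# behind one interface. All names are illustrative, not a real vendor SDK.
from abc import ABC, abstractmethod

class CameraInterface(ABC):
    """Common API the application codes against, regardless of vendor."""
    @abstractmethod
    def configure(self, width: int, height: int) -> dict: ...

class VendorACamera(CameraInterface):
    # Hypothetical vendor A: exposes its sensor over a CSI-2 interface.
    def configure(self, width, height):
        return {"vendor": "A", "mode": f"{width}x{height}", "bus": "CSI-2"}

class VendorBCamera(CameraInterface):
    # Hypothetical vendor B: same logical operation, different peripheral.
    def configure(self, width, height):
        return {"vendor": "B", "mode": f"{width}x{height}", "bus": "parallel"}

def init_pipeline(camera: CameraInterface) -> dict:
    # Application code is identical for both vendors; only the object
    # passed in changes when the design moves to a different SoC.
    return camera.configure(640, 480)
```

With this structure, porting the application to a new vendor's device means implementing one new `CameraInterface` subclass rather than touching the pipeline code.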
Heterogeneous software development in an integrated development environment
Since vision applications often require a mix of processing architectures, the development tools become more complicated and must handle multiple instruction sets and additional system debugging challenges. Most vendors provide a suite of tools that integrate development tasks into a single interface for the developer, simplifying software development and testing.
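This heterogeneous model often surfaces in software as run-time backend selection: the same vision kernel is dispatched to a GPU, DSP, or the host CPU depending on what the toolchain and board actually provide. The following is a simplified sketch of that dispatch pattern, assuming a simulated availability table rather than any specific vendor runtime:

```python
# Sketch of accelerator dispatch in a heterogeneous vision pipeline:
# try the fastest available backend first, fall back to the CPU.
# Backend availability would normally be reported by the vendor SDK;
# here it is simulated with a simple dict (assumption for illustration).

def grayscale_cpu(pixels):
    # Reference CPU implementation: ITU-R BT.601 luma from RGB triples.
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in pixels]

BACKENDS = {
    "gpu": None,           # None = accelerator not present on this board
    "dsp": None,
    "cpu": grayscale_cpu,  # CPU path always exists as the fallback
}

def run_grayscale(pixels):
    # Prefer accelerators in order; the CPU is the guaranteed fallback.
    for name in ("gpu", "dsp", "cpu"):
        kernel = BACKENDS.get(name)
        if kernel is not None:
            return name, kernel(pixels)
    raise RuntimeError("no backend available")
```

An integrated development environment then layers debugging on top of this: the developer needs to see which backend a kernel actually ran on, which is why vendor tool suites expose per-device tracing alongside the usual CPU debugger.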
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/FhxNz402n2M-300x169.jpg)
BrainChip Demonstration of Neuromorphic AI in a Compact Form Factor
Todd Vierra, Vice President of Customer Engagement at BrainChip, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Vierra demonstrates inference on the edge using visual wake word and Yolo models using the Akida Edge AI Box to detect and identify people. The Akida Edge AI
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/mNkL_JADTm4-300x169.jpg)
Axelera AI Demonstration of the High Performance Achievable with the Metis AIPU
Bram Verhoef, Co-founder of Axelera AI, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Verhoef demonstrates the high computer vision performance that can be achieved with his company’s Metis AIPU. A single Metis AIPU runs at over 200 TOPS and can run computer vision inference
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/small-car-bd-300x200.jpg)
Navigating the LiDAR Revolution: Trends and Innovations Ahead
This market research report was originally published at the Yole Group’s website. It is reprinted here with the permission of the Yole Group. There are today two distinct LiDAR markets: China and the rest of the world. In China, approximately 128 car models equipped with LiDAR are expected to be released by Chinese OEMs in
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/k5zy-nv3RAk-300x169.jpg)
Axelera AI Demonstration of Computer Vision Applications Based on the Metis AIPU
Bram Verhoef, Co-founder of Axelera AI, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Verhoef gives a product line overview and demonstrates the types of computer vision applications that Axelera AI’s Metis AIPU accelerates. Axelera AI offers PCIe and M.2 add-in cards, along with full
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/N7j4_5DYxh0-300x169.jpg)
Axelera AI Demonstration of Fast and Efficient Workplace Safety with the Metis AIPU
Bram Verhoef, Co-founder of Axelera AI, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Verhoef demonstrates how his company’s Metis AIPU can accelerate computer vision applications. Axelera AI, together with its partner FogSphere, has developed a computer vision system that detects if people are wearing
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/qualcomm-ai-hub-300x169.jpg)
AI Developer Workflows, Simplified: Empowering Developers with the Qualcomm AI Hub
This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. With over 100 pre-optimized AI and generative AI models, the Qualcomm AI Hub is a developer’s gateway to superior on-device AI performance Generative AI has been evolving to run on device, in addition to the cloud. It
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/668e7a62f88fefdcbbe79fe6_red-silo-ai-main-300x169.jpg)
AMD to Acquire Silo AI to Expand Enterprise AI Solutions Globally
Europe’s largest private AI lab to accelerate the development and deployment of AMD-powered AI models and software solutions Enhances open-source AI software capabilities for efficient training and inference on AMD compute platforms SANTA CLARA, Calif. — July 10, 2024 — AMD (NASDAQ: AMD) today announced the signing of a definitive agreement to acquire Silo AI,
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/V1ZK-3D5o_M-300x169.jpg)
Network Optix Demonstration of Instantly Deployable Face Detection on OrinAGX with the Nx AI Manager
Jason Lee, Nx Business Development Manager at Network Optix, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Lee demonstrates an instantly deployable face detection AI model on NVIDIA’s OrinAGX using the Nx AI Manager, a plugin specifically designed to add AI functionality on the edge