Development Tools for Embedded Vision
ENCOMPASSING MOST OF THE STANDARD ARSENAL USED FOR DEVELOPING REAL-TIME EMBEDDED PROCESSOR SYSTEMS
The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, supplemented by specialized vision libraries and, in some cases, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used for other types of video system design.
Both general-purpose and vendor-specific tools
Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals that have unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires customized versions of standard development tools. Most CPU vendors develop their own optimized software toolchains, while also working with third-party tool suppliers to ensure that the CPU components are broadly supported.
Heterogeneous software development in an integrated development environment
Since vision applications often require a mix of processing architectures, the development tools become more complicated and must handle multiple instruction sets and additional system debugging challenges. Most vendors provide a tool suite that integrates these development tasks into a single interface, simplifying software development and testing.
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/j-8opF_VpwU-300x169.jpg)
AI-powered Productivity: Windows on Snapdragon X Elite Welcomes Game-changing Apps
This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. A new generation of AI apps for Windows powered by the 45 TOPS NPU in Snapdragon X Elite. Get ready to turbo-charge your productivity and creativity this summer, when Windows gets enhanced by artificial intelligence (AI). AI-powered
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/SikkaA_SpeakerCard-300x158.jpg)
“Meeting the Critical Needs of Accuracy, Performance and Adaptability in Embedded Neural Networks,” a Presentation from Quadric
Aman Sikka, Chief Architect at Quadric, presents the “Meeting the Critical Needs of Accuracy, Performance and Adaptability in Embedded Neural Networks” tutorial at the May 2024 Embedded Vision Summit. In this presentation, Sikka explores the challenges of accuracy and performance when implementing quantized machine learning inference algorithms on embedded systems.
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/N4640D-Jul-3-2024-BrightSense-global-shutter-sensors-and-ecosystem_IMAGE-300x169.jpg)
STMicroelectronics Reveals ST BrightSense Image Sensor Ecosystem for Advanced Camera Performance Everywhere
Enables quicker and smarter designs of compact, power-efficient products for factory automation, robotics, AR/VR, and medical applications. Geneva, Switzerland, July 3, 2024 – STMicroelectronics has introduced a set of plug-and-play hardware kits, evaluation camera modules and software that ease development with its ST BrightSense global-shutter image sensors. The ecosystem lets developers of mass-market industrial and
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/DumutriuD_SpeakerCard-300x158.jpg)
“Build a Tiny Vision Application in Minutes with the Edge App SDK,” a Presentation from Midokura, a Sony Group Company
Dan Mihai Dumitriu, Chief Technology Officer at Midokura, a Sony Group company, presents the “Build a Tiny Vision Application in Minutes with the Edge App SDK” tutorial at the May 2024 Embedded Vision Summit. In the fast-paced world of embedded vision applications, moving rapidly from concept to deployment is crucial.
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/traffic-heat-map-300x169.png)
Generate Traffic Insights Using YOLOv8 and NVIDIA JetPack 6.0
This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Intelligent Transportation Systems (ITS) applications are becoming increasingly valuable and prevalent in modern urban environments. The benefits of using ITS applications include: Increasing traffic efficiency: By analyzing real-time traffic data, ITS can optimize traffic flow, reducing congestion and
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/07/Ceva_2_TinyML-300x195.png)
Can Ceva Ignite the Yet-to-explode TinyML Market?
The IoT market is yet to see an “explosive growth” in TinyML. Is that due to inadequate hardware, ever-shifting software or not enough ML skills in the embedded community? What’s at stake: TinyML in embedded systems can be implemented in many ways, often by leveraging beefed-up MCUs, DSPs, AI accelerators and Neural Processing Units (NPUs). The
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/06/ThimmanaikT_SpeakerCard-300x158.jpg)
“Intel’s Approach to Operationalizing AI in the Manufacturing Sector,” a Presentation from Intel
Tara Thimmanaik, AI Systems and Solutions Architect at Intel, presents the “Intel’s Approach to Operationalizing AI in the Manufacturing Sector” tutorial at the May 2024 Embedded Vision Summit. AI at the edge is powering a revolution in industrial IoT, from real-time processing and analytics that drive greater efficiency and learning…
![](https://www.edge-ai-vision.com/wp-content/uploads/2024/06/the-next-frontier-in-education-gen-AI-and-XR-300x165.jpg)
The Next Frontier in Education: How Generative AI and XR will Evolve the World of Learning in the Next Decade
This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. (Ai)Daptive XR empowers students through real-time personalization and collaborative learning Envisioning the future of education, and the art of learning overall, is nothing new. Over 120 years ago, French artist Jean-Marc Côté suggested how learning may look