
Development Tools for Embedded Vision


The software tools (compilers, debuggers, operating systems, libraries, etc.) encompass most of the standard arsenal used for developing real-time embedded processor systems, augmented with specialized vision libraries and, in some cases, vendor-specific development tools. On the hardware side, the requirements depend on the application space, since the designer may need equipment for monitoring and testing real-time video data. Most of these hardware development tools are already used in other types of video system design.

Both general-purpose and vendor-specific tools

Many vendors of vision devices use integrated CPUs based on the same instruction set (ARM, x86, etc.), allowing a common set of software development tools. However, even though the base instruction set is the same, each CPU vendor integrates a different set of peripherals with unique software interface requirements. In addition, most vendors accelerate the CPU with specialized computing devices (GPUs, DSPs, FPGAs, etc.). This extended CPU programming model requires a customized version of standard development tools. Most CPU vendors develop their own optimized software tool chain, while also working with third-party software tool suppliers to make sure that the CPU components are broadly supported.
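As a concrete illustration of this pattern (a sketch, not any particular vendor's toolchain), the example below uses OpenCV's transparent API: the same application source runs on the plain CPU or offloads to an OpenCL-capable accelerator when one is present. The input file name frame.png is a placeholder.

#include <opencv2/core.hpp>
#include <opencv2/core/ocl.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <iostream>

int main() {
    // Ask the runtime whether an OpenCL-capable device (GPU, DSP) exists,
    // and enable offload only if it does; otherwise all work stays on the CPU.
    const bool haveAccel = cv::ocl::haveOpenCL();
    cv::ocl::setUseOpenCL(haveAccel);
    std::cout << "OpenCL acceleration: " << (haveAccel ? "on" : "off") << "\n";

    cv::Mat frame = cv::imread("frame.png", cv::IMREAD_GRAYSCALE);  // placeholder input
    if (frame.empty()) return 1;

    // cv::UMat is the "transparent" part: the identical call dispatches to an
    // accelerated kernel when available, and to the CPU implementation otherwise.
    cv::UMat src = frame.getUMat(cv::ACCESS_READ), edges;
    cv::Canny(src, edges, 50, 150);

    cv::Mat result = edges.getMat(cv::ACCESS_READ).clone();
    std::cout << "Edge map: " << result.cols << "x" << result.rows << "\n";
    return 0;
}

The point is the pattern rather than the specific library: application code targets one portable interface, and the vendor's customized tool chain decides where each kernel actually executes.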

Heterogeneous software development in an integrated development environment

Since vision applications often require a mix of processing architectures, the development tools become more complicated: they must handle multiple instruction sets and the additional system debugging challenges that come with them. Most vendors provide a suite of tools that integrates these development tasks into a single interface, simplifying software development and testing.
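One common pattern behind such integrated suites is a thin hardware-abstraction layer: the application codes against a stable interface, and each target's tool chain (CPU compiler, DSP compiler, FPGA flow) supplies its own backend behind it. A minimal sketch of the idea follows; the names ConvolveBackend and CpuBackend are hypothetical, not from any vendor SDK.

#include <algorithm>
#include <cstdint>

// Hypothetical abstraction layer: callers see only this interface, so the
// same application builds against whichever backends a target provides.
struct ConvolveBackend {
    virtual ~ConvolveBackend() = default;
    virtual void conv3x3(const uint8_t* src, uint8_t* dst,
                         int w, int h, const int8_t kernel[9]) = 0;
};

// Portable reference backend built with the standard CPU tool chain. An
// accelerated backend would replace the inner loops with vendor intrinsics
// or an offloaded kernel, leaving application code unchanged.
struct CpuBackend final : ConvolveBackend {
    void conv3x3(const uint8_t* src, uint8_t* dst,
                 int w, int h, const int8_t kernel[9]) override {
        for (int y = 1; y < h - 1; ++y) {
            for (int x = 1; x < w - 1; ++x) {
                int acc = 0;
                for (int dy = -1; dy <= 1; ++dy)      // 3x3 neighborhood
                    for (int dx = -1; dx <= 1; ++dx)
                        acc += src[(y + dy) * w + (x + dx)] *
                               kernel[(dy + 1) * 3 + (dx + 1)];
                dst[y * w + x] = static_cast<uint8_t>(std::clamp(acc, 0, 255));
            }
        }
    }
};

Even with this separation, a single debug session may span two instruction sets (say, ARM application code and a DSP kernel), which is precisely the gap these integrated tool suites aim to close.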

“Meeting the Critical Needs of Accuracy, Performance and Adaptability in Embedded Neural Networks,” a Presentation from Quadric

Aman Sikka, Chief Architect at Quadric, presents the “Meeting the Critical Needs of Accuracy, Performance and Adaptability in Embedded Neural Networks” tutorial at the May 2024 Embedded Vision Summit. In this presentation, Sikka explores the challenges of accuracy and performance when implementing quantized machine learning inference algorithms on embedded systems…

Read More »

STMicroelectronics Reveals ST BrightSense Image Sensor Ecosystem for Advanced Camera Performance Everywhere

Enables quicker and smarter designs of compact, power-efficient products for factory automation, robotics, AR/VR, and medical applications. Geneva, Switzerland, July 3, 2024 – STMicroelectronics has introduced a set of plug-and-play hardware kits, evaluation camera modules and software that ease development with its ST BrightSense global-shutter image sensors. The ecosystem lets developers of mass-market industrial and…

Read More »

“Build a Tiny Vision Application in Minutes with the Edge App SDK,” a Presentation from Midokura, a Sony Group Company

Dan Mihai Dumitriu, Chief Technology Officer at Midokura, a Sony Group company, presents the “Build a Tiny Vision Application in Minutes with the Edge App SDK” tutorial at the May 2024 Embedded Vision Summit. In the fast-paced world of embedded vision applications, moving rapidly from concept to deployment is crucial…

Read More »

Generate Traffic Insights Using YOLOv8 and NVIDIA JetPack 6.0

This article was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. Intelligent Transportation Systems (ITS) applications are becoming increasingly valuable and prevalent in modern urban environments. The benefits of using ITS applications include increasing traffic efficiency: by analyzing real-time traffic data, ITS can optimize traffic flow, reducing congestion and…

Read More »

Can Ceva Ignite the Yet-to-explode TinyML Market?

The IoT market has yet to see “explosive growth” in TinyML. Is that due to inadequate hardware, ever-shifting software or not enough ML skills in the embedded community? What’s at stake: TinyML in embedded systems can be implemented in many ways, often by leveraging beefed-up MCUs, DSPs, AI accelerators and Neural Processing Units (NPUs). The…

Read More »

“Intel’s Approach to Operationalizing AI in the Manufacturing Sector,” a Presentation from Intel

Tara Thimmanaik, AI Systems and Solutions Architect at Intel, presents the “Intel’s Approach to Operationalizing AI in the Manufacturing Sector” tutorial at the May 2024 Embedded Vision Summit. AI at the edge is powering a revolution in industrial IoT, from real-time processing and analytics that drive greater efficiency and learning…

Read More »

The Next Frontier in Education: How Generative AI and XR will Evolve the World of Learning in the Next Decade

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. (Ai)Daptive XR empowers students through real-time personalization and collaborative learning. Envisioning the future of education, and the art of learning overall, is nothing new. Over 120 years ago, French artist Jean-Marc Côté suggested how learning may look…

Read More »

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
