Nota AI Demonstration of Revolutionizing Driver Monitoring Systems

Tae-Ho Kim, CTO and Co-founder of Nota AI, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Kim demonstrates Nota DMS, his company’s state-of-the-art driver monitoring system. The solution enhances driver safety by monitoring attention and detecting drowsiness in real time. Cutting-edge AI techniques make Nota DMS […]
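
The excerpt above mentions real-time drowsiness detection. As a generic illustration only, not Nota DMS’s actual method, the sketch below computes the widely used eye aspect ratio (EAR) from six eye landmarks per eye and raises an alarm after a sustained run of "eyes closed" frames; the upstream landmark detector, threshold, and frame count are assumed placeholders.

```python
# Generic eye-aspect-ratio (EAR) drowsiness cue, an illustrative sketch only,
# not Nota DMS's algorithm. Assumes an upstream face-landmark detector
# supplies six (x, y) landmarks per eye for every frame.
import numpy as np

EAR_THRESHOLD = 0.21      # assumed value; tune per camera and driver
CLOSED_FRAMES_ALARM = 48  # roughly 2 seconds at 24 fps

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of landmarks ordered p1..p6 around the eye contour."""
    a = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    b = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    c = np.linalg.norm(eye[0] - eye[3])  # horizontal distance
    return (a + b) / (2.0 * c)

class DrowsinessMonitor:
    def __init__(self) -> None:
        self.closed_frames = 0

    def update(self, left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
        """Feed one frame's eye landmarks; returns True when an alarm should fire."""
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.closed_frames = self.closed_frames + 1 if ear < EAR_THRESHOLD else 0
        return self.closed_frames >= CLOSED_FRAMES_ALARM
```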

Nextchip Demonstration of Its Vision Professional ISP Optimization for Computer Vision

Sophie Jeon, Global Strategy Marketing Manager at Nextchip, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Jeon demonstrates her company’s expertise in optimizing ISPs for computer vision by comparing the tuning technologies used for human vision and machine vision applications.
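
To make the human-vision versus machine-vision comparison concrete, here is a toy sketch, not Nextchip’s tuning pipeline, contrasting a display-oriented gamma transfer curve with a near-linear path that simply normalizes sensor data for a downstream model; the gamma value and normalization are illustrative assumptions.

```python
# Toy contrast between a human-vision ISP output and a machine-vision path.
# This is an illustration only, not Nextchip's ISP tuning.
import numpy as np

def gamma_encode(linear: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Map linear sensor values in [0, 1] onto a display-oriented gamma curve."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

def machine_vision_path(linear: np.ndarray) -> np.ndarray:
    """Keep the response linear and simply normalize it for a downstream model."""
    lo, hi = float(linear.min()), float(linear.max())
    return (linear - lo) / (hi - lo + 1e-8)

raw = np.random.default_rng(0).random((480, 640)).astype(np.float32)  # stand-in sensor frame
for_display = gamma_encode(raw)          # brightened midtones for human viewing
for_detector = machine_vision_path(raw)  # near-linear data for a CV model
```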

Steering a Revolution: Optimized Automated Driving with Heterogeneous Compute

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm. Qualcomm Technologies’ latest whitepaper navigates the advantages of Snapdragon Ride Solutions based on heterogeneous compute SoCs. As the automotive industry continues to progress toward automated driving, advanced driver assistance systems (ADAS) are in high demand. These systems […]

Nextchip Demonstration of the APACHE5 ADAS SoC

Sophie Jeon, Global Strategy Marketing Manager at Nextchip, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Jeon demonstrates her company’s APACHE5 ADAS SoC. APACHE5 is ready for market with an accompanying SDK, and has passed all qualifications for production such as PPAP (the Production Part […]

Nextchip Demonstration of the APACHE6 ADAS SoC

Sophie Jeon, Global Strategy Marketing Manager at Nextchip, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Jeon demonstrates her company’s APACHE6 ADAS SoC. With advanced computing power, APACHE6 makes your vehicle smarter, helping it avoid risks while driving and parking.

Roboshuttles: A Promising Yet Challenging Mobility Solution

Roboshuttles are small, fully electric vehicles that operate at Level 4 autonomy, making them an ideal last-mile solution. They were once highly anticipated in the autonomous driving industry as a promising mobility solution, and at one point, over 25 companies were competing in this space. However, IDTechEx has observed a yearly decline in the number of […]

Top Camera Features that Empower Smart Traffic Management Systems

This blog post was originally published at e-con Systems’ website. It is reprinted here with the permission of e-con Systems. Traffic systems leverage camera solutions to empower smart cities to handle major traffic challenges. Some of their capabilities include real-time monitoring, incident detection, and law enforcement. Discover the camera’s role in these systems and the […]
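
As a hedged illustration of the real-time monitoring capability described above, not e-con Systems’ implementation, the sketch below uses OpenCV background subtraction to flag moving vehicles in a fixed traffic-camera view; the video path and minimum contour area are placeholder assumptions.

```python
# Background-subtraction sketch for a fixed traffic camera, an illustration
# only (not e-con Systems' implementation). "traffic.mp4" and the minimum
# contour area are placeholder assumptions.
import cv2

cap = cv2.VideoCapture("traffic.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=32)
MIN_VEHICLE_AREA = 1500  # pixels; depends on camera height and lens

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # MOG2 marks shadows as 127; keep only confident foreground pixels.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > MIN_VEHICLE_AREA]
    print(f"moving objects in this frame: {len(moving)}")

cap.release()
```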

Lattice Semiconductor Demonstration of a Low-latency Edge AI Sensor Bridge for NVIDIA’s Holoscan

Kambiz Khalilian, Director of Strategic Initiatives and Ecosystem Alliances for Lattice Semiconductor, demonstrates the company’s latest edge AI and vision technologies and products at the 2024 Embedded Vision Summit. Specifically, Khalilian demonstrates a low-latency edge AI sensor bridge solution for NVIDIA’s Holoscan. The Lattice FPGA-based Holoscan Sensor Bridge enables high-throughput and low-latency sensor aggregation and […]

Lumotive Demonstration of a Sensor Hub for LiDAR, Radar and Camera Fusion

Kevin Camera, Vice President of Product for Lumotive, demonstrates the company’s latest edge AI and vision technologies and products in Lattice Semiconductor’s booth at the 2024 Embedded Vision Summit. Specifically, Camera demonstrates a sensor hub for concurrently fusing and processing data from multiple sources: Velodyne’s VLP-16 LiDAR, Lumotive’s M30 solid-state LiDAR, Texas Instruments’ […]
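
As a generic sketch of the alignment step such a hub performs before fusion, not the Lumotive/Lattice implementation, the code below matches camera, LiDAR, and radar samples by nearest timestamp; the stream names and the 25 ms tolerance are assumptions.

```python
# Timestamp alignment for multi-sensor fusion, an illustrative sketch only
# (not the Lumotive/Lattice sensor hub). Stream names and the 25 ms
# tolerance are assumptions.
from bisect import bisect_left

def nearest(timestamps: list[float], t: float) -> int:
    """Index of the value in a sorted timestamp list closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def align(camera_ts: list[float], lidar_ts: list[float], radar_ts: list[float],
          tolerance_s: float = 0.025) -> list[tuple[int, int, int]]:
    """Match each camera frame with the nearest LiDAR and radar samples in time."""
    matches = []
    for ci, t in enumerate(camera_ts):
        li, ri = nearest(lidar_ts, t), nearest(radar_ts, t)
        if abs(lidar_ts[li] - t) <= tolerance_s and abs(radar_ts[ri] - t) <= tolerance_s:
            matches.append((ci, li, ri))  # index triple for one fused measurement set
    return matches

# Example: three streams with slightly offset clocks.
print(align([0.00, 0.033, 0.066], [0.001, 0.030, 0.064], [0.005, 0.035, 0.070]))
```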

Develop Generative AI-powered Visual AI Agents for the Edge

This blog post was originally published at NVIDIA’s website. It is reprinted here with the permission of NVIDIA. An exciting breakthrough in AI technology—Vision Language Models (VLMs)—offers a more dynamic and flexible method for video analysis. VLMs enable users to interact with image and video input using natural language, making the technology more accessible and […]
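
As a minimal sketch of the idea, not NVIDIA’s visual AI agent tooling, the code below samples frames from a video and asks a natural-language question of each one through Hugging Face’s visual-question-answering pipeline; the model choice, video path, question, and sampling interval are assumptions.

```python
# Natural-language questions over sampled video frames with a VLM, a minimal
# sketch only (not NVIDIA's visual AI agent stack). The model, video path,
# question, and sampling interval are assumptions.
import cv2
from PIL import Image
from transformers import pipeline

vqa = pipeline("visual-question-answering", model="dandelin/vilt-b32-finetuned-vqa")

cap = cv2.VideoCapture("loading_dock.mp4")
frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_index % 300 == 0:  # sample roughly every 10 s of 30 fps video
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        result = vqa(image=image, question="Is there a forklift in the scene?")
        print(frame_index, result[0]["answer"], result[0]["score"])
    frame_index += 1
cap.release()
```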
