Automotive Applications for Embedded Vision

Vision products in automotive applications can serve to enhance the driving experience by making us better and safer drivers through both driver and road monitoring.

Driver monitoring applications use computer vision to ensure that the driver remains alert and awake while operating the vehicle. These systems can monitor head movement and body language for indications that the driver is drowsy and thus poses a threat to others on the road. They can also watch for distracted-driving behaviors such as texting and eating, responding with a friendly reminder that encourages the driver to focus on the road instead.
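One common way such systems score drowsiness is PERCLOS: the fraction of recent frames in which the driver's eyes are closed. The sketch below is illustrative only, not any vendor's implementation; it assumes an upstream vision model has already classified the eye state per frame, and the window size and alert threshold are made-up values.

```python
from collections import deque


def perclos(eye_closed_flags):
    """PERCLOS: fraction of frames in which the eyes are closed."""
    if not eye_closed_flags:
        return 0.0
    return sum(eye_closed_flags) / len(eye_closed_flags)


class DrowsinessMonitor:
    """Sliding-window drowsiness check over per-frame eye-state flags.

    `window` and `threshold` are illustrative defaults, not values
    from any production system.
    """

    def __init__(self, window=90, threshold=0.4):
        self.flags = deque(maxlen=window)  # keeps only the last `window` frames
        self.threshold = threshold         # alert when PERCLOS exceeds this

    def update(self, eyes_closed):
        """Record one frame's eye state; return True if an alert should fire."""
        self.flags.append(bool(eyes_closed))
        return perclos(self.flags) > self.threshold
```

A caller would feed `update()` one boolean per camera frame and trigger the in-cabin reminder whenever it returns `True`; the bounded `deque` means old frames age out automatically.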

In addition to monitoring activities occurring inside the vehicle, exterior applications such as lane departure warning systems can use video with lane detection algorithms to recognize the lane markings and road edges and estimate the position of the car within the lane. The driver can then be warned in cases of unintentional lane departure. Solutions exist to read roadside warning signs and to alert the driver if they are not heeded, as well as for collision mitigation, blind spot detection, park and reverse assist, self-parking vehicles and event-data recording.
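The in-lane position estimate described above reduces to simple geometry once the lane markings have been detected (for example, via edge detection and a Hough transform). The sketch below assumes that detection step has already produced the x-coordinates where the left and right markings meet the bottom row of the image; the function names and the warning threshold are illustrative assumptions, not a reference design.

```python
def lane_offset(left_x, right_x, image_width):
    """Lateral offset of the vehicle from lane center, in pixels.

    left_x / right_x: x-coordinates where the detected lane markings
    intersect the bottom image row. Assumes the camera is mounted at
    the vehicle's centerline. Positive = vehicle right of lane center.
    """
    lane_center = (left_x + right_x) / 2.0
    return image_width / 2.0 - lane_center


def departure_warning(left_x, right_x, image_width, frac=0.2):
    """Warn when the offset exceeds a fraction of the lane width.

    `frac` is an illustrative threshold, not a calibrated value.
    """
    lane_width = right_x - left_x
    return abs(lane_offset(left_x, right_x, image_width)) > frac * lane_width
```

In a real system the pixel offset would be converted to meters using the camera calibration, and the warning would also be gated on turn-signal state so intentional lane changes do not trigger it.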

Eventually, this technology will lead to cars with self-driving capability; Google, for example, is already testing prototypes. However, many automotive industry experts believe that the goal of vision in vehicles is not so much to eliminate the driving experience as simply to make it safer, at least in the near term.

“Removing Weather-related Image Degradation at the Edge,” a Presentation from Rivian

Ramit Pahwa, Machine Learning Scientist at Rivian, presents the “Removing Weather-related Image Degradation at the Edge” tutorial at the May 2024 Embedded Vision Summit. For machines that operate outdoors—such as autonomous cars and trucks—image quality degradation due to weather conditions presents a significant challenge. For example, snow, rainfall and raindrops… “Removing Weather-related Image Degradation at

Read More »

D3 Embedded Introduces Camera Modules Based on Valens Semiconductor’s VA7000 MIPI A-PHY Chipsets

The integration of MIPI A-PHY into DesignCore® Series Cameras will accelerate time-to-market for customers developing performance-critical products for robotics, industrial vehicles, and other embedded vision applications. Rochester, NY – October 8th, 2024 – D3 Embedded, a global leader in embedded vision systems design and manufacturing, today announced that it has partnered with Valens Semiconductor, a

Read More »

MIPI Alliance Announces OEM, Expanded Ecosystem Support for MIPI A-PHY Automotive SerDes Specification

Global OEMs and other supply chain vendors embrace A-PHY to support next-generation ADAS and ADS applications BRIDGEWATER, N.J., October 9, 2024 – The MIPI Alliance, an international organization that develops interface specifications for mobile and mobile-influenced industries, today announced global OEMs and other automotive supply chain vendors have joined the growing ecosystem that is designing

Read More »

Nextchip Demonstration of a UHD Camera Reference Design Based On Its APACHE_U ISP

Barry Fitzgerald, local representative for Nextchip, demonstrates the company’s latest edge AI and vision technologies and products at the September 2024 Edge AI and Vision Alliance Forum. Specifically, Fitzgerald demonstrates a UHD camera reference design based on the company’s APACHE_U ISP in conjunction with an 8 Mpixel image sensor from fellow Alliance Member company Samsung.

Read More »

MIPI Alliance Releases A-PHY v2.0, Doubling Maximum Data Rate of Automotive SerDes Interface to Enable Emerging Vehicle Architectures

Industry-leading specification simplifies the integration of image sensors and displays to support next-generation ADAS and ADS applications BRIDGEWATER, N.J., Sept. 26, 2024 — The MIPI Alliance, an international organization that develops interface specifications for mobile and mobile-influenced industries, today announced the release of MIPI A-PHY v2.0, the next version of the automotive high-speed asymmetric serializer-deserializer

Read More »

Global Progress and Challenges for Autonomous Buses and Roboshuttles

In recent years, the promise of the public transport revolution has been teased by autonomous buses and roboshuttles. These technologies promise to deliver significant cost reductions for operators and alleviate labor pressures. Over 50 autonomous bus and roboshuttle players once competed in this space. However, as the autonomous driving industry evolved during 2022-2024, the commercialization

Read More »

“Future Radar Technologies and Applications,” a Presentation from IDTechEx

James Jeffs, Senior Technology Analyst at IDTechEx, presents the “Future Radar Technologies and Applications” tutorial at the May 2024 Embedded Vision Summit. Radar has value in a wide range of industries that are embracing automation, from delivery drones to agriculture, each requiring different performance attributes. Autonomous vehicles are perhaps one… “Future Radar Technologies and Applications,”

Read More »

Renesas Leads ADAS Innovation with Power-efficient 4th-generation R-Car Automotive SoCs

New R-Car V4M & V4H SoC Devices Target High-Volume L2 and L2+ ADAS Market While Maintaining Scalability and Software Reusability with Existing R-Car Devices TOKYO, Japan, September 24, 2024 ― Renesas Electronics Corporation (TSE:6723), a premier supplier of advanced semiconductor solutions, today expanded its popular R-Car Family of system-on-chips (SoCs) for entry-level Advanced Driver Assistance

Read More »

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411