Embedded Vision Insights: June 5, 2012 Edition


In this edition of Embedded Vision Insights:

LETTER FROM THE EDITOR

Dear Colleague,

Rarely is it the case that I'm able to construct an entire newsletter around a common concept. But this time, the planets aligned. Multiple pieces of new content have appeared on the Embedded Vision Alliance website in recent weeks, all focused on ADAS (advanced driver assistance systems), thereby enabling me to publish a "theme" edition. And the popularity of this budding application shouldn't be a surprise to embedded vision industry observers and participants, particularly if you've been tracking the Alliance website as it has expanded over time.

In late March, after all, Analog Devices unveiled a family of four new multi-core DSPs, two of which contain embedded vision co-processors and specifically target the ADAS application space. Last November, fellow Alliance Platinum member Xilinx published a detailed application note on the opportunities for FPGAs in ADAS systems, suggesting a corporate focus on this market segment, too. IMS Research senior analyst Tom Hackenberg spoke at length about promising ADAS opportunities, fueled in no small part by pending legislation both in the United States and elsewhere, during his market trends presentation at the most recent Embedded Vision Alliance Summit. And as you'll soon see, Platinum member Texas Instruments is bullish on ADAS, too, as are many other companies in the Alliance.

Some of you may already have rear-view cameras in your vehicles, although their capabilities are likely still quite "dumb"; they simply display their view on an LCD and rely on you to detect and react to objects behind you. But, as some luxury automobiles already implement (and aggressively promote), in-car cameras are poised to explode both in their functions and their per-vehicle count, as well as to migrate beyond high-end models into mainstream vehicles. The potential operating modes are myriad, whether standalone or paired with synergistic technologies such as infrared, radar, and ultrasound, and they span various implementation forms (a simple code sketch follows the list):

  • Rear collision warning
  • Front collision warning and active avoidance (i.e. automatic braking)
  • Driver distraction and drowsiness alerts
  • Inadvertent lane change warning and active avoidance (i.e. steering override)
  • Adaptive cruise control
  • Headlight high beam auto-disable, and
  • Roadway sign recognition and alerts (e.g. excessive speed warnings, road construction heads-ups, and the like)
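
To make the vision-processing side of these functions a bit more concrete, here is a deliberately simplified lane-departure sketch using the open-source OpenCV library. The function name, thresholds, and region-of-interest choices are illustrative assumptions on my part, not drawn from any vendor's production ADAS stack:

    import cv2
    import numpy as np

    def lane_departure_warning(frame, center_tolerance=0.15):
        """Return True if the estimated lane center drifts too far from the image center."""
        h, w = frame.shape[:2]
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)              # edge map of the road scene
        roi = edges[int(h * 0.6):, :]                 # keep only the lower (road) portion of the frame
        lines = cv2.HoughLinesP(roi, 1, np.pi / 180, threshold=40,
                                minLineLength=40, maxLineGap=20)
        if lines is None:
            return False                              # no lane markings detected; stay quiet
        # Average the line endpoints' x coordinates as a crude lane-center estimate
        xs = [x for line in lines for x in (line[0][0], line[0][2])]
        lane_center = sum(xs) / len(xs)
        offset = abs(lane_center - w / 2) / w         # normalized drift from the frame's center
        return offset > center_tolerance              # warn when the drift exceeds the tolerance

Production systems obviously go much further (camera calibration, perspective correction, temporal filtering, and fusion with radar and ultrasound data, for starters), but even this toy version hints at why dedicated vision co-processors, FPGAs, and DSPs are attracting so much ADAS attention.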

For more information on this promising embedded vision application, I encourage you to check out the newly published article listed below from Analog Devices, as well as a brand new white paper from Texas Instruments. Spend some time, too, watching my recently published video interview with IMS Research senior market analyst Helena Perslow, who specializes in various automotive technologies. And in another showcased video, Embedded Vision Alliance founder Jeff Bier demonstrates the Mobileye vehicle safety system, which Perslow mentions in her discussion with me, and which Bier has installed in the family minivan.

These are exciting times for embedded vision, both in ADAS and elsewhere. Case in point: I'd like to welcome videantis GmbH, the Embedded Vision Alliance's newest member, to our ever-expanding and vibrant organization. To keep on top of industry and Alliance developments as they occur, I encourage you to rely not just on this twice-monthly newsletter but also to subscribe to the Embedded Vision Alliance website's RSS feed and follow the Alliance's various social media channels on Facebook, LinkedIn, and Twitter, all of which are updated each time a news writeup or other piece of content is added to the site. As always, I welcome your feedback on how the Embedded Vision Alliance can better serve your needs. Thank you for your support of the Alliance, and for your interest in and contributions to embedded vision technologies, products, and applications.

Brian Dipert
Editor-In-Chief, Embedded Vision Alliance

FEATURED VIDEOS

An Introduction to the Market for Embedded Vision in Automotive Driver Assistance Applications
Brian Dipert interviews Helena Perslow, Senior Market Analyst at IMS Research, in this second video of a planned series with various IMS Research analysts. Brian and Helena discuss (among other things) the various applications for embedded vision in ADAS, the degree to which different ADAS functions may share a common camera versus requiring dedicated image sensors, processors, and other components, the alternative technologies capable of implementing various ADAS functions and how a vision-based approach compares against (and in some cases complements) them, the size of both the overall market and its subsets, and various market trends.

A Demonstration of the Mobileye Vehicle Safety System
Jeff Bier takes to the streets to show the Embedded Vision Alliance community that he is living the embedded vision lifestyle, with a demonstration of the Mobileye system he's installed in the family minivan.

More Videos

FEATURED ARTICLES

Analog Devices Enables Mass Deployments of Camera-Based ADAS
Making the automotive environment safer by reducing injuries and fatalities is a perennial hot topic in the automotive industry, and an aspiration that's gated only by the availability of commercially deployable technology. Active safety systems, also known as ADAS (advanced driver assistance systems), represent a major emerging market trend. The next major technology innovation after ABS (anti-lock braking systems) and stability control systems, which can be considered standard today, ADAS is rapidly gaining adoption. Analog Devices has just introduced the Blackfin BF60x family of processors with a "Pipelined Vision Processor" (PVP) targeted at automotive ADAS vision applications, supporting the newly increased Euro NCAP car safety demands while enabling customer products to achieve ISO 26262 compliance. More

"Get Smart" With TI's Embedded Analytics Technology
When a driver starts a car, he doesn't think about starting an intelligent analytics system; sometimes, however, that's precisely what he's doing. In the future, we will encounter intelligent systems more often as embedded analytics is added to applications such as automotive vision, security and surveillance systems, industrial and factory automation, and a host of other consumer applications. High-performance, programmable and low-power DSPs (digital signal processors) are providing the foundation for a new wave of embedded analytics systems capable of gathering data on their own, processing it in real time, reaching conclusions and taking actions. More
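
As a rough illustration of the gather/process/conclude/act loop described above (and emphatically not TI's actual implementation), here is a minimal sketch in Python with OpenCV that watches a video source and "takes action" - in this case merely printing an alert - when it concludes that significant motion has occurred. The source index and pixel threshold are arbitrary placeholder values:

    import cv2

    def run_analytics_loop(source=0, motion_pixel_threshold=5000):
        """Gather frames, process them, reach a conclusion, and take a (placeholder) action."""
        capture = cv2.VideoCapture(source)              # gather: frames from a camera or video file
        subtractor = cv2.createBackgroundSubtractorMOG2()
        while True:
            ok, frame = capture.read()
            if not ok:
                break                                   # end of stream or camera unavailable
            mask = subtractor.apply(frame)              # process: separate moving foreground pixels
            motion = cv2.countNonZero(mask)
            if motion > motion_pixel_threshold:         # conclude: "something significant moved"
                print("Motion detected:", motion, "foreground pixels")   # act (placeholder)
        capture.release()

An embedded deployment would run an analogous loop on a DSP or vision co-processor rather than a desktop, but the sense-analyze-decide-act structure is the same.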

More Articles

FEATURED NEWS

Microsoft's Kinect SDK v1.5: Now Live

SceneTap: A Face Detection Privacy Flap

RecognizeMe: Underwhelming Facial Recognition, At Least Initially

The IEEE: Covering Embedded Vision Extensively

The Evolution Of Gesture Interfaces: Leap Motion Achieves A Press Coverage Clean Sweep

More News

