Embedded Vision Insights: February 25, 2014 Edition


In this edition of Embedded Vision Insights:

LETTER FROM THE EDITOR

Dear Colleague,

In the February 11 edition of this newsletter, I told you about the morning keynote presentation planned for the May 29 Embedded Vision Summit West, by Yann LeCun, Director of AI Research at Facebook and a distinguished professor at New York University. I now have the pleasure of telling you about the event's afternoon keynote presentation. It's by Nathaniel Fairfield, the technical lead on Google's self-driving car team, which you can learn more about by watching the "Self Driving Car Test: Steve Mahan" video on YouTube. LeCun's and Fairfield's presentations will respectively address the Summit's two foundational themes: recognition and autonomy.

The Embedded Vision Summit West, a technical educational forum for engineers interested in incorporating visual intelligence into electronic systems and software, will take place on May 29 in Santa Clara, California. The "Early Bird" reduced registration fee of $149 is only available through this Friday, so don't delay in registering. This year's Summit is co-located with the Augmented World Expo (AWE), and Embedded Vision Summit attendees may obtain an AWE Exhibits-Only Pass for only $20, an 80% discount. Just add the AWE Exhibits-Only Pass to your online registration submission for the Embedded Vision Summit. And it's not too early to begin booking your transportation and hotel reservations, either; for assistance in these matters, see the newly launched Summit travel page.

Nearer term, the next Alliance Member Meeting will be held at Qualcomm's corporate headquarters campus in San Diego, CA on March 13. We will be expanding the attendee list for the afternoon portion of the day's program, and have a limited amount of space available for guests who are not affiliated with Alliance Member companies. This will be a unique opportunity to network with Alliance Member company representatives, learn about the products and services available to assist you in completing embedded vision-inclusive designs, and evaluate possible Alliance membership for your own company.

Planned afternoon presentations include:

  • "Vision-based Navigation Applications: from Planetary Exploration to Consumer Devices" by Larry Matthies, Supervisor in the Computer Vision Group at NASA’s Jet Propulsion Laboratory
  • "Who Watches the Machines Watching You? Regulating Privacy in the Era of Machines That See" by Brian Wassom, Partner and Chair of the Social, Mobile, and Emerging Media Practice Group at Honigman Miller Schwartz and Cohn LLP
  • "Recent Developments in Khronos Standards for Embedded Vision" by Neil Trevett, President of Khronos, and
  • "Forecasting Consumer Adoption of Embedded Vision in Mobile Devices in 2014" by John Feland, PhD, CEO of Argus Insights

You'll also have the opportunity to audition Alliance Member companies' technology and product demonstrations during the mid-afternoon break. If you're interested in attending the afternoon portion of the March 13 Alliance Member Meeting, please email your credentials to [email protected] for consideration. Priority will be given to applicants currently working with embedded vision technology.

Thanks as always for your support of the Embedded Vision Alliance, and for your interest in and contributions to embedded vision technologies, products and applications. And please don't hesitate to let me know how the Alliance can better serve your needs.

Brian Dipert
Editor-In-Chief, Embedded Vision Alliance

FEATURED VIDEOS

Embedded Vision Summit Technical Presentation: "Software Approach for Easing Embedded Acceleration of OpenCV Applications," Mark Kulaczewski, videantis
Mark Kulaczewski, VP of System Integration and Co-Founder at videantis, presents the "Software Approach for Easing Embedded Acceleration of OpenCV Applications" tutorial within the "Developing Vision Software, Accelerators and Systems" technical session at the April 2013 Embedded Vision Summit.

Embedded Vision Alliance Member Meeting Product Demonstration: SoftKinetic
Tim Droz, Vice President of U.S. Operations at SoftKinetic, demonstrates a depth-sensing camera-enhanced Oculus Rift virtual reality headset (worn by fellow Alliance member Gary Brown of Cadence) at the December 2013 Embedded Vision Alliance Member Meeting.

More Videos

FEATURED ARTICLES

Improved Vision Processors, Sensors Enable Proliferation of New and Enhanced ADAS Functions
Thanks to the emergence of increasingly capable and cost-effective processors, image sensors, memories and other semiconductor devices, along with robust algorithms, it's now practical to incorporate computer vision into a wide range of embedded systems, enabling those systems to analyze their environments via video and still image inputs. Automotive ADAS (advanced driver assistance systems) designs are among the early success stories in the burgeoning embedded vision era, and their usage is rapidly expanding beyond high-end vehicles into high-volume mainstream implementations, and into a diversity of specific ADAS applications. More
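For readers curious what "analyzing the environment via video and still image inputs" can look like in practice, below is a minimal, illustrative sketch in C++ using stock OpenCV calls: it reads camera frames and extracts line segments of the kind a lane-keeping ADAS function might track. It is not drawn from the article above; the camera index and threshold values are illustrative assumptions, not tuned parameters.

  // Minimal sketch (illustrative only): frame-by-frame edge and line extraction
  // with OpenCV, the kind of low-level processing an ADAS vision pipeline performs.
  #include <opencv2/opencv.hpp>
  #include <vector>

  int main() {
      cv::VideoCapture cap(0);                              // open the default camera (assumed index 0)
      if (!cap.isOpened()) return 1;

      cv::Mat frame, gray, edges;
      while (cap.read(frame)) {
          cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);    // drop color
          cv::GaussianBlur(gray, gray, cv::Size(5, 5), 1.5); // suppress sensor noise
          cv::Canny(gray, edges, 50, 150);                  // detect edges (e.g., lane markings)

          std::vector<cv::Vec4i> lines;
          cv::HoughLinesP(edges, lines, 1, CV_PI / 180, 50, 50, 10); // extract line segments

          for (const auto& l : lines)                       // overlay detected segments
              cv::line(frame, {l[0], l[1]}, {l[2], l[3]}, {0, 255, 0}, 2);

          cv::imshow("detected lines", frame);
          if (cv::waitKey(1) == 27) break;                  // ESC to quit
      }
      return 0;
  }

Production ADAS implementations run far more sophisticated algorithms on dedicated vision processors, but the per-frame structure (acquire, preprocess, detect, act) is the same.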

The Challenge of Airport Perimeter Security Highlighted By Christmas Day Breaches
Protecting the perimeter of airports remains a difficult task across the US. On December 25, 2013, perimeter fence breaches occurred at two separate airports, in Newark, New Jersey, and Phoenix, Arizona. In 2013, IHS estimated the market for electronic perimeter security at US airports to be worth about $25 million for hardware and software alone, or about 9% of the overall US perimeter security market. More

More Articles

FEATURED NEWS

CEVA Enriches its Computer Vision Software Library for Flagship CEVA-MM3101 Imaging and Vision Platform

Imagination Releases First IP Core Based on Innovative PowerVR Raptor Imaging Processor Architecture

videantis Showcases Unified Video/Vision Processor Solution at Mobile World Congress

3D Printing: A "Killer" App for Cameras Capable of Depth Sensing?

IEEE Publications: Excellent Sources of Embedded Vision Information

More News
