Ali Ors, Director of R&D, ADAS at NXP Semiconductors, demonstrates the company's latest embedded vision technologies and products at the May 2017 Embedded Vision Summit. Specifically, Ors shows a real-time CNN image classifier and a pedestrian detector running on NXP's automotive-grade S32V234 ADAS vision processor. The first demonstration uses an optimized implementation of a convolutional neural network to classify objects in images displayed randomly on-screen into one of 1,000 distinct classes; the second performs highly accurate pedestrian detection based on ACF (aggregate channel features). Both applications run in real time on the APEX vision processing cores of the S32V234 SoC, demonstrating that high-frame-rate, high-performance deep learning inference engines can be achieved in embedded systems with very low power dissipation.
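The 1,000-class classification step described above can be illustrated with a minimal, hypothetical sketch. This is not NXP's APEX implementation; it simply shows, in NumPy, how a classifier's final-layer scores (logits) are typically converted to class probabilities with a softmax and reduced to a top-k prediction. The class index and logit values are invented for illustration.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def classify(logits, top_k=5):
    # Return the top-k (class_index, probability) pairs, highest first.
    probs = softmax(logits)
    top = np.argsort(probs)[::-1][:top_k]
    return [(int(i), float(probs[i])) for i in top]

# 1,000 hypothetical class scores, as a CNN's final layer might emit.
rng = np.random.default_rng(0)
logits = rng.normal(size=1000)
logits[417] = 10.0  # pretend class 417 strongly dominates
print(classify(logits, top_k=3))
```

In an embedded deployment such as the one demonstrated, the convolutional layers producing these logits would run on the vision accelerator, with only this lightweight post-processing left for the host.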