Tools

“Training CNNs for Efficient Inference,” a Presentation from Imagination Technologies

Paul Brasnett, Principal Research Engineer at Imagination Technologies, presents the "Training CNNs for Efficient Inference" tutorial at the May 2017 Embedded Vision Summit. Key challenges to the successful deployment of CNNs in embedded markets lie in meeting their compute, bandwidth and power requirements. Typically, for mobile devices, the problem lies in inference, since the […]

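One of the most common ways to attack those compute, bandwidth and power costs is to reduce the numerical precision of the trained network. As a minimal illustration (a generic sketch, not material from the presentation), the Python snippet below quantizes a hypothetical float32 convolution kernel to int8 with a single per-tensor scale, cutting weight storage and memory traffic by roughly 4x:

import numpy as np

def quantize_symmetric_int8(weights):
    """Map float32 weights to int8 plus one per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0            # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights for an accuracy check."""
    return q.astype(np.float32) * scale

# Hypothetical 3x3 kernel with 64 input and 64 output channels.
w = np.random.randn(64, 64, 3, 3).astype(np.float32)
q, scale = quantize_symmetric_int8(w)
err = np.abs(w - dequantize(q, scale)).max()
print(f"int8 storage: {q.nbytes} bytes vs float32: {w.nbytes} bytes, max abs error {err:.4f}")

In practice the reduced precision is usually folded into training or fine-tuning so the network learns to tolerate it, which is the training-time angle the presentation's title points to.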

“Designing Deep Neural Network Algorithms for Embedded Devices,” a Presentation from Intel

Minje Park, Software Engineering Manager at Intel, presents the "Designing Deep Neural Network Algorithms for Embedded Devices" tutorial at the May 2017 Embedded Vision Summit. Deep neural networks have shown state-of-the-art results in a variety of vision tasks. Although accurate, most of these deep neural networks are computationally intensive, creating challenges for embedded devices. In […]

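To make that compute burden concrete, a frequent embedded-friendly redesign is to replace standard convolutions with depthwise-separable ones. The arithmetic sketch below (a generic illustration with a hypothetical layer size, not content from the presentation) compares the multiply-accumulate (MAC) counts of the two:

def standard_conv_macs(h, w, c_in, c_out, k):
    # Every output pixel combines a k x k window across all input channels,
    # for every output channel.
    return h * w * c_in * c_out * k * k

def depthwise_separable_macs(h, w, c_in, c_out, k):
    depthwise = h * w * c_in * k * k    # one k x k filter per input channel
    pointwise = h * w * c_in * c_out    # 1x1 convolution to mix channels
    return depthwise + pointwise

# Hypothetical layer: 112x112 feature map, 64 -> 128 channels, 3x3 kernel.
std = standard_conv_macs(112, 112, 64, 128, 3)
sep = depthwise_separable_macs(112, 112, 64, 128, 3)
print(f"standard: {std:,} MACs  separable: {sep:,} MACs  ratio: {std / sep:.1f}x")

For this layer the separable form needs roughly 8x fewer MACs, which is why MobileNet-style architectures are common starting points for embedded targets.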

“Implementing the TensorFlow Deep Learning Framework on Qualcomm’s Low-power DSP,” a Presentation from Google

Pete Warden, Research Engineer at Google, presents the "Implementing the TensorFlow Deep Learning Framework on Qualcomm’s Low-power DSP" tutorial at the May 2017 Embedded Vision Summit. TensorFlow, Google’s second-generation deep learning software framework, was designed from the ground up to enable efficient implementation of deep learning algorithms at different scales, from high-performance data […]

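As a point of reference for readers who have not used the framework, the sketch below builds a tiny convolutional model in TensorFlow and converts it into a flat buffer suitable for an on-device runtime. It uses today's tf.keras and TensorFlow Lite converter APIs purely as an illustration of the workflow; it is not the Hexagon DSP toolchain described in the talk, and converter details vary across TensorFlow releases.

import tensorflow as tf

# A deliberately small CNN; the input shape and layer sizes here are arbitrary.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(96, 96, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert to TensorFlow Lite for mobile/embedded inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enables default weight quantization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)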

Software Frameworks and Toolsets for Deep Learning-based Vision Processing

This article provides both background and implementation-level detail on software frameworks and toolsets for deep learning-based vision processing, an increasingly popular and robust alternative to classical computer vision algorithms. It covers the leading available software framework options, the root reasons for their abundance, and guidelines for selecting an optimal approach among the candidates for a […]

May 2017 Embedded Vision Summit Vision Tank Competition Finalist Presentations

Adam Rowell, CTO of Lucid VR; Nitsa Einan, VP of Business Development at Imagry; Anthony Ashbrook, Founder and CEO of Machines With Vision; Grace Tsai, Founding Engineer at PerceptIn; and Grégoire Gentil, Founder of Always Innovating, deliver their Vision Tank finalist presentations at the May 2017 Embedded Vision Summit. The Vision Tank, a unique spin […]

“OpenCV on Zynq: Accelerating 4k60 Dense Optical Flow and Stereo Vision,” a Presentation from Xilinx

Nick Ni, Senior Product Manager for SDSoC and Embedded Vision at Xilinx, presents the "OpenCV on Zynq: Accelerating 4k60 Dense Optical Flow and Stereo Vision" tutorial at the May 2017 Embedded Vision Summit. OpenCV libraries are widely used for algorithm prototyping by many leading technology companies and computer vision researchers. FPGAs can achieve unparalleled compute […]

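For readers who want to see what the accelerated kernels look like on a CPU before moving them into FPGA fabric, the OpenCV Python sketch below runs the two algorithms named in the title: dense optical flow and stereo block matching. It is a plain desktop illustration, not the Xilinx SDSoC flow from the talk, and the image file names are hypothetical placeholders.

import cv2

# Two consecutive video frames and a rectified stereo pair (placeholder files).
prev_gray  = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
next_gray  = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
left_gray  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)
right_gray = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Farneback dense optical flow: one 2-D motion vector per pixel.
flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Block-matching stereo: disparity map from the rectified left/right pair.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left_gray, right_gray)

print("flow:", flow.shape, "disparity:", disparity.shape)

Profiling kernels like these on the CPU is a common first step before deciding which stages to offload to programmable logic.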

May 2017 Embedded Vision Summit Vision Entrepreneurs’ Panel

Chris Rowen, CEO of Cognite Ventures, moderates the Vision Entrepreneurs' Panel at the May 2017 Embedded Vision Summit. Other panelists include Mark Bowles, founder of ecoATM; Michael Tusch, CEO; and Remi El-Ouazzane, CEO of Movidius (an Intel company). What can we learn from leaders of successful vision-based start-ups? The expanding applications of embedded vision are […]
