MathWorks' GPU Coder is the 2018 Vision Product of the Year Award Winner in the Software and Algorithms category. The new MathWorks® GPU Coder software enables scientists and engineers to automatically generate optimized CUDA code from high-level functional descriptions in MATLAB® for deep learning, embedded vision, and autonomous systems. The generated CUDA code, integrated into projects as source code or libraries, accelerates computationally intensive portions of the MATLAB code on modern GPUs, including NVIDIA Tesla® GPUs, embedded NVIDIA Tegra® System-on-Chip (SoC) devices, NVIDIA Jetson™ System-on-Modules (SoMs), and NVIDIA DRIVE™ platforms. This automated workflow provides easy access to GPUs without requiring expert knowledge of GPU programming.
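As a rough sketch of the workflow described above (assuming a GPU Coder installation; the function name `myFilter` and the input size are illustrative, not taken from the announcement), generating CUDA code from a MATLAB function typically looks like this:

```matlab
% Illustrative sketch only: myFilter is a hypothetical MATLAB function
% operating on a single-precision image.
cfg = coder.gpuConfig('lib');    % configure GPU Coder to emit a static library
cfg.GenerateReport = true;       % also produce a code-generation report

% Generate CUDA code for a 224x224x3 single-precision input.
codegen -config cfg myFilter -args {ones(224,224,3,'single')}
```

The generated CUDA source or library can then be integrated into a larger C/C++ project, as the announcement describes.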
MathWorks provides software tools for embedded vision. MATLAB is a high-level language and interactive environment for algorithm development, data analysis, visualization, and numeric computing. Several key toolboxes extend MATLAB for embedded vision: Computer Vision System Toolbox, Image Processing Toolbox, and Statistics Toolbox. Together, these products enable you to work much faster than with traditional programming languages such as C or C++. Model-Based Design with MathWorks products enables the migration of embedded vision algorithms into real-world hardware while meeting tight constraints on performance, power, latency, and cost. MathWorks tools support efficient design trade-off exploration and design verification. Using products such as MATLAB Coder, Simulink, Simulink Coder, Embedded Coder, and HDL Coder, engineers can migrate embedded vision algorithms from MATLAB and Simulink into embedded systems. To learn more about MathWorks, please visit www.mathworks.com.
Applications for the 2019 Vision Product of the Year Awards are now being accepted through March 8, 2019. If you're an employee of an Embedded Vision Alliance member company, please see here for an online application form. If you're interested in becoming a member of the Embedded Vision Alliance, please see here for more information.