Sensors and Cameras

“Designing Your Next Vision Product Using a Systems Approach,” a Presentation from Teknique

Ben Bodley, CEO of Teknique, presents the “Designing Your Next Vision Product Using a Systems Approach” tutorial at the May 2019 Embedded Vision Summit. Today it’s easier than ever to create a credible demo of a new smart camera product for a specific application. But the distance from a demo to a robust product is […]

“Using Blockchain to Create Trusted Embedded Vision Systems,” a Presentation from Basler

Thies Möller, Technical Architect at Basler, presents the “Using Blockchain to Create Trusted Embedded Vision Systems” tutorial at the May 2019 Embedded Vision Summit. In many IoT architectures, sensor data must be passed to cloud services for further processing. Traditionally, “trusted third parties” have been used to secure this data. In this talk, Möller explores […]
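
As a rough, generic illustration of the underlying idea (not Basler’s architecture), the sketch below chains SHA-256 hashes over successive sensor payloads so that later tampering with stored data can be detected without trusting an intermediary; the payloads and field names are hypothetical.

```python
import hashlib
import time

def hash_record(prev_hash: str, payload: bytes, timestamp: float) -> str:
    """Hash the previous link together with a new sensor payload."""
    h = hashlib.sha256()
    h.update(prev_hash.encode("utf-8"))
    h.update(payload)
    h.update(repr(timestamp).encode("utf-8"))
    return h.hexdigest()

def build_chain(payloads):
    """Build a tamper-evident hash chain over a sequence of sensor payloads."""
    chain, prev_hash = [], "0" * 64  # fixed genesis value
    for payload in payloads:
        ts = time.time()
        link = {"timestamp": ts, "hash": hash_record(prev_hash, payload, ts)}
        chain.append(link)
        prev_hash = link["hash"]
    return chain

def verify_chain(payloads, chain) -> bool:
    """Recompute every link from the stored payloads and compare the hashes."""
    prev_hash = "0" * 64
    for payload, link in zip(payloads, chain):
        if hash_record(prev_hash, payload, link["timestamp"]) != link["hash"]:
            return False
        prev_hash = link["hash"]
    return True

if __name__ == "__main__":
    frames = [b"frame-0", b"frame-1", b"frame-2"]
    chain = build_chain(frames)
    print(verify_chain(frames, chain))   # True
    frames[1] = b"tampered"
    print(verify_chain(frames, chain))   # False
```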

“REAL3 Time of Flight: A New Differentiator for Mobile Phones,” a Presentation from Infineon Technologies

Walter Bell, 3D Imaging Application Engineer at Infineon Technologies, presents the “REAL3 Time of Flight: A New Differentiator for Mobile Phones” tutorial at the May 2019 Embedded Vision Summit. In 2019, 3D imaging has become mainstream in mobile phone cameras. What started in 2016 with the first two smartphones using an Infineon 3D time-of-flight (ToF) […]

“Sensory Fusion for Scalable Indoor Navigation,” a Presentation from Brain Corp

Oleg Sinyavskiy, Director of Research and Development at Brain Corp, presents the “Sensory Fusion for Scalable Indoor Navigation” tutorial at the May 2019 Embedded Vision Summit. Indoor autonomous navigation requires using a variety of sensors in different modalities. Merging together RGB, depth, lidar and odometry data streams to achieve autonomous operation requires a fusion of […]
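
As a generic illustration of one common fusion pattern (not Brain Corp’s actual pipeline), the sketch below buffers timestamped depth, lidar and odometry samples and pairs each RGB frame with the nearest-in-time reading from each stream; all names and thresholds are illustrative.

```python
from bisect import bisect_left
from dataclasses import dataclass, field
from typing import Any, List, Tuple

@dataclass
class StreamBuffer:
    """Timestamped samples from one sensor, pushed in arrival (time) order."""
    samples: List[Tuple[float, Any]] = field(default_factory=list)

    def push(self, timestamp: float, value: Any) -> None:
        self.samples.append((timestamp, value))

    def nearest(self, timestamp: float) -> Tuple[float, Any]:
        """Return the buffered sample closest in time to `timestamp`."""
        if not self.samples:
            raise ValueError("no samples buffered yet")
        times = [t for t, _ in self.samples]
        i = bisect_left(times, timestamp)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - timestamp))
        return self.samples[best]

def fuse(rgb_ts: float, rgb_frame: Any, depth: StreamBuffer,
         lidar: StreamBuffer, odom: StreamBuffer, max_skew: float = 0.05):
    """Pair one RGB frame with the temporally closest depth, lidar and odometry
    samples; reject the set if any stream is more than `max_skew` seconds away."""
    fused = {"rgb": (rgb_ts, rgb_frame)}
    for name, buf in (("depth", depth), ("lidar", lidar), ("odometry", odom)):
        ts, value = buf.nearest(rgb_ts)
        if abs(ts - rgb_ts) > max_skew:
            return None  # streams too far apart in time to fuse safely
        fused[name] = (ts, value)
    return fused
```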

“Enabling the Next Kitchen Experience Through Embedded Vision,” a Presentation from Whirlpool

Sugosh Venkataraman, Vice President of Technology at Whirlpool, presents the “Enabling the Next Kitchen Experience Through Embedded Vision” tutorial at the May 2019 Embedded Vision Summit. Our kitchens are the hubs where we spend quality time with family and friends, preparing and eating meals. Today, instructions for cooking a particular meal are just a few […]

“Applied Depth Sensing with Intel RealSense,” a Presentation from Intel

Sergey Dorodnicov, Software Architect at Intel, presents the “Applied Depth Sensing with Intel RealSense” tutorial at the May 2019 Embedded Vision Summit. As robust depth cameras become more affordable, many new products will benefit from true 3D vision. This presentation highlights the benefits of depth sensing for tasks such as autonomous navigation, collision avoidance and […]
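
For readers who want to experiment, here is a minimal sketch using the pyrealsense2 Python bindings for the RealSense SDK to read one depth frame and query the distance at the image center, the kind of measurement a collision-avoidance check might start from; it assumes a depth-capable camera is attached and is not taken from the talk.

```python
import pyrealsense2 as rs

# Stream 640x480 depth at 30 fps from an attached RealSense depth camera.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    if depth:
        # Distance in meters to whatever lies at the center pixel.
        print(f"Distance at image center: {depth.get_distance(320, 240):.2f} m")
finally:
    pipeline.stop()
```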

“Neuromorphic Event-based Vision: From Disruption to Adoption at Scale,” a Presentation from Prophesee

Luca Verre, Co-founder and CEO of Prophesee, presents the “Neuromorphic Event-based Vision: From Disruption to Adoption at Scale” tutorial at the May 2019 Embedded Vision Summit. Neuromorphic event-based vision is a new paradigm in imaging technology, inspired by human biology. It promises to dramatically improve machines’ ability to sense their environments and make intelligent decisions […]
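
For context on the data model: instead of full frames at a fixed rate, an event camera emits a sparse stream of per-pixel brightness-change events, each carrying pixel coordinates, a timestamp and a polarity. The generic NumPy sketch below (not Prophesee’s SDK) accumulates such events into a signed 2D image over a time window; the resolution and sample events are hypothetical.

```python
import numpy as np

WIDTH, HEIGHT = 640, 480  # hypothetical sensor resolution

def accumulate_events(events: np.ndarray, t_start: float, t_end: float) -> np.ndarray:
    """Sum event polarities per pixel over [t_start, t_end), producing a signed
    image in which the edges of moving objects stand out.

    `events` has one row per event: (x, y, timestamp, polarity in {-1, +1})."""
    x, y, t, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    mask = (t >= t_start) & (t < t_end)
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.int32)
    np.add.at(frame, (y[mask].astype(int), x[mask].astype(int)), p[mask].astype(int))
    return frame

if __name__ == "__main__":
    events = np.array([
        [100, 120, 10.0, +1],
        [101, 120, 12.0, -1],
        [100, 121, 15.0, +1],
    ])
    print(np.count_nonzero(accumulate_events(events, 0.0, 20.0)))  # 3 active pixels
```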

“Deep Learning for Manufacturing Inspection Applications,” a Presentation from FLIR Systems

Stephen Se, Research Manager at FLIR Systems, presents the “Deep Learning for Manufacturing Inspection Applications” tutorial at the May 2019 Embedded Vision Summit. Recently, deep learning has revolutionized artificial intelligence and has been shown to provide the best solutions to many problems in computer vision, image classification, speech recognition and natural language processing. Se presents […]

“Deploying Visual SLAM in Low-power Devices,” a Presentation from CEVA

Ben Weiss, Customer Solutions Engineer in the CSG Group at CEVA, presents the “Deploying Visual SLAM in Low-power Devices” tutorial at the May 2019 Embedded Vision Summit. Simultaneous localization and mapping (SLAM) technology has been evolving for quite some time, including visual SLAM, which relies primarily on image data. But implementing fast, accurate visual SLAM […]
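
As a generic illustration of a feature-based visual SLAM front end (not CEVA’s implementation), the OpenCV sketch below detects ORB keypoints in two consecutive grayscale frames and matches them, the kind of per-frame workload that has to fit within a low-power compute budget; the frame filenames are placeholders.

```python
import cv2

def match_features(prev_gray, curr_gray, max_matches: int = 200):
    """Detect ORB keypoints in two consecutive grayscale frames and return the
    best brute-force matches, a typical front-end step in feature-based visual SLAM."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return []  # not enough texture to track
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return matches[:max_matches]

if __name__ == "__main__":
    # Example usage with two consecutive frames loaded as grayscale images.
    prev_gray = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr_gray = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    print(len(match_features(prev_gray, curr_gray)), "matches")
```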

“Challenges and Approaches for Extracting Meaning from Satellite Imagery,” a Presentation from Orbital Insight

Adam Kraft, Deep Learning Engineer at Orbital Insight, presents the “Challenges and Approaches for Extracting Meaning from Satellite Imagery” tutorial at the May 2019 Embedded Vision Summit. Orbital Insight is a geospatial big data company leveraging the rapidly growing availability of satellite, UAV and other geospatial data sources to understand and characterize socioeconomic trends at […]

