Robotics

Robot Senses: Robots That Can See, Hear, Feel, and More

This market research report was originally published at Omdia | Tractica’s website. It is reprinted here with the permission of Omdia | Tractica. A robot with no way to sense its position or environment is simply an automaton that performs movements blindly. That is changing due to a trend toward adding vision, torque, and other […]

Sensors Are At the Heart Of the Robotic Mobility Disruption

OUTLINES: A new generation of robotic vehicles is bringing MaaS to the masses. High-end sensor technology and raw computing power are at the center of this ongoing market disruption. Sensors for robotic vehicles will become industries of their own: 51% CAGR expected for the next 15 years. “Disruption is coming to our streets and
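
To put that headline growth figure in perspective, a 51% CAGR sustained for 15 years implies roughly a 480x increase. The short sketch below simply works through that compound-growth arithmetic; the starting market size is a placeholder.

```python
# Compound-growth check on the report's headline figure: 51% CAGR over 15 years.
# The starting market size is a placeholder; only the growth multiple is meaningful.
cagr = 0.51
years = 15
growth_multiple = (1 + cagr) ** years
print(f"Implied growth multiple after {years} years: {growth_multiple:.0f}x")  # ~484x
```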

A Faster, Simpler Path to AI-Enabled Robotics: The UP Squared RoboMaker Developer Kit, Powered by Intel and AWS

This blog post was originally published at Intel’s website. It is reprinted here with the permission of Intel. The pace of innovation is quickening across all industries, including manufacturing and the intelligent factories of Industry 4.0. Unlocking the value of data via machine learning is propelling us into this next phase, and innovations in industrial

Robotic Components: The Building Blocks of Robotic Life

This market research report was originally published at Tractica’s website. It is reprinted here with the permission of Tractica. Much is written about robot systems, but what about the components that comprise these machines? Key technologies include robot arms, grippers, actuators, sensors, vision systems, power systems, and controllers. The robotics market is experiencing a

“Fundamentals of Monocular SLAM,” a Presentation from Cadence

Shrinivas Gadkari, Design Engineering Director at Cadence, presents the “Fundamentals of Monocular SLAM” tutorial at the May 2019 Embedded Vision Summit. Simultaneous Localization and Mapping (SLAM) refers to a class of algorithms that enables a device with one or more cameras and/or other sensors to create an accurate map of its surroundings, to determine the
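
The geometric core of monocular SLAM is estimating camera motion between two views. The sketch below, assuming OpenCV, a calibrated camera matrix K, and two consecutive grayscale frames, illustrates that step: match features, estimate the essential matrix, and recover the relative rotation and translation (up to scale, the inherent ambiguity of a single camera). It is an illustrative fragment, not the pipeline from the presentation.

```python
# Minimal two-view relative pose estimation with OpenCV -- the geometric
# core of monocular visual SLAM. Assumes a calibrated camera matrix K and
# two consecutive grayscale frames (numpy arrays).
import cv2
import numpy as np

def relative_pose(frame1, frame2, K):
    # Detect and describe ORB features in both frames.
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)

    # Match binary descriptors with Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Estimate the essential matrix robustly, then decompose it into a
    # rotation R and unit translation t (scale is unobservable monocularly).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t
```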

“Sensory Fusion for Scalable Indoor Navigation,” a Presentation from Brain Corp

Oleg Sinyavskiy, Director of Research and Development at Brain Corp, presents the “Sensory Fusion for Scalable Indoor Navigation” tutorial at the May 2019 Embedded Vision Summit. Indoor autonomous navigation requires using a variety of sensors in different modalities. Merging together RGB, depth, lidar and odometry data streams to achieve autonomous operation requires a fusion of
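
As a generic illustration of that kind of fusion (a textbook sketch, not Brain Corp's production pipeline), the snippet below combines a wheel-odometry pose prediction with a lidar scan-match pose measurement using a simple Kalman-style update on a planar [x, y, heading] state.

```python
# Toy fusion of wheel odometry (prediction) with a lidar scan-match pose
# (correction) for a planar robot. Heading wrap-around is ignored for brevity.
import numpy as np

class PoseFuser:
    def __init__(self):
        self.x = np.zeros(3)       # state: [x, y, heading]
        self.P = np.eye(3) * 0.1   # state covariance

    def predict(self, odom_delta, odom_cov):
        # Dead-reckon forward with the odometry increment; uncertainty grows.
        self.x += odom_delta
        self.P += odom_cov

    def correct(self, lidar_pose, lidar_cov):
        # Kalman update, treating the scan-matched pose as a direct
        # observation of the full state (H = I).
        K = self.P @ np.linalg.inv(self.P + lidar_cov)
        self.x += K @ (lidar_pose - self.x)
        self.P = (np.eye(3) - K) @ self.P

fuser = PoseFuser()
fuser.predict(np.array([0.10, 0.0, 0.01]), np.eye(3) * 0.02)   # odometry step
fuser.correct(np.array([0.12, 0.01, 0.0]), np.eye(3) * 0.01)   # lidar fix
print(fuser.x)
```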

“Applied Depth Sensing with Intel RealSense,” a Presentation from Intel

Sergey Dorodnicov, Software Architect at Intel, presents the “Applied Depth Sensing with Intel RealSense” tutorial at the May 2019 Embedded Vision Summit. As robust depth cameras become more affordable, many new products will benefit from true 3D vision. This presentation highlights the benefits of depth sensing for tasks such as autonomous navigation, collision avoidance and
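
For readers who want to experiment, the snippet below is a minimal depth-streaming loop using the public pyrealsense2 Python bindings. It simply reports the range at the image center, the raw ingredient that collision-avoidance checks build on; it is a sketch, not code from the presentation.

```python
# Minimal Intel RealSense depth readout via the pyrealsense2 bindings.
# Prints the range at the center pixel; error handling omitted for brevity.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    for _ in range(30):                      # a few frames for illustration
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        dist = depth.get_distance(320, 240)  # distance in meters at center
        print(f"Center range: {dist:.2f} m")
finally:
    pipeline.stop()
```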

“Visual AI Enables Autonomous Security,” an Interview with Knightscope

William “Bill” Santana Li, Co-founder, Chairman and CEO of Knightscope, talks with Vin Ratford, Executive Director of the Embedded Vision Alliance, for the “Visual AI Enables Autonomous Security” interview at the May 2019 Embedded Vision Summit. Knightscope, a physical security technologies company based in Silicon Valley, develops and sells a line of autonomous robots that

“Selecting and Exploiting Sensors for Sensor Fusion in Consumer Robots,” a Presentation from Daniel Casner

Daniel Casner, formerly a systems engineer at Anki, presents the “Selecting and Exploiting Sensors for Sensor Fusion in Consumer Robots” tutorial at the May 2019 Embedded Vision Summit. How do you design robots that are aware of their unstructured environments at a consumer price point? Excellent sensing is required, but using low-cost sensors is
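
A classic example of getting solid estimates out of cheap parts (a generic textbook technique, not Anki's design) is the complementary filter below, which fuses a drifting-but-smooth gyro rate with a noisy-but-absolute accelerometer tilt to track pitch on a low-cost IMU.

```python
# Complementary filter: fuse a low-cost gyro (smooth but drifty) with an
# accelerometer (noisy but gravity-referenced) to estimate pitch.
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    # Integrate the gyro for short-term accuracy...
    pitch_gyro = pitch + gyro_rate * dt
    # ...and anchor to the accelerometer's gravity vector for the long term.
    pitch_accel = math.atan2(accel_x, accel_z)
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

# Example update at 100 Hz with hypothetical sensor readings.
pitch = 0.0
pitch = complementary_filter(pitch, gyro_rate=0.02, accel_x=0.1,
                             accel_z=9.7, dt=0.01)
print(f"Estimated pitch: {pitch:.4f} rad")
```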

May 2019 Embedded Vision Summit Slides

The Embedded Vision Summit was held on May 20-23, 2019 in Santa Clara, California, as an educational forum for product creators interested in incorporating visual intelligence into electronic systems and software. The presentations delivered at the Summit are listed below. All of the slides from these presentations are included in…

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
