Sharad Chole, Chief Scientist and Co-founder of Expedera, presents the “Using a Neural Processor for Always-sensing Cameras” tutorial at the May 2023 Embedded Vision Summit.
Always-sensing cameras are becoming a common AI-enabled feature of consumer devices, much like the always-listening Siri and Google assistants. They can enable a more natural and seamless user experience, such as automatically locking and unlocking the device based on whether the owner is looking at the screen or is within the camera's view. But the complexities of cameras, and the quantity and richness of the data they produce, mean that an always-sensing camera requires far more processing than listening for a wake word.
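To make the lock/unlock use case concrete, here is a minimal Python sketch of an attention-gated screen lock. Everything in it is an illustrative assumption rather than anything from the talk: `capture_frame`, `owner_is_looking`, and `set_device_locked` are hypothetical stand-ins for the camera pipeline, an NPU-backed face/gaze model, and the OS lock API.

```python
import time

# Hypothetical stand-ins (assumptions, not a real device API): a production
# system would run a quantized face-detection + gaze model on the NPU.
def capture_frame():
    return None  # placeholder for a low-resolution always-sensing frame

def owner_is_looking(frame) -> bool:
    return False  # placeholder for NPU inference: owner's face, gaze on screen

def set_device_locked(locked: bool) -> None:
    print("locked" if locked else "unlocked")  # placeholder for the OS lock call

def attention_gated_lock(fps: float = 5.0, grace_s: float = 10.0) -> None:
    """Lock when the owner hasn't looked at the screen for grace_s seconds."""
    last_seen = time.monotonic()
    locked = False
    while True:
        frame = capture_frame()
        now = time.monotonic()
        if owner_is_looking(frame):
            last_seen = now
            if locked:
                locked = False
                set_device_locked(False)
        elif not locked and now - last_seen > grace_s:
            locked = True
            set_device_locked(True)
        time.sleep(1.0 / fps)  # always-sensing loops run at a low duty cycle
```

The `grace_s` debounce keeps the lock state from flickering when the owner briefly glances away, one example of the system-level decisions that sit alongside the NPU itself.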
Without careful attention to neural processing unit (NPU) design, an always-sensing camera can wind up consuming excessive power or performing poorly, either of which leads to an unsatisfactory user experience. In this talk, Chole explores the architecture of a neural processor in the image signal path, discusses use cases, and offers tips to help OEMs, chipmakers, and system architects successfully evaluate, specify, and deploy an NPU in an always-sensing camera.
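The power concern lends itself to back-of-the-envelope arithmetic. The sketch below shows the kind of estimate a system architect might run; all of the numbers are illustrative assumptions, not figures from the talk.

```python
# Illustrative power budget for an always-sensing camera (assumed numbers).
ops_per_inference = 2e9   # assumed: ~2 GOPs for a small detection network
frames_per_second = 15    # assumed always-sensing frame rate
npu_efficiency = 2e12     # assumed: 2 TOPS/W sustained, i.e. ops per joule

average_power_w = ops_per_inference * frames_per_second / npu_efficiency
print(f"Average NPU power: {average_power_w * 1e3:.1f} mW")  # -> 15.0 mW
```

At an always-on budget of a few tens of milliwatts, even modest changes in model size, frame rate, or NPU efficiency shift the result materially, which is why evaluating these parameters up front matters.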
See here for a PDF of the slides.