Keynote

“Multimodal LLMs at the Edge: Are We There Yet?,” An Embedded Vision Summit Expert Panel Discussion

Sally Ward-Foxton, Senior Reporter at EE Times, moderates the “Multimodal LLMs at the Edge: Are We There Yet?” expert panel at the May 2024 Embedded Vision Summit. Other panelists include Adel Ahmadyan, Staff Engineer at Meta Reality Labs, and Jilei Hou, Vice President of Engineering and Head of AI Research at…

“Understand the Multimodal World with Minimal Supervision,” a Keynote Presentation from Yong Jae Lee

Yong Jae Lee, Associate Professor in the Department of Computer Sciences at the University of Wisconsin-Madison and CEO of GivernyAI, presents the “Learning to Understand Our Multimodal World with Minimal Supervision” keynote at the May 2024 Embedded Vision Summit. The field of computer vision is undergoing another profound change. Recently,…

“Scaling Vision-based Edge AI Solutions: From Prototype to Global Deployment,” a Presentation from Network Optix

Maurits Kaptein, Chief Data Scientist at Network Optix and Professor at the University of Eindhoven, presents the “Scaling Vision-based Edge AI Solutions: From Prototype to Global Deployment” presentation at the May 2024 Embedded Vision Summit. The Embedded Vision Summit brings together innovators in silicon, devices, software and applications and empowers…

“What’s Next in On-device Generative AI,” a Presentation from Qualcomm

Jilei Hou, Vice President of Engineering and Head of AI Research at Qualcomm Technologies, presents the “What’s Next in On-device Generative AI” presentation at the May 2024 Embedded Vision Summit. The generative AI era has begun! Large multimodal models are bringing the power of language understanding to machine perception, and transformer models are expanding to…

“Generative AI: How Will It Impact Edge Applications and Machine Perception?,” An Embedded Vision Summit Expert Panel Discussion

Sally Ward-Foxton, Senior Reporter at EE Times, moderates the “Generative AI: How Will It Impact Edge Applications and Machine Perception?” expert panel at the May 2023 Embedded Vision Summit. Other panelists include Greg Kostello, CTO and Co-Founder of Huma.AI, Vivek Pradeep, Partner Research Manager at Microsoft, Steve Teig, CEO of Perceive, and Roland Memisevic, Senior…

“Frontiers in Perceptual AI: First-person Video and Multimodal Perception,” a Keynote Presentation from Kristen Grauman

Kristen Grauman, Professor at the University of Texas at Austin and Research Director at Facebook AI Research, presents the “Frontiers in Perceptual AI: First-person Video and Multimodal Perception” keynote at the May 2023 Embedded Vision Summit. First-person or “egocentric” perception requires understanding the video and multimodal data that streams from wearable cameras and other sensors.

“Accelerating the Era of AI Everywhere,” An Embedded Vision Summit Expert Panel Discussion

Jeff Bier, Founder of the Edge AI and Vision Alliance, moderates the “Accelerating the Era of AI Everywhere” expert panel at the May 2023 Embedded Vision Summit. Other panelists include Dean Kamen, Founder of DEKA Research and Development, Lokwon Kim, CEO of DEEPX, Jason Lavene, Director of Advanced Development Engineering at Keurig Dr Pepper, and…

“Event-Based Neuromorphic Perception and Computation: The Future of Sensing and AI,” a Keynote Presentation from Ryad Benosman

Ryad Benosman, Professor at the University of Pittsburgh and Adjunct Professor at the CMU Robotics Institute, presents the “Event-Based Neuromorphic Perception and Computation: The Future of Sensing and AI” keynote at the May 2022 Embedded Vision Summit. We say that today’s mainstream computer vision technologies enable machines to “see,” much as humans do. We refer…

“Powering the Connected Intelligent Edge and the Future of On-Device AI,” a Presentation from Qualcomm

Ziad Asghar, Vice President of Product Management at Qualcomm, presents the “Powering the Connected Intelligent Edge and the Future of On-Device AI” presentation at the May 2022 Embedded Vision Summit. Qualcomm is leading the realization of the “connected intelligent edge,” where the convergence of wireless connectivity, efficient computing and distributed AI will power the devices…

“How Do We Enable Edge ML Everywhere? Data, Reliability and Silicon Flexibility,” a Presentation from Edge Impulse

Zach Shelby, Co-founder and CEO of Edge Impulse, presents the “How Do We Enable Edge ML Everywhere? Data, Reliability and Silicon Flexibility” presentation at the May 2022 Embedded Vision Summit. In this talk, Shelby reveals insights from the company’s recent global edge ML developer survey, which identified key barriers to machine learning adoption, and shares…

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.

Contact

Address

Berkeley Design Technology, Inc.
PO Box #4446
Walnut Creek, CA 94596

Phone
+1 (925) 954-1411