Snapdragon Powers the Future of AI in Smart Glasses. Here’s How

This blog post was originally published at Qualcomm’s website. It is reprinted here with the permission of Qualcomm.

A Snapdragon Insider chats with Qualcomm Technologies’ Said Bakadir about the future of smart glasses and Qualcomm Technologies’ role in turning them into a critical AI tool

Artificial intelligence (AI) is increasingly winding its way through our lives, and smart glasses will be the key to making the most of those changes. Smart glasses packed with cameras and microphones can become the eyes and ears of AI, turning them into a critical tool as digital assistants get smarter and more responsive.

But how quickly will smart glasses get, well, smart enough to handle generative AI that’s capable of creating its own original content? And what exactly will we be able to do with them?

As a member of the Snapdragon Insider program, I was fortunate enough to talk with Said Bakadir, senior director of product management for Qualcomm Technologies’ XR business. Throughout our nearly hour-long chat, Bakadir went through his vision of where smart glasses are headed, and how Qualcomm innovations are fueling their path to becoming mainstream devices.

The following is an edited transcript of our conversation.

Gaylon Haire (GH): Let’s hop right into it. I see you’re wearing the Ray-Ban Meta smart glasses right now. What’s your favorite thing about them?

Said Bakadir (SB): I use them as my everyday glasses. I walk around with them, listen to music and take calls. What I love the most about them are their AI features and their potential. And this is just the start.

GH: What do you think the future holds for devices like these, sleek, everyday wearable glasses powered by a Snapdragon processor? And what do you want to see in the long run?

SB: The North Star is glasses that you wear all day and that transition seamlessly through various depths of immersion, from contextual information popping into your field of view without disrupting what you’re doing, to full immersion. And while we’ve made a lot of progress packing so much cool technology into a slim device like the Ray-Ban Meta glasses, from a better camera to the ability to talk with a digital assistant, this is really the first step for smart glasses. The future will see glasses packing displays, for example.

You saw it with the first Android phone, which packed a 3.2-inch screen. Those screens got bigger and better. The phones increasingly packed more performance, power and intelligence. Smart glasses will evolve the same way.

GH: What do you think is more important with these wearable devices? Do you think the form factor is more important? Or the number of features that they’re able to have with AI?

SB: I would say it’s both. If you have a very good form factor, and there is no functionality, why would people use it?

Designing a form factor that is light and sleek comes with tradeoffs, so we need to be smart about how to deliver that value and find the right balance between form and function. In the near future, there won’t be one form factor or one balanced approach that fits everything.

For example, there will be situations, let’s say in an enterprise setting, where I’d rather have a pair of powerful glasses that delivers a lot of value and solves problems but only lasts two hours. If I’m a store manager wearing glasses, looking around, and I’ve got another set of eyes that are smarter than mine spotting problems during an inspection, that value trumps form factor.

GH: Right now, we’re in the infancy of AI-powered smart glasses. These are some of the first major mainstream ones. What do you think is the most underrated AI feature in these devices?

SB: Smart glasses are far more valuable thanks to the emergence of generative AI. Generative AI has already improved our lives, letting you do everything from creating an itinerary for an upcoming family trip to Japan to whipping up a personalized workout routine.

Generative AI is an engine to help me be smarter, help me be better.

Today, there are some challenges in interacting with generative AI: You need to take out your phone, open an app, and start typing or take a photo. Take counting calories as an example. Using generative AI-enabled glasses is seamless; I don’t need to pull out my phone and take a photo of each of my meals. Smart glasses are disruptive because they are the ears and eyes of generative AI, unlocking its full potential.

It’s going to open up a wide range of use cases that we didn’t think about before. It combines my smartwatch and my earbuds, and it gives me an interface to the phone. I don’t need to pull my phone out of my pocket. So our leading technology in AI will be critical for these glasses.

GH: What is being done to help lower the cost of these devices?

SB: Today, the Ray-Ban Meta glasses are priced at $299 (while Qualcomm Technologies is a key supplier for the glasses, the retail price is determined by Ray-Ban). That is in the ballpark of what you would pay for traditional glasses. The way we frame this question is: how can our Snapdragon processor deliver the most value so manufacturers can build a device that delivers on users’ expectations?

We want to make sure that these glasses are useful for people. With a quality camera and audio, I can do visual search; the camera becomes the eyes of generative AI. If the product does its job properly, people will consider choosing a smart pair of frames for their next pair of glasses.

GH: What type of technical challenges do you experience when trying to power these AI glasses? What do you think are the biggest speed bumps?

SB: Power and thermal. In glasses, you don’t have much space, so the device heats up and you feel it on your head.

Power and thermal are always the No. 1 line in every discussion. If you want to add a better camera or go to a higher resolution, what’s your budget in terms of how many hours you want to run, and how big is your battery?

This is the conundrum: finding the balance between power, form factor and functionality. It will always come down to these three things.
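To make that power-versus-runtime tradeoff concrete, here is a minimal back-of-the-envelope sketch. The battery capacity and per-subsystem power figures are illustrative assumptions, not specifications of the Ray-Ban Meta glasses or any Snapdragon processor.

```python
# Back-of-the-envelope runtime estimate for a pair of smart glasses.
# All figures are illustrative assumptions, not product specifications.

BATTERY_WH = 0.6  # e.g., a ~160 mAh cell at 3.7 V holds roughly 0.6 Wh

# Assumed average power draw per subsystem, in watts
power_draw_w = {
    "always_on_baseline": 0.05,  # sensors, standby connectivity
    "audio_and_calls":    0.10,
    "camera_capture":     0.30,  # a better camera or higher resolution raises this
    "on_device_ai":       0.15,  # running models locally costs power but saves round trips
}

def runtime_hours(active_subsystems):
    """Estimated hours of use given which subsystems are active on average."""
    total_w = sum(power_draw_w[name] for name in active_subsystems)
    return BATTERY_WH / total_w

print(f"Audio only:        {runtime_hours(['always_on_baseline', 'audio_and_calls']):.1f} h")
print(f"Camera + AI heavy: {runtime_hours(power_draw_w.keys()):.1f} h")
```

Even with rough numbers, the point stands: every added feature comes straight out of the hours-of-use budget.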

GH: What type of innovations has Qualcomm created to combat these types of issues?

SB: As of today, we are focused on building processors suited to power this new category of devices, and we’re building new chips from the ground up, drawing on decades of investment in mobile, compute, AI and connectivity. There’s a lot of innovation, from the silicon itself to the architecture and how you move data within the chip.

We also work on making our software tools more efficient as AI grows, and on making them available to developers.

We believe that hybrid, or distributed, compute is at the heart of the future of smart glasses experiences. It isn’t one device that does everything. It’s what I can do on my device versus on the devices around me versus in the cloud. The good thing is that Qualcomm Technologies, and specifically Snapdragon processors, are all over this. A lot of innovation is going into how these devices work together and build a cohesive ecosystem that delivers the best experience to users.
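As a rough illustration of that hybrid-compute idea, the sketch below routes an AI task to the glasses, a paired phone, or the cloud based on its size and latency needs. The thresholds, field names and device labels are hypothetical; this is not how Snapdragon or any specific product implements it.

```python
# Minimal sketch of a hybrid/distributed-compute routing decision.
# Thresholds, task fields and device names are hypothetical illustrations,
# not an actual Snapdragon or product API.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    model_size_mb: int       # memory footprint of the model
    latency_budget_ms: int   # how quickly the user needs a response
    needs_fresh_web_data: bool

def route(task: Task) -> str:
    """Decide where to run a task: on the glasses, on a paired phone, or in the cloud."""
    if task.needs_fresh_web_data:
        return "cloud"                                   # e.g., web-grounded answers
    if task.model_size_mb <= 50 and task.latency_budget_ms <= 100:
        return "glasses"                                 # small, latency-critical models stay on-device
    if task.model_size_mb <= 2000:
        return "paired_phone"                            # larger models offload to the phone nearby
    return "cloud"                                       # everything else goes to the cloud

if __name__ == "__main__":
    print(route(Task("wake-word detection", 5, 20, False)))            # -> glasses
    print(route(Task("photo captioning", 800, 1500, False)))           # -> paired_phone
    print(route(Task("trip-itinerary generation", 4000, 5000, True)))  # -> cloud
```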

GH: That’s awesome. The biggest takeaway for me is that a lot of this will be going on in the background, and you’re not even thinking about it. That’s a big deal, because your glasses are doing it for you. It’s basically like having a second brain.

SB: Think about AI glasses as your personal, intuitive AI assistant. That’s where this is going. This is where the value is. This is what excites me. I’ve been in this industry almost 10 years, and I’m very happy to see the progress we’re making.

Gaylon Haire
Snapdragon Insider
