Meta to Launch AI-Powered Smart Glasses at Meta Connect Conference

Meta CEO Mark Zuckerberg tests the Orion prototype augmented reality glasses during Meta Connect in Menlo Park, California, on September 25, 2024. Manuel Orbegozo/Reuters

As reported by CNN.

In July, Meta CEO Mark Zuckerberg voiced a bold opinion: people without smart glasses may be at a significant cognitive disadvantage compared with those who use them. This week, at the Meta Connect conference, the company is expected to make announcements that could shape the course of its future technologies.

According to reports, Meta intends to unveil a new pair of AI-powered glasses, continuing the Ray-Ban line already known for understanding the wearer's surroundings and answering questions about them. Such a move would be a logical extension of the metaverse strategy, but with a focus on everyday user experience.

Smart glasses remain one of the few bright spots in Meta's hardware strategy, which has endured several missteps: the company's early foray into phones flopped, and its 2021 rebrand around the metaverse has not fully lived up to expectations.

However, the Ray-Ban glasses have started to deliver revenue: in July, EssilorLuxottica said revenue from Meta's smart glasses had more than tripled from the previous year, and market research points to Meta's leadership in the category. Mass adoption of smart glasses may not come quickly, but they give Meta a way to reach consumers directly and reduce its dependence on mobile devices. They are also an important step toward 'personal superintelligence' – a concept Zuckerberg has described as an AI that deeply understands us, supports our goals, and helps us achieve them.

Challenges and Opportunities for the Future

Industry pressure is mounting: Samsung, Google and Snap – longtime Meta rivals – are preparing to release their own smart glasses, and Amazon is also exploring augmented reality glasses, according to The Information. Meta's first Ray-Ban Stories, launched in 2021, focused on hands-free photo and video capture, though they were not the first attempt at glasses for calls and music. The category must still overcome the familiar barriers of design, cost, and functionality.

Google Glass, introduced in 2013, was one of the first examples of this form factor, but it never achieved broad popularity; price, ergonomics, and limited usefulness proved decisive. Industry leaders now expect the next wave of smart glasses to be quite different, with more advanced displays, better battery life, and more affordable prices.

The main driving force is artificial intelligence. Integrated assistants, contextual responses, and adaptive services make the glasses more practical, from recognizing the surrounding environment to translating text on the spot.

According to Bloomberg, Meta is developing smart glasses with a display for viewing apps and notifications, along with a wristband for gesture control. The glasses are expected to be unveiled at the Connect conference, which will spotlight 'the latest innovations in AR glasses' among other announcements. Meta has previously demonstrated its Orion augmented reality prototype.

The company has not commented on possible announcements at the event. Today's Ray-Ban Meta glasses have no display in the lenses, so users rely on audio interaction or the companion mobile app – a potential weak point compared with models that have their own displays.

This will be another challenge. If you want to replace your smartphone, will you truly be able to do so without any visual feedback?

– Guillaume Chansin, analyst at Counterpoint Research

Meta’s Reality Labs division again reported heavy losses in the most recent quarter, but the company remains optimistic: smart glasses could change how Meta distributes its services, reducing reliance on third-party hardware and giving users a more integrated experience.

In the long term, this means greater control over user interactions and a potential shift in Meta’s business model – from building apps for third-party devices to offering a full end-to-end solution through its own glasses.

Mark Zuckerberg has long spoken about the role of glasses in the company’s future, and Meta now aims to turn those words into a practical path toward reducing reliance on smartphones and building a more integrated ecosystem where intelligent capabilities are available right in front of your eyes.