Meta unveils new smart glasses with display and AI abilities
CEO Mark Zuckerberg showed off the device, which can be controlled with small hand movements, at the company’s annual hardware and developer conference.
Meta Platforms announced a new pair of smart glasses with a small built-in display at the company’s annual hardware and developer conference on Wednesday.
The glasses, which can be controlled by small hand movements via a wrist strap called the Neural Band, mark the latest advancement in Meta’s push into hardware. The company, which is betting that smart glasses will be the next big computing device, has seen growing success in the category after a rocky start.
“This isn’t a prototype. This is here, it’s ready to go and you’re going to be able to buy it in a couple of weeks,” Meta Chief Executive Mark Zuckerberg said during the keynote address at the conference on Wednesday evening. The glasses-and-neural-band set retails for $799 and goes on sale Sept. 30.
Zuckerberg opened the keynote wearing the smart glasses, giving a live demonstration of them as he walked onstage. Later in the address, he tried to do a video call with Meta Chief Technology Officer Andrew Bosworth via the glasses. It didn’t work the first time.
The company debuted its first generation of the smart glasses in collaboration with Ray-Ban owner EssilorLuxottica in 2021, to tepid interest. The glasses could take photos and record videos but lacked artificial-intelligence capabilities. In 18 months, Meta sold about 300,000 pairs, and fewer than 10% of the pairs sold were still being actively worn two years later.
The second generation of the shades, which the company previously said was redesigned from the ground up with improved camera and audio capabilities and new AI tools, achieved more success.
EssilorLuxottica Chief Executive Francesco Milleri said on an earnings call earlier this year that the company had sold more than two million pairs of the Meta smart glasses since fall 2023, when the second-generation glasses went on sale, and that it was planning to expand its production capacity for the glasses to 10 million pairs annually by the end of 2026.
“It is no surprise that AI glasses are taking off,” Zuckerberg said. “This feeling of presence is a profound thing, and we’ve lost it a little bit with phones and we have the opportunity to get it back.”
At the Meta Connect conference on Wednesday, the company also unveiled an updated version of its existing Ray-Ban smart glasses and a new line of smart glasses designed for athletes in collaboration with Oakley, a brand that is also owned by EssilorLuxottica. The shades have a built-in AI assistant that can see and hear the user’s surroundings through a camera and microphones. They can take photos and record videos, play music and talk to the wearer.
Zuckerberg announced a new feature for the glasses called “conversation focus” that can amplify the voice of a friend the user is speaking with and drown out background noise.
Companies in Silicon Valley are racing to develop AI-enabled devices that can either replace—or, more likely, complement—the smartphone.
The Meta smart glasses need to be connected to a user’s phone to upload photos and videos. Competitor OpenAI is working on an AI “companion” with former Apple designer Jony Ive. OpenAI CEO Sam Altman and Ive have told employees the device will be fully aware of a user’s surroundings and life, unobtrusive, and able to rest in one’s pocket or on one’s desk—a third core device alongside a laptop and an iPhone.
In May, Google announced a new partnership with Warby Parker and luxury fashion house Kering to develop AI-powered glasses. The first line of products is expected to launch after 2025. Snap plans to launch holographic, augmented-reality glasses to the public next year, and Amazon.com sells glasses equipped with audio-calling and music-listening capabilities.
Write to Meghan Bobrowsky at meghan.bobrowsky@wsj.com
