Meta is finally going to let people try its splashiest AI features for the Meta Ray-Ban smart glasses, though only in an early access test to start. Today, Meta announced that it’s beginning to roll out its multimodal AI features, which let Meta’s AI assistant describe things it can see and hear through the glasses’ camera and microphones.
Mark Zuckerberg demonstrated the update in an Instagram reel where he asked the glasses to suggest pants that would match a shirt he was holding.
It responded by describing the shirt and offering a couple of suggestions for pants that might complement it. He also had the glasses’ AI assistant translate text and show off a couple of image captions.
Zuckerberg first revealed the multimodal AI features for the Ray-Ban glasses in a September Decoder interview with The Verge’s Alex Heath. Zuckerberg said that people would talk to the Meta AI assistant “throughout the day about different questions you have,” suggesting that it could answer questions about what wearers are looking at or where they are.
The AI assistant also accurately described a lit-up, California-shaped wall sculpture in a video from CTO Andrew Bosworth. He explained some of the other features, which include asking the assistant to help caption photos you’ve taken or asking it for translation and summarization — all fairly common AI features seen in other products from Microsoft and Google.
The test period will be limited in the US to “a small number of people who opt in,” Bosworth said. Instructions for opting in can be found here.