Meta is finally going to let people try its splashiest AI features for the Meta Ray-Ban smart glasses, though in an early access test to start. Today, Meta announced that it will begin rolling out its multimodal AI features, which can tell you about things Meta's AI assistant can see and hear through the glasses' camera and microphones.
Mark Zuckerberg demonstrated the update in an Instagram reel in which he asked the glasses to suggest pants that would match a shirt he was holding. The assistant responded by describing the shirt and offering a couple of suggestions for pants that might complement it. He also had the glasses' AI assistant translate text and show off a few image captions.
Zuckerberg first revealed these multimodal AI features for the Ray-Ban glasses in a September Decoder interview with The Verge's Alex Heath. Zuckerberg said that people would talk to the Meta AI assistant "throughout the day about different questions you have," suggesting it could answer questions about what wearers are looking at or where they are.
The AI assistant also accurately described a lit-up, California-shaped wall sculpture in a video from CTO Andrew Bosworth. Bosworth explained some of the other features, which include asking the assistant to help caption photos you've taken, as well as requesting translation and summarization, all fairly common AI features also seen in products from Microsoft and Google.
The test period will be limited in the US to "a small number of people who opt in," Bosworth said. Instructions for opting in can be found here.