Meta's Ray-Ban Smart Glasses Get Smarter with Multimodal AI Features
Meta is now letting Ray-Ban Meta Smart Glasses users try out new multimodal AI features on its second-gen wearable smart glasses. The new capabilities include identifying objects seen through the glasses' camera, as well as real-time translation and generative text responses. The roll-out is available first in the US through an opt-in program.
As demoed by Mark Zuckerberg in an Instagram post, you can summon the Meta assistant with a “Hey Meta, look and…” command and then ask the AI your question. For instance, he asked the assistant to suggest which pants to wear with a striped long-sleeved shirt.
The assistant's reply can then be heard through the glasses' speakers, describing the shirt before adding recommendations, which are also presented as audio and visuals in the mobile app.
In other examples, the CEO asked for a caption for an image of a dog wearing a costume and had Spanish text translated in real time by the assistant. The AI can also identify objects, from a piece of fruit to a more complex structure, as shown by Meta's CTO Andrew Bosworth.
According to the company, the AI functions on the Ray-Ban Meta Smart Glasses are based on its multimodal generative AI systems, which combine different types of input to generate outputs. Meta added that it has put safeguards in place to avoid abusive and harmful answers.
If you're in the States, you can opt in through an early access program. However, Bosworth noted that the number of participants will be limited. There is no word yet on when Meta plans to expand the AI assistant to other countries.
Meta also said that it has started rolling out real-time search on the Ray-Ban Meta Smart Glasses 2. The search feature is powered by Bing and supports asking the glasses about the latest scores of your favorite teams and nearby points of interest, among other things.
Do you think these Meta AI features make the Ray-Ban smart glasses a worthwhile investment? We look forward to your answers in the comment section.