November 23, 2024
Meta AI on Ray-Ban Smart Glasses Can Now Understand What You’re Seeing
Meta is now allowing select customers to try out new AI-powered experiences on its Ray-Ban smart glasses as part of an early access programme. The Facebook parent has announced initial user tests for the smart glasses and intends to gather feedback on new features ahead of a wider release. Meta is also introducing updates to improve the Ray-Ban smart glasses experience, which is powered by the Meta AI assistant, bringing smarter and more helpful responses.

Earlier this month, Meta announced a host of new features for its AI services across platforms. In an update to the same blog post on Tuesday, the company introduced a few new features for the Ray-Ban Meta smart glasses. Those who sign up for early access can try out multimodal AI-powered capabilities, which let the smart glasses see what the wearer is looking at and answer related queries.

According to Meta, its AI assistant on the glasses can take a picture of what you're seeing, either via a voice command or the dedicated capture button, and can also come up with a witty caption for the photo. Users could pick up an object while wearing the Ray-Ban Meta smart glasses and ask for information about it, or look at a sign in a different language and ask the AI-powered glasses to translate it into English. The company, however, has warned users that its multimodal AI might make mistakes and will be improved over time with the help of feedback.

Meta CEO Mark Zuckerberg demonstrated the look and ask feature on the AI smart glasses in an Instagram post. In the video, recorded from a first-person perspective through the glasses, Zuckerberg picks out a striped, dark shirt and asks Meta AI to suggest pants to go with it.

Additionally, Meta says it is rolling out Bing-powered real-time information capabilities on its Meta AI-powered smart glasses. “You can ask Meta AI about sports scores or information on local landmarks, restaurants, stocks and more,” the company said in its update.

The look and ask with Meta AI feature on the glasses takes a picture when prompted to “look” and delivers an audio response to the related query. Do note that all pictures taken and processed by the AI are stored and used to train Meta AI and other Meta products, which could raise privacy concerns. Meta says that the information collected, used and retained complies with Meta’s Privacy Policy.
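For illustration only, here is a minimal sketch of how a look-and-ask style pipeline could be wired together, based on the flow described above: a wake phrase triggers an image capture, the image and the transcribed question go to a multimodal model, and the answer is played back as audio. This is not Meta's implementation or API; every function below is a hypothetical placeholder.

```python
# Hypothetical sketch of a "look and ask" flow -- not Meta's code.
from dataclasses import dataclass


@dataclass
class Query:
    image: bytes   # frame captured by the glasses' camera
    question: str  # transcribed user query, e.g. "translate this sign"


def capture_image() -> bytes:
    """Placeholder for the glasses' camera capture."""
    return b"<jpeg bytes>"


def multimodal_answer(query: Query) -> str:
    """Placeholder for a vision-language model that answers a question
    about an image (the multimodal backend described in the article)."""
    return "That sign says 'Exit'."


def speak(text: str) -> None:
    """Placeholder for text-to-speech played through the glasses' speakers."""
    print(f"[audio] {text}")


def handle_voice_command(transcript: str) -> None:
    # Per the article, a "look" prompt triggers a photo and an audio reply.
    if transcript.lower().startswith("hey meta, look"):
        query = Query(image=capture_image(), question=transcript)
        speak(multimodal_answer(query))


handle_voice_command("Hey Meta, look and tell me what this sign says")
```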

The early access programme is now live for Ray-Ban Meta smart glasses owners in the US, and interested users can enroll via the Meta View app on iOS and Android. To sign up, tap the settings button in the bottom right of the Meta View app, swipe down and tap Early Access. You'll also need to make sure that the smart glasses and the Meta View app are updated to the latest versions.

Ray-Ban Meta smart glasses were launched in September 2023, alongside the Meta Quest 3 and other Meta products. The glasses are powered by the Qualcomm Snapdragon AR1 Gen 1 Platform SoC and come with a 12-megapixel camera, an LED light, and 32GB of inbuilt storage.

Ray-Ban Meta smart glasses with standard lenses are priced at $299 (roughly Rs. 24,999), while the pairs with polarised lenses and transition lenses cost $329 (roughly Rs. 27,400) and $379 (roughly Rs. 31,500), respectively. The glasses are available to buy in 15 countries, including the US, Canada, Australia, and European markets. Meta has not announced a launch date for the Indian market yet.
