Ray-Ban Meta Glasses Get AI Video and Live Translation Upgrades
Meta rolled out an update to the Ray-Ban Meta smart glasses on Monday, adding new artificial intelligence (AI) features. The upgrades were first revealed at Meta’s annual Connect conference in September. Notably, the advanced AI capabilities for the smart glasses were initially available only to members of Meta’s Early Access Program.
New AI Functions in Ray-Ban Smart Glasses
The v11 software update brings the latest artificial intelligence functions, including live AI video and live translation. The upgrade significantly improves Meta AI, the company’s chatbot assistant, letting the Ray-Ban smart glasses process and respond to visual input in real time.
Using natural language processing, the Ray-Ban smart glasses can now translate speech in real time between English and three other languages: French, Italian, and Spanish.

Meta says that when a conversation takes place in one of these three languages, users will have quick access to translation. They can listen to the translated speech in English through the glasses’ open-ear speakers or view it as a transcript on their phone.
In addition, Meta has integrated Shazam, a music-identification tool, into the smart glasses. Users in the United States and Canada will have access to the feature.