Meta has rolled out an update for its Ray-Ban smart glasses that expands their functionality with several new features. The release adds real-time translation, Shazam music recognition, and an improved AI assistant, making the glasses even more useful in everyday life.
Enhanced User Interaction with AI
A major highlight of the update is the improved “Live AI” function, which deepens interaction with the intelligent assistant. Using the glasses’ built-in cameras, the AI can “see” what the user sees and answer questions based on that visual context. For example, if you’re exploring an unfamiliar city, you can look at a landmark, ask the glasses about it, and receive information in real time.
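To make that flow concrete, here is a minimal, purely illustrative Python sketch of how a visual-context query could be wired together: pair the latest camera frame with the spoken question and hand both to the assistant. None of these names (capture_frame, answer_with_visual_context, Frame) come from a real Meta SDK; they are hypothetical stand-ins for the behavior the article describes.

```python
# Hypothetical sketch of a visual-context query, NOT a real Meta API.
# It only models the described flow: the assistant answers a spoken
# question using the camera frame the user is currently looking at.

from dataclasses import dataclass


@dataclass
class Frame:
    """Stand-in for a single image captured by the glasses' camera."""
    pixels: bytes


@dataclass
class AssistantReply:
    text: str


def capture_frame() -> Frame:
    # Placeholder: a real device would return the current camera view.
    return Frame(pixels=b"\x00" * 16)


def answer_with_visual_context(question: str, frame: Frame) -> AssistantReply:
    # Placeholder: a real assistant would run a multimodal model over
    # (question, frame). Here we just echo what would be combined.
    return AssistantReply(
        text=f"Answering {question!r} using a {len(frame.pixels)}-byte frame."
    )


if __name__ == "__main__":
    frame = capture_frame()  # what the user is looking at right now
    reply = answer_with_visual_context("What landmark is this?", frame)
    print(reply.text)
```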
Real-Time Translation and Shazam Integration
Another key feature is a real-time translation tool, notes NIX Solutions. It supports four languages: English, Spanish, French, and Italian. When you speak with someone in another language, the glasses translate the conversation on the fly; the translation can either be voiced through the built-in speakers or displayed as text on a smartphone synced with the glasses. Users must manually select the languages they want to translate between.
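The sketch below models that flow under the article’s stated constraints: a manually selected language pair drawn from the four supported languages, and output routed either to the speakers or to the paired phone. The language codes and output routes reflect the article; every function and class name here is an illustrative assumption, not Meta’s implementation.

```python
# Hypothetical sketch of the described translation flow, NOT a real Meta API.
# The four supported languages and the two output routes come from the
# article; the rest is an illustrative stand-in.

from enum import Enum

SUPPORTED_LANGUAGES = {"en", "es", "fr", "it"}  # English, Spanish, French, Italian


class Output(Enum):
    SPEAKERS = "speakers"   # voiced through the glasses' built-in speakers
    PHONE_SCREEN = "phone"  # shown as text on the synced smartphone


def translate(text: str, source: str, target: str) -> str:
    # Placeholder: a real implementation would call a translation model.
    return f"[{source}->{target}] {text}"


def live_translate(text: str, source: str, target: str, output: Output) -> str:
    # The user picks the language pair manually, so validate it up front.
    if source not in SUPPORTED_LANGUAGES or target not in SUPPORTED_LANGUAGES:
        raise ValueError("Both languages must be one of: en, es, fr, it")
    translated = translate(text, source, target)
    if output is Output.SPEAKERS:
        return f"(spoken) {translated}"
    return f"(on phone screen) {translated}"


if __name__ == "__main__":
    print(live_translate("Bonjour, où est le musée ?", "fr", "en", Output.SPEAKERS))
    print(live_translate("Hola", "es", "en", Output.PHONE_SCREEN))
```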
The glasses now also support the Shazam music recognition service. Users can activate it with the voice command “Hey Meta, what song is this?” The feature is already available to users in the US and Canada, with a rollout to other regions planned through firmware updates.
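As a rough illustration of how such a command might be gated, here is a small Python sketch that matches the trigger phrase and checks region availability before “listening.” The phrase and the US/Canada availability come from the article; the matching logic, region codes, and function names are assumptions for illustration only.

```python
# Hypothetical sketch of voice-command routing with region gating,
# NOT Meta's implementation. The trigger phrase and the US/Canada
# availability come from the article; everything else is illustrative.

SHAZAM_REGIONS = {"US", "CA"}  # regions where the feature is live today
TRIGGER = "hey meta, what song is this?"


def handle_command(utterance: str, region: str) -> str:
    # Naive exact match after normalization; a real device would use
    # on-device speech recognition rather than string comparison.
    if utterance.strip().lower() != TRIGGER:
        return "Command not recognized."
    if region not in SHAZAM_REGIONS:
        return "Song recognition is not yet available in your region."
    # Placeholder: a real device would record a short audio clip here
    # and hand it to the Shazam service for matching.
    return "Listening... identifying the song via Shazam."


if __name__ == "__main__":
    print(handle_command("Hey Meta, what song is this?", region="US"))
    print(handle_command("Hey Meta, what song is this?", region="DE"))
```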
What’s Next?
These updates make the Ray-Ban smart glasses noticeably more versatile for everyday use. Meta is already working on additional features, and we’ll keep you updated as more integrations become available.