Meta's v11 software update for the Ray-Ban smart glasses adds Live AI, which allows Meta AI to continuously see what users see and provide real-time contextual help through voice commands.
Users can ask questions without using the "Hey Meta" wake phrase, and the assistant remembers things they referenced earlier in the conversation. Users can interrupt at any time to change topics or ask follow-up questions. Meta says it will also learn user patterns and eventually "give useful suggestions even before you ask."
Live translation translates speech in real time between English and Spanish, French, or Italian. During a conversation with someone speaking one of those languages, users can hear translations through the glasses' speakers or view transcripts on a connected phone.
Both Live AI and live translation are available to members of Meta's Early Access program, which is limited to the US and Canada. Meta cautions that "these AI features may not always get it right."
Besides these features, Meta also announced an integration with the music recognition service Shazam. Users in the US and Canada can identify a song by saying, "Hey Meta, what is this song?"