Do a quick online search for the Ray-Ban Meta smart glasses right now and you'll find that the wearable is best known for its quick photo capture and livestreaming capabilities.
However, Meta founder and CEO Mark Zuckerberg barely touched on photos and videos during the Ray-Ban Meta smart glasses segment of his demo at the Meta Connect 2024 event on Wednesday.
Instead, Zuckerberg pitched the Ray-Ban Meta smart glasses primarily as artificial intelligence devices.
“Glasses are a new category of artificial intelligence devices,” Zuckerberg said, noting that his company has only just caught up with consumer demand for the Meta smart glasses, as sales have grown faster than he expected.
Aside from a new limited-edition Ray-Ban Meta model with a transparent frame, Meta did not release any new smart glasses hardware.
Credit: Meta
However, Zuckerberg did share several new features that he said will roll out to the Meta smart glasses in a series of updates over the coming months, all of them related to artificial intelligence.
Meta AI is already built into the Ray-Ban Meta smart glasses, much as other companies' voice assistants are built into their devices. The coming update, however, will make those interactions “more natural and conversational,” according to Zuckerberg.
“Hey Meta” instead of “Look and tell me”
For example, right now, when users have a question about what they're seeing, they have to prompt their Ray-Ban Meta smart glasses with the phrase “Look and tell me.” Zuckerberg's demo showed that users will no longer have to do this. They can simply activate the assistant with the “Hey Meta” prompt and then ask a question, and Meta AI will automatically understand that the question relates to what the user sees through the glasses.
Additionally, after the initial “Hey Meta,” users will no longer need to start each follow-up prompt with that phrase; Meta AI will be able to keep the conversation going.
Instant translation on Ray-Ban Meta smart glasses
The feature works much like the translation features on other smart glasses: while conversing with someone, users can hear a real-time audio translation into another language through the glasses. The demo appeared to run almost flawlessly at Meta Connect, translating from Spanish to English and from English to Spanish.
Credit: Meta
Multimodal AI prompts
Zuckerberg illustrated the multimodal video AI capabilities with a demonstration of a user trying on clothing. With this feature, Meta AI can offer fashion suggestions and recommendations based on the user's outfit and their specific concerns.
Ray-Ban Meta smart glasses will also soon be able to remember things for the user automatically. The example shown at Meta Connect involved Meta AI recalling the space number where the user had parked. The user did not need to prompt Meta AI to do this; the assistant appeared to remember the number simply because the user had looked at it through the glasses.
Credit: Meta
Ray-Ban Meta smart glasses are also getting other Meta AI features in a similar vein. Users will soon be able to look at a flyer or advertisement and ask the smart glasses to call a phone number on it or scan its QR code. The glasses can also remember what the user has seen through them in case the user wants to come back to it later.
Other updates to the Ray-Ban Meta smart glasses include voice control for Spotify and Amazon Music on the device, as well as new integrations with apps like Audible and iHeartRadio.
Partnering with Be My Eyes to serve blind and partially sighted users
Credit: Meta
Meta also announced a partnership with Be My Eyes, a mobile app that connects blind and partially sighted people with volunteers over live video who can talk them through what's in front of them. The app will run directly through the Ray-Ban Meta smart glasses, with volunteers able to see the user's point of view through the glasses to provide assistance.