Two Harvard students have created a chilling demonstration of how smart glasses can use facial recognition technology to instantly reveal people’s identities, phone numbers and addresses. The most disturbing part is that the demonstration relies on readily available technology, such as Ray-Ban Meta smart glasses and public databases.
AnhPhu Nguyen, one of the two students, posted a video showing the technology in action, which was later reported by 404 Media. The technology, called I-XRAY, works by leveraging the ability of Meta smart glasses to live stream video to Instagram. A computer program monitors the stream and uses artificial intelligence to identify faces. Those photos are then run through public databases to find names, addresses, phone numbers and even relatives. The information is then fed back through a mobile app.
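The pipeline described above can be sketched in a few lines. This is a purely illustrative outline, not the students’ actual code: every function here is a hypothetical stand-in for a real component (a face detector on the stream, a reverse face search engine, and people-search databases), and the data it returns is fabricated placeholder output.

```python
# Illustrative sketch of an I-XRAY-style pipeline. All function bodies are
# hypothetical stand-ins; real versions would call a face detector, a reverse
# face search engine, and people-search databases.

def detect_faces(frame):
    # Stand-in for running a face detector on one live-stream frame.
    return ["face_crop_1"]

def reverse_face_search(face_crop):
    # Stand-in for a reverse face search that maps a face image to a name.
    return {"name": "Jane Doe"}

def lookup_person(name):
    # Stand-in for querying public people-search databases for details.
    return {"name": name, "address": "(placeholder)", "phone": "(placeholder)"}

def ixray_pipeline(frame):
    # Chain the stages: detect faces, identify each one, then look up details.
    results = []
    for face in detect_faces(frame):
        candidate = reverse_face_search(face)
        results.append(lookup_person(candidate["name"]))
    return results

print(ixray_pipeline("frame_0"))
```

The point of the sketch is structural: each stage is an existing, separately available service, and the novelty lies only in chaining them behind a camera nobody notices.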
In the demonstration, you can see Nguyen and Caine Ardayfio, the other student behind the project, use the glasses to instantly identify several classmates, along with their addresses and the names of relatives. Perhaps even more chillingly, Nguyen and Ardayfio are also shown chatting with complete strangers on public transportation, pretending to know them based on information gleaned from the technology.
Facial recognition technology has been highly accurate for some time, and I-XRAY is essentially a bundle of existing technologies tied together. It partly relies on PimEyes, which The New York Times in 2022 described as an “extremely accurate” face search engine that “anyone can use.” Since news broke of Clearview AI using facial recognition to help law enforcement, concerns about the technology have intensified. The novelty of Nguyen and Ardayfio’s demonstration is how the technology can be combined with discreet and easily accessible consumer electronics.
“This tool was not built for abuse, and we will not release it,” Nguyen and Ardayfio wrote in a document explaining the project. Instead, the students say their goal is to raise awareness that this isn’t a dystopian future — it’s possible with existing technology. In particular, they note that I-XRAY is unique because a large language model (LLM) enables it to work automatically, drawing connections between names and photos pulled from a large number of sources.
Privacy has always been a major concern for smart glasses. Google Glass initially failed in part because of public backlash against being recorded in public without consent. It’s also true, though, that in the decade since, thanks to the rise of smartphones, video bloggers and TikTok, people have become more accustomed to being filmed. The disturbing thing about modern smart glasses, however, is that they aren’t nearly as eye-catching as Google Glass.
The Ray-Ban Meta glasses used in this demonstration look like any other Ray-Ban glasses. While this is critical for the adoption of smart glasses, it also makes it harder to tell when someone has a camera on their face. The Meta glasses do include a privacy light that automatically turns on whenever you record video. However, in our testing we found that the light is hard to notice outdoors in bright conditions, and people often don’t realize when you’re filming, especially in crowded public places.
For its part, Meta places the responsibility on users in the privacy policy for its Ray-Ban glasses. It urges users to “respect people’s preferences” and to clearly gesture or use voice controls when filming videos, live streaming or taking photos. The reality, however, is that people may choose not to adhere to wearable etiquette, no matter what Meta says. We reached out to Meta for comment; the company responded to our email by citing its terms of service, which reiterates the same guidance.
It’s a sobering reminder that smart glasses can be misused, but there are steps people can take to protect themselves. In their documentation, Nguyen and Ardayfio outline reverse face search engines and people search databases that allow you to opt out. Even so, remember that it’s almost impossible to completely erase your online presence — you can only make your information harder to find.
Updated October 2: Added response from Meta.