Two college students walk into a Boston subway station in October 2025. One is wearing a pair of Ray-Ban Meta smart glasses — the kind sold at any mall, priced at $299, advertised as a hands-free camera for capturing life moments.
The glasses livestream to an AI pipeline on a laptop in a backpack. A stranger boards the train. Within seconds, the laptop returns a name, a home neighborhood, a phone number, an employer — all pieced together from publicly available data.
The stranger never knew they'd been identified.
The students — Harvard undergraduates AnhPhu Nguyen and Jason Luo — called their project "I-XRAY." They had not built anything novel. They had simply connected three existing, freely available products: Meta's $299 Ray-Ban smart glasses, a commercial facial recognition search engine called PimEyes, and a large language model to stitch the results into a dossier. Total cost to replicate: under $300 and a weekend.
Nguyen and Luo did not release their code. They published their findings as a warning.
Their warning has largely gone unheeded.
What the Glasses Actually Do
Ray-Ban Meta smart glasses — a collaboration between Meta Platforms and EssilorLuxottica, released in 2023 and updated in 2024 — are, on the surface, ordinary-looking eyewear. They include a 12-megapixel camera in the frame, microphones, speakers, and a connection to Meta's AI assistant. A small LED light is supposed to indicate when the camera is recording, though the LED can be obscured or dimmed.
The glasses sold over a million units in 2024, according to Meta CEO Mark Zuckerberg. They're available in multiple styles, including Ray-Ban's classic Wayfarer and Headliner frames — designed to look indistinguishable from ordinary glasses.
By themselves, they are simply a wearable camera. The privacy concern emerges at the pipeline level.
The Pipeline: Three Steps to a Dossier
The I-XRAY methodology Nguyen and Luo demonstrated works like this:
- Capture: The Ray-Ban Meta glasses stream or record a face via the outward-facing camera in the frame. The livestream feature, added in a 2024 update, allows real-time video to be sent to a connected device.
- Search: A screenshot of the face is uploaded to PimEyes or FaceCheck.ID — commercial facial recognition services that scan public photos indexed from social media, news sites, company websites, and other open sources. Both services are legal and publicly accessible, costing as little as $30/month for subscription access. PimEyes returns links to images where that face appears online.
- Aggregate: A large language model (the researchers used GPT-4o) then takes the image links and associated metadata, cross-references them with voter registration databases, LinkedIn, white-pages data brokers, and other public records, and synthesizes a profile: full name, employer, address, phone number, and social connections.
The entire process takes under 60 seconds.
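The three stages compose as a simple chain: capture feeds search, search feeds aggregation. The sketch below shows that structure only. Every function is a hypothetical placeholder with simulated return values — neither the glasses nor PimEyes exposes a public API like this, and no real service is called.

```python
# Structural sketch of a three-step identification pipeline of the kind
# the I-XRAY demonstration described. All functions are hypothetical
# placeholders; the data is simulated.
from dataclasses import dataclass, field

@dataclass
class Dossier:
    name: str = "unknown"
    employer: str = "unknown"
    sources: list = field(default_factory=list)

def capture_frame(stream: dict) -> str:
    """Step 1 -- Capture: pull one face image from a livestream (simulated)."""
    return stream["latest_frame"]

def reverse_face_search(face_image: str) -> list:
    """Step 2 -- Search: a reverse face-image lookup (simulated).
    A real service of this kind returns URLs of pages where the face appears."""
    return ["https://example.com/team-page", "https://example.com/news-photo"]

def aggregate_profile(urls: list) -> Dossier:
    """Step 3 -- Aggregate: cross-reference the hits with public records
    and synthesize a profile (simulated with a trivial rule)."""
    dossier = Dossier(sources=urls)
    if any("team-page" in u for u in urls):  # e.g. a company staff page
        dossier.name, dossier.employer = "Jane Doe", "Example Corp"
    return dossier

# The stages compose exactly as described above:
stream = {"latest_frame": "<jpeg bytes>"}
dossier = aggregate_profile(reverse_face_search(capture_frame(stream)))
print(dossier.name, dossier.employer)  # prints "Jane Doe Example Corp"
```

The point of the sketch is how little glue is required: each stage is an off-the-shelf capability, and the "pipeline" is just function composition.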
In tests documented by the researchers, they identified more than 30 strangers — in subways, on the street, in cafes — with a success rate they described as "alarmingly high."
Why This Is Different from a Phone Camera
Smartphones already have cameras. Facial recognition already exists. So why does this matter?
The answer is in the friction.
Pointing a phone at someone's face to photograph them is visible, awkward, and socially noticeable. Smart glasses are not. They look like eyewear. There is no phone to raise, no shutter sound, no behavioral tell. The wearer can make eye contact with a stranger, hold a natural conversation, and be running a background identification scan simultaneously.
This eliminates the social friction that has historically acted as a soft deterrent to surveillance. It also fundamentally changes what "public space" means. Walking through a city has always meant trading some privacy — you can be seen. It has not historically meant trading your identity, home address, and employment history to anyone who glances at you.
The ACLU has described this as a qualitative shift: "The difference between being seen and being identified is the difference between public presence and public profiling."
The Legal Vacuum
There is no federal law in the United States that prohibits real-time facial recognition of individuals in public spaces. None.
The existing legal framework around facial recognition is a patchwork of state-level biometrics laws — most notably Illinois' Biometric Information Privacy Act (BIPA), the strongest in the country — and regulations that target specific sectors (law enforcement use, federal agencies) rather than private individuals.
BIPA, passed in 2008, requires companies to obtain informed written consent before collecting biometric data including face geometry. But it regulates companies, not individuals. A private person using PimEyes and Ray-Ban glasses to identify strangers is not a company collecting and storing biometric data in the BIPA sense — they're a person looking at publicly available information.
The FTC has brought actions against facial recognition misuse, but only in specific commercial contexts. The Electronic Frontier Foundation has called for federal legislation multiple times; none has passed.
Meta, for its part, says the Ray-Ban glasses' privacy LED and terms of service prohibit misuse. PimEyes says it prohibits using its service to track individuals and requires users to agree not to do so. Both forms of protection are unenforceable in practice.
Who's Actually Using This — and for What
The Nguyen/Luo demonstration in October 2025 was framed as a proof of concept. But the underlying tools — the glasses, the facial recognition services, the AI aggregation — are all commercially available, legally purchasable, and have been in use by individuals and small operators for months.
Documented cases have included:
- A man in New York who used a variant of the pipeline to identify and follow a woman he'd seen on a subway; he was subsequently charged with stalking.
- Investigative journalists using the technology to identify undercover police officers at public protests.
- A private security firm offering "individual background vetting" services using smart glasses at corporate events.
- Online communities sharing techniques for identifying political opponents, ex-partners, and public figures in private contexts.
The stalking case is the clearest illustration of the harm spectrum. The journalism case illustrates the civil liberties tension: the same technology that threatens privacy also enables accountability journalism. The law doesn't yet distinguish between them.
What Meta Has and Hasn't Done
After the I-XRAY research went public in October 2025, Meta issued a statement saying the use of its glasses for surveillance purposes violated its terms of service. The company pointed to the camera indicator LED as an existing safeguard.
What Meta did not do: disable the livestream feature, add technical restrictions to prevent face-capture use cases, introduce any form of consent-request mechanism, or modify the glasses' hardware or software to reduce surveillance utility.
Meta's Ray-Ban line is one of the company's fastest-growing hardware products. Zuckerberg has described wearable AI as the future of computing. The commercial incentive to preserve the glasses' full functionality is significant.
PimEyes and FaceCheck.ID similarly issued statements condemning misuse. Neither has implemented meaningful technical barriers to the pipeline Nguyen and Luo described.
Why This Is a Bigger Problem Than It Appears
Smart glasses with embedded cameras are not a niche product. Apple has an AR headset. Google has had multiple iterations. Snap makes camera glasses. Dozens of Chinese manufacturers produce similar hardware. The Ray-Ban Meta product is specifically notable for looking ordinary — but it is the leading edge of a consumer category that is expected to grow rapidly.
Data from NIST's Face Recognition Vendor Test (FRVT) shows that commercial facial recognition algorithms had achieved accuracy rates above 99% on high-quality frontal images as of 2024, with error rates continuing to drop. The technology is improving faster than policy frameworks are responding.
The result is a slow normalization. Each year, more people wear camera glasses. Each year, facial recognition gets cheaper and more accurate. Each year, the data aggregation infrastructure gets richer and more cross-linked. The pipeline that Nguyen and Luo demonstrated with $300 of hardware will soon be executable without any specialized setup at all — it will simply be a feature available to anyone wearing ordinary-looking eyewear.
At that point, the concept of anonymous presence in public space — the ability to walk through a city, sit in a coffee shop, or ride public transit without being identified — becomes a historical artifact.
We're not there yet. But the Drudge headline calling this the "RAY-BAN META Creep" is not wrong. The creep is the point: it happens gradually, then suddenly, until the old normal is irreversible.
- Ray-Ban Meta smart glasses retail for $299 and have sold over 1 million units as of 2024.
- The I-XRAY pipeline (glasses + facial recognition + AI aggregation) can identify a stranger's full name, address, and employer in under 60 seconds using publicly available tools.
- There is no federal U.S. law prohibiting private individuals from using real-time facial recognition in public spaces.
- Illinois' BIPA (2008) is the strongest U.S. biometrics privacy law — it covers companies, not individuals.
- Commercial facial recognition accuracy now exceeds 99% on high-quality frontal images (NIST, 2024).
- Meta has not modified the glasses' hardware or software in response to surveillance demonstrations.