A coalition of over 70 civil rights, domestic violence, reproductive rights and LGBTQ+ organizations, including the ACLU, Fight for the Future and Access Now, has sent a letter to Meta CEO Mark Zuckerberg demanding that the company abandon a reported facial recognition feature for its Meta Ray-Ban smart glasses before it ever reaches consumers.
According to a Wired report, the feature, internally called “Name Tag,” would allow wearers to point their glasses at a stranger and use Meta’s AI assistant to retrieve information about them. Engineers are reportedly weighing two versions: one that identifies only people you’re already connected to on Meta platforms, and a broader version that could identify anyone with a public Facebook or Instagram account.
The coalition argues that no amount of design changes or opt-out mechanisms can make such a feature safe. Bystanders on the street have no way to consent to being identified, and the groups warn that the technology could be weaponized by stalkers, abusers and federal law enforcement.
Why is the timing so suspicious?
What makes this story particularly worrisome is a leaked internal Meta memo from May 2025. According to The New York Times, the memo stated that the launch was planned for a “dynamic political environment” in which civil society groups were turning their attention elsewhere. The coalition has rightly described this as “abhorrent behaviour.”
Meta Ray-Bans were already under fire after an investigation revealed that the smart glasses were sending video recordings of users’ most personal moments back to Meta for AI training. A facial recognition feature would be another major blow to the privacy of the company’s customers.
Should you be worried?
If you own Ray-Ban Meta glasses, the existing hardware can already record video discreetly. Adding facial recognition would mean, at least in theory, that anyone you walk past could be silently identified and linked to a trail of personal information, and that other Meta Ray-Ban wearers could do the same to you.
I don’t have high hopes for privacy and security from a company like Meta, but this is genuinely uncharted territory, and it could cause physical harm to people in the real world.
Meta responded that it doesn’t currently offer this feature and would take a “very considered approach” before rolling anything out. It remains to be seen whether this promise will be kept.




