Meta is being sued for privacy violations after an investigation spearheaded by Swedish newspapers found that images captured by the company’s “smart glasses” are reviewed by workers in an overseas facility.
According to TechCrunch, the investigation found that workers at a Kenya-based subcontractor regularly view pictures and footage taken on Meta’s artificial intelligence-enabled glasses. The reviewed material includes potentially sensitive content, such as personal nudity, people engaging in sexual intercourse, and even people using the toilet.
Meta says that it blurs faces in images sent for review, but sources have disputed this claim.
The lawsuit, TechCrunch notes, was filed on behalf of plaintiffs Gina Bartone of New Jersey and Mateo Canu of California. Represented by attorneys from the Clarkson Law Firm, they allege that Meta violated privacy laws and engaged in false advertising by misrepresenting its privacy practices.
Meta AI smart glasses are, for instance, typically marketed using phrases like “designed for privacy, controlled by you” and “built for your privacy.” Most consumers, therefore, wouldn’t expect intimate footage to be seen by or shown to anyone—let alone workers in an offshore facility.
“We see everything—from living rooms to naked bodies. Meta has that type of content in its databases,” one of the Kenya-based workers told Swedish newspapers Svenska Dagbladet and Göteborgs-Posten. “Someone may have been walking around with the glasses, or happened to be wearing them, and then the person’s partner was in the bathroom, or they had just come out naked.”
“People can record themselves in the wrong way and not even know what they are recording. They are real people like you and me.”
Another employee told the papers that, while reviewing such footage seems like a blatant invasion of privacy, they do it because that’s what they’re paid to do.
“You are not supposed to question it,” the worker said. “If you start asking questions, you are gone.”
The subcontracted workers also said they review other data collected by the smart glasses, including audio samples, which are used for processing and training purposes.
“It can be about any topics at all,” an employee said. “We see chats where someone talks about crimes or protests. It is not just greetings, it can be very dark things as well.”
The lawsuit argues that “no reasonable” consumer would purchase Meta’s smart glasses if they knew the truth about the company’s practices.
“No reasonable consumer would understand ‘designed for privacy, controlled by you’ and similar promises like ‘built for your privacy’ to mean that deeply personal footage from inside their homes would be viewed and catalogued by human workers overseas,” the lawsuit alleges. “Meta chose to make privacy the centerpiece of its pervasive marketing campaign while concealing the facts that reveal those promises to be false.”
Meta has since confirmed that it does use overseas contractors to review collected content, but did not comment on allegations that its privacy protections—like blurring faces—fall far short of what could or should be expected.
“When people share content with Meta AI, we sometimes use contractors to review this data for the purpose of improving people’s experience, as many other companies do,” a spokesperson said. “We take steps to filter this data to protect people’s privacy and to help prevent identifying information from being reviewed.”
Sources
Calif. lawsuit accuses Meta of sending nude video from AI glasses to workers
Meta hit with a class action lawsuit over smart glasses’ privacy claims