Investigations
Using Ray-Ban Meta Glasses? Someone In Kenya Could Secretly Be Watching You Undress Or Have Sex — And The AI Feature Cannot Be Disabled
A sweeping investigation by Swedish newspapers has exposed how intimate footage captured by Meta’s popular smart glasses is being reviewed by Kenyan data workers at Nairobi-based firm Sama — including scenes of users having sex, undressing and using the toilet. The AI assistant that triggers the recording cannot be switched off, and users have no practical control over footage once it enters Meta’s training pipeline.
Every day, in a nondescript hotel building in Nairobi, thousands of Kenyan data workers sit down to a disturbing day’s work. Their job is to annotate video for one of the world’s largest technology companies. But the footage they are required to watch is not advertising material or publicly shared clips. It is the deeply private footage of ordinary people in their homes — people undressing, using the toilet, watching pornography and, in some cases, engaged in sex acts. The people being filmed have no idea they are being watched.
This is the hidden reality behind Meta’s Ray-Ban AI smart glasses, a product selling at record pace across Europe and North America, according to a landmark joint investigation published on February 27 by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten. The investigation, which took the reporters to Nairobi, found that Meta routes video data from the glasses to workers at Sama, a Kenyan data annotation subcontractor, who are paid to watch and label footage in order to train Meta’s artificial intelligence systems. What arrives on their screens is far more than street scenes or landscapes.
The Kenyan workers, speaking on condition of anonymity due to strict confidentiality agreements, described a stream of footage arriving directly from Western homes — content the subjects almost certainly never intended anyone to see. One worker told the Swedish reporters he saw a man set his glasses down on a bedside table and leave the room, only for the man’s wife to walk in moments later and change her clothes, entirely unaware the device was still recording. Others described watching users engaged in sex, using the bathroom and handling bank cards with account numbers clearly visible.
“We see everything — from living rooms to naked bodies. Meta has that type of content in its databases,” one worker told the newspapers.
The AI Feature That Cannot Be Switched Off
At the core of the scandal is a technical reality that Meta does not adequately disclose to buyers. The Ray-Ban glasses — manufactured in partnership with the Italian eyewear giant EssilorLuxottica and priced at around 329 euros — feature a built-in camera that activates the moment a wearer invokes the AI assistant. This footage is automatically transmitted to Meta’s servers for processing. Users who wish to access the AI features have no option to prevent this data transfer. According to the Swedish investigation, the cameras also continue recording even after the glasses are removed from the face, meaning the device captures footage entirely outside the wearer’s awareness.
“In some videos, you can see someone going to the toilet or getting undressed. I don’t think they know, because if they knew they wouldn’t be recording,” one Sama worker told the Swedish journalists.
Meta’s website describes the product as one built “with your privacy in mind” and states that a small LED light illuminates when recording is underway. Critics and privacy specialists, however, say the LED is too small and too dim to function as a meaningful warning in real-world conditions. More critically, the light provides no protection at all in the scenario that is generating the most alarm: the glasses recording a room after the wearer has taken them off.
Privacy Fine Print That Almost Nobody Reads
As part of the investigation, the reporters bought a pair of the glasses in Sweden and found that retail staff routinely misinformed customers about how their data was handled, telling buyers that all footage remained on the device and was never sent to Meta. This is demonstrably false. The glasses require data — including voice recordings, images and video — to be processed on Meta’s servers. The possibility of human review is disclosed only in Meta’s separate Terms of Use for AI Services, a dense document that the company itself acknowledges few users ever open. That document states that “in some cases, Meta will review your interactions with AIs… and this review can be automated or manual (human).”
Kleanthi Sardeli, a data protection lawyer at the Vienna-based non-profit None Of Your Business (NOYB), which has filed multiple previous lawsuits against Meta, said the situation represents a fundamental transparency failure. “If this happens in Europe, both transparency and a legal basis for the processing are lacking,” she told the Swedish outlets. She also warned that users effectively forfeit any say over footage that enters the training pipeline: “Once the material has been fed into the models, the user in practice loses control over how it is used.”
Petter Flink, a security specialist at the Swedish Authority for Privacy Protection, added that users have virtually no insight into what happens to their data once it leaves the device. He argued that the intimate details of daily life that the glasses capture are, in the long run, far more commercially valuable to Meta than any revenue generated from selling the hardware itself.
Nairobi: The Quiet Engine Room of Silicon Valley’s AI
Sama is not an unfamiliar name in Kenya’s technology labour landscape. The Nairobi-based data services company has previously drawn scrutiny for its content moderation work on behalf of both Meta and OpenAI, with earlier investigations revealing that Kenyan workers were paid between $1.32 and $2 per hour to label depictions of sexual abuse, graphic violence and hate speech. One worker described that experience to investigators at the time as “torture.” The company ended its content moderation work for Meta in 2023, pivoting to computer vision data annotation — precisely the work now at the centre of this scandal.
Workers at Sama describe a workplace designed to prevent the footage from leaking. Personal smartphones are banned. Cameras monitor the annotation floor. Employees who raise concerns about the content they are forced to review are swiftly dismissed. “If you start asking questions, you are gone,” one told the Swedish journalists. The workers feel trapped between the moral distress of watching strangers’ most intimate moments and the economic necessity of holding onto a wage in a city where formal employment is scarce.
The Swedish investigation also found that the automated anonymisation tools Meta relies on to blur faces before footage reaches Kenyan annotators frequently fail. Workers confirmed that the faces of third parties — people other than the glasses wearer — are sometimes clearly identifiable, particularly in footage captured in poor or unusual lighting conditions.
Seven Million Glasses Sold — And the Numbers Are Rising
The scale of the potential privacy exposure is staggering. Meta sold a combined two million pairs of the Ray-Ban glasses in 2023 and 2024; in 2025 alone, sales more than tripled to seven million units. Each pair sold represents a device now capable of transmitting footage from inside someone’s home — bedroom, bathroom, living room — to a server in California and subsequently to an annotation centre in Nairobi.
The regulatory storm gathering over this revelation is substantial. Members of the European Parliament are pressing the European Commission for clarity on whether the transfer of EU citizens’ data to Sama in Kenya violates the General Data Protection Regulation. There is presently no EU adequacy decision recognising Kenya as offering equivalent data protection, meaning such transfers require additional contractual safeguards. The Irish Data Protection Commission, which oversees Meta’s EU operations from Dublin, has been contacted by investigators and has signalled it is monitoring the situation. Italy’s data protection authority, the Garante, was among the first European bodies to send formal questions to Meta about how the glasses handle personal data.
Separate internal Meta documents, cited by investigators, suggest the company is considering adding facial recognition capabilities to future iterations of the glasses — features the company previously declined to pursue on ethical grounds. Privacy advocates warn that, combined with the existing undisclosed video pipeline, such a development would transform the product into a mass surveillance tool that identifies strangers in real time while uploading the footage for human review.
Kenya’s Data Protection Act and the Questions It Raises
The revelation arrives at an awkward moment for Kenya’s own data protection debate. Under the Kenya Data Protection Act 2019, data controllers processing personal data must obtain informed consent, disclose the purpose of processing and ensure that the rights of data subjects are protected. The Act’s extra-territorial provisions extend its reach to controllers not ordinarily resident in Kenya who are nonetheless processing data relating to Kenyans. Legal scholars at KICTANet, a Nairobi-based technology policy think tank, have argued that the Ray-Ban glasses scandal highlights the growing urgency for Kenya’s Office of the Data Protection Commissioner to develop robust guidance on wearable AI devices.
The episode is also closely linked to a separate controversy that alarmed Kenyans in recent weeks. A Russian content creator, identified as Vyacheslav Trahov, was accused of travelling through Kenya and Ghana while using smart glasses to secretly film women he lured to hotel rooms, then uploading the footage to foreign online forums without their consent. That case sparked outrage about the potential for wearable cameras to be weaponised for voyeurism and covert surveillance. The Swedish investigation now demonstrates that even without deliberate misuse by the wearer, the glasses’ AI pipeline carries its own systemic privacy risks — risks that flow directly into a Nairobi office block.
Meta’s Response: A Referral to the Small Print
The Swedish newspapers spent two months attempting to secure an interview with Meta before publishing their findings. The company ultimately declined, sending a vague response from a London-based spokesperson that referred reporters back to its terms of service and declined to answer specific questions about the nature of the footage being reviewed in Nairobi. Meta has previously stated that it markets the glasses as a product built with privacy in mind and that it gives users control over what is shared and when. The evidence gathered in Nairobi suggests a very different operational reality.
For the Kenyan workers who arrive each morning to annotate footage of strangers’ bedrooms, the disconnect between Meta’s marketing language and the content on their screens is not an abstract regulatory question. It is the texture of their working day. One annotator offered what may be the most precise summary of the entire affair: “You think that if they knew about the extent of the data collection, no one would dare to use the glasses.”
Additional reporting from Svenska Dagbladet, Göteborgs-Posten and The Decoder
