Privacy watchdog Noyb sent a cease-and-desist letter to Meta Wednesday, threatening to pursue a potentially billion-dollar class action to block Meta’s AI training, which starts soon in the European Union.
In the letter, Noyb noted that Meta only recently notified EU users on its platforms that they had until May 27 to opt their public posts out of Meta’s AI training data sets. According to Noyb, Meta is also requiring users who already opted out of AI training in 2024 to opt out again or forever lose their opportunity to keep their data out of Meta’s models, as training data likely cannot be easily deleted. That is an apparent violation of the General Data Protection Regulation (GDPR), Noyb alleged.
“Meta informed data subjects that, despite the fact that an objection to AI training under Article 21(2) GDPR was accepted in 2024, their personal data will be processed unless they object again—against its former promises, which further undermines any legitimate trust in Meta’s organizational ability to properly execute the necessary steps when data subjects exercise their rights,” Noyb’s letter alleged. In a blog post, Meta has disputed this, promising, “we’ll honor all objection forms we have already received, as well as newly submitted ones.”
This alleged lack of clarity makes it harder to trust that users can ever truly opt out, Noyb suggested. Previously, Meta “argued (in respect to EU-US data transfers) that a social network is a single system that does not allow to differentiate between EU and non-EU users, as many nodes (e.g. an item linked to an EU and a non-EU user) are shared,” Noyb noted. That admission introduces “serious doubts that Meta can indeed technically implement a clean and proper differentiation between users that performed an opt-out and users that did not,” Noyb alleged.