The European Union has preliminarily found that both Meta and TikTok failed to adequately safeguard children on their platforms, findings that could lead to fines of up to 6% of their global annual turnover. According to the preliminary findings, the companies created barriers that hindered reporting and research into child sexual abuse material (CSAM) within their apps.
According to the EU’s investigation, both companies violated child protection regulations outlined in the Digital Services Act (DSA). Notably, they obstructed researchers from accessing essential data to assess children’s exposure to illegal or harmful content. Furthermore, Meta was criticized for complicating the reporting process for users attempting to flag illegal content, employing “dark patterns” that made submissions unnecessarily complex.
Meta’s Legal Challenges in the United States
In a separate but related matter, Meta is contending with multiple lawsuits filed by US states. The suits accuse the company of intentionally designing its apps to be addictive despite knowing of their potential harm to teenagers. Allegations have also surfaced that Meta's legal advisors recommended downplaying findings related to teen harm; Meta has argued that this advice is protected by attorney-client privilege and should remain confidential.
The first of these lawsuits is scheduled for a hearing in 2024, and it remains to be seen how the courts will respond to the allegations. Meanwhile, Meta and TikTok now have the opportunity to review and respond to the EU's preliminary findings, and their responses will be central to whether they ultimately face substantial financial penalties.
This situation highlights ongoing concerns regarding child safety in digital spaces and the responsibilities of social media companies to ensure their platforms are secure for younger users.
