Meta Faces EU Digital Services Act Charges Over Child Safety
The European Commission's recent preliminary findings have raised serious concerns about Meta's compliance with the Digital Services Act (DSA) regarding child safety. In a historic move, the Commission has indicated that Meta has failed to effectively prevent underage children from accessing its widely used platforms, Facebook and Instagram. This marks the first time such a charge, previously reserved for adult content sites, has been brought against a mainstream social media company.
Although Meta sets a minimum age of 13, its reliance on self-declared birth dates for age verification has come under scrutiny. An independent study conducted by the Interface-EU think tank in 2025 found that children could easily create accounts simply by entering a false date of birth, highlighting the ineffectiveness of Meta's current verification process. Furthermore, the tools available for reporting underage accounts are cumbersome, requiring several steps that dissuade users from taking action.
Comparing Age Verification Approaches: Lessons from Adult Platforms
The European Commission's findings echo actions taken against adult content sites earlier this year. Platforms such as Pornhub and XVideos were similarly charged with failing to restrict underage access. The Commission has articulated a clear expectation that all platforms, including large social networks like Meta's, must adopt robust, age-appropriate measures to protect young users.
Following the launch of a new EU age verification app, built on zero-knowledge proof technology, pressure is mounting on Meta: its previous justifications for weaker measures no longer hold. European Commission President Ursula von der Leyen has expressed a strong commitment to safeguarding children's rights online, stating that no excuses will be accepted from companies failing to comply with these standards.
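The core idea behind such zero-knowledge approaches is that the platform learns only the answer to "is this user old enough?", never the birth date itself. The sketch below is a hypothetical, deliberately simplified illustration of that information flow: a trusted issuer (such as a government ID wallet) checks the birth date privately and signs only a boolean claim. Real zero-knowledge systems rely on cryptographic proofs rather than a shared-key HMAC; all names here (`issue_attestation`, `platform_verifies`, `ISSUER_KEY`) are invented for this example and do not describe the EU app's actual protocol.

```python
import hmac
import hashlib
import json
from datetime import date

# Toy model only: issuer and verifier share a secret key. A genuine
# zero-knowledge scheme needs no shared secret and no trusted verifier key.
ISSUER_KEY = b"demo-secret"

def issue_attestation(birth_date: date, today: date) -> dict:
    """Issuer checks the birth date privately and signs only the boolean result."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    claim = {"over_13": age >= 13}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def platform_verifies(attestation: dict) -> bool:
    """Platform checks the signature; it learns 'over 13', never the birth date."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["tag"])
            and attestation["claim"]["over_13"])

# A ten-year-old's attestation fails; the platform still never saw the birth date.
att = issue_attestation(date(2015, 6, 1), today=date(2025, 11, 1))
print(platform_verifies(att))  # → False
```

The design point this sketch captures is the one regulators are pressing on: age verification does not have to mean handing over identity documents to social media platforms, because the sensitive data can stay with the issuer.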
The Broader Implications for Social Media Companies
This scrutiny of Meta comes at a time when global conversations around child safety on social media are intensifying. Australia has pioneered a blanket ban on social media access for users under 16, prompting similar considerations in other countries like the UK and Spain. As public awareness grows regarding the mental health implications of social media, the pressure is on platforms to innovate and implement more secure age verification methods.
As the European Commission prepares to finalize its findings, the consequences for Meta could be severe, with potential fines of up to 6% of its global annual revenue. While Meta continues to assert its commitment to child protection, the substantial gaps in its approach may soon catch up with it. The tech industry at large will be watching closely to see whether Meta can pivot to more stringent safety measures, or whether it will face the repercussions of inaction.