Commission's Preliminary Findings Detail Systemic Failures in Age Verification
The European Commission has issued a preliminary finding that Meta Platforms is in breach of the European Union's Digital Services Act (DSA). The accusation centers on Meta's alleged failure to implement effective measures preventing children under the age of 13 from accessing its platforms, Facebook and Instagram. The investigation, which commenced in 2024, points to significant loopholes in Meta's systems, allowing minors to bypass age restrictions with relative ease.
The core of the Commission's concern lies in the inadequate enforcement of Meta's own stated minimum age requirement of 13 years for its services. Reports indicate that during the account creation process, children can easily provide a false birth date, circumventing safeguards. The Commission states there are no effective controls in place to verify the accuracy of self-declared birth dates, nor are there sufficient follow-up measures for reported underage accounts.
Meta Faces Scrutiny Over Platform Design and Enforcement
The Commission's investigation highlights that Meta's terms and conditions, which stipulate a minimum age of 13, are not being adequately upheld through concrete actions. This suggests a disconnect between stated policies and actual platform operation. "The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children," stated an executive vice president at the European Commission.
Meta now has an opportunity to review the Commission's findings and present a defense. If the breach is confirmed, the DSA allows fines of up to 6% of a company's global annual turnover. This development occurs as regulatory bodies globally, including in the UK, have been urging social media companies to strengthen child protection measures.
Broader Context: Global Moves Towards Online Child Safety
This EU action comes amidst a growing international movement to address the safety of young users online. Recent discussions and actions by governments worldwide underscore a heightened awareness of the potential harms associated with social media for minors. For instance, Australia has recently enacted a ban on social media use for individuals under 16, a move that has reportedly spurred further consideration of similar measures by other nations.
The European Commission is also exploring a bloc-wide minimum age for social media, a move influenced by these global trends and ongoing pressure from EU member states. This is not the first DSA case of its kind: the Commission previously opened proceedings against TikTok over the potentially addictive design of its platform and its impact on teenagers. The EU has also been developing its own age verification tools, with an app reported to be ready for deployment.