The European Union has found that Meta Platforms is failing to effectively prevent children under 13 from accessing its platforms, Facebook and Instagram, potentially exposing them to harmful or inappropriate content.
Gatekeepers News reports that in preliminary findings released on Wednesday, EU regulators said Meta may have breached digital content rules and directed the company to “strengthen” its measures for preventing, detecting, and removing underage accounts.
The move comes as the bloc intensifies efforts to protect minors online, with several member states considering social media bans for users under 16. The EU is also weighing the possibility of introducing a unified age limit across the bloc, amid growing global pressure following Australia’s recent ban on social media access for under-16s.
Under Meta’s own policies, users must be at least 13 years old to access Facebook and Instagram. However, the EU said its investigation found the company’s enforcement measures to be ineffective.
“Terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users — including children,” said Henna Virkkunen, the European Commission’s executive vice-president overseeing tech policy.
If the findings are upheld, Meta could face a fine of up to six percent of its global annual turnover.
Meta rejected the preliminary conclusions, insisting it already has safeguards in place.
“We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” a company spokesperson said, adding that the firm would continue engaging with EU authorities.
Ongoing Probe Under Digital Rules
The findings stem from an ongoing investigation launched in May 2024 under the Digital Services Act (DSA), part of the bloc’s broader push to rein in Big Tech.
Regulators said children could easily bypass age restrictions by entering false birth dates, noting that Meta has “no effective controls” to verify users’ ages. They also criticised the platforms’ reporting tools, describing them as difficult to access and ineffective, requiring multiple steps to flag underage users.
The EU further stated that Meta had “inadequately” assessed the risks posed to children and underestimated their exposure to age-inappropriate content. According to Brussels, existing evidence suggests that between 10 and 12 percent of children under 13 are able to access the platforms.
Meta could still avoid penalties if it addresses the concerns and implements adequate corrective measures.
Wider Concerns Over Online Safety
The investigation is part of a broader EU effort to tackle online risks, particularly those affecting children. In February, regulators issued a warning to TikTok over its “addictive design,” threatening sanctions if changes were not made.
Authorities are also examining how Meta safeguards users’ physical and mental wellbeing, including concerns over potentially addictive features on its platforms.
Meanwhile, the EU recently announced that a new age-verification app is ready for rollout. The tool is designed to replace simple pop-up confirmations used by adult websites, which regulators say are easily bypassed.
In a related development, the bloc last month found that several adult platforms, including Pornhub, were allowing minors to access explicit content in violation of digital rules.