The European Union says it’s not convinced Meta has “done enough” to protect children from the “negative effects” of social media.

The EU launched an investigation this week into the systems Facebook and Instagram deploy to verify the ages of minors and serve recommended content to them.

Meta says it uses more than “50 tools and policies” to protect children online.

But EU Commissioner Thierry Breton said he was “not convinced” that the measures were robust enough.

Fellow commissioner Margrethe Vestager also highlighted concerns about how Meta’s apps may be stimulating “behavioural addiction.”

Facebook and Instagram already have age verification methods to prevent people younger than 13 from signing up.

However, a recent report by Ofcom found scores of pre-teens still use accounts, often without their parents’ knowledge.

A statement from the EC noted: “In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.”

The continental bloc will now scrutinise the platforms against the recently introduced Digital Services Act (DSA).

The DSA is a strict set of regulations enacted in 2022, designed to create a “safer online environment for digital users.”

Big tech companies must now comply with the DSA. If they don’t, they risk fines of up to 6% of their annual global turnover.

The EC says it will now look into whether Meta is meeting its obligations to mitigate the risks its apps pose to the “physical and mental health” of children.

This isn’t the only case Meta faces; the EU is also investigating whether the company is doing enough to tackle political disinformation.