EU Launches Investigation into Meta’s Child Protection Practices

The European Union has opened formal investigations into Facebook and Instagram over concerns that the platforms may not be doing enough to protect children online.

The investigations were opened under the EU’s Digital Services Act, a comprehensive set of regulations aimed at keeping online platforms safe for their users.

No deadline has been set for completing the investigations. If violations are found, Meta could face fines of up to 6% of its annual worldwide revenue.

In response, Meta, the parent company of Facebook and Instagram, issued a statement reaffirming its commitment to providing safe online experiences for young people.

The company highlighted the extensive work it has undertaken over the past decade, developing more than 50 tools and policies designed specifically to safeguard children.

Meta acknowledged that the challenge is industry-wide and said it was willing to share details of its efforts with the European Commission.

The European Commission, the executive arm of the EU, has raised concerns about the algorithmic systems Facebook and Instagram use to recommend content.

The commission worries that these systems may exploit children’s vulnerabilities and inexperience, potentially fostering addictive behavior.

Furthermore, the commission is examining Meta’s implementation of age verification tools to prevent children from accessing inappropriate content on Facebook and Instagram.

Both platforms require users to be at least 13 years old to create an account. The commission is also assessing whether Meta complies with the Digital Services Act’s rules, which require a high level of privacy, safety, and security for minors.

The cases underscore the EU’s ongoing focus on child protection under the Digital Services Act, which requires platforms to implement stringent measures to safeguard minors.

Earlier this year, the commission opened separate investigations into TikTok, citing concerns about potential risks to children.

In a social media post, European Commissioner Thierry Breton expressed doubt about Meta’s compliance with the obligations outlined in the Digital Services Act.

He stated that the commission remains unconvinced that Meta has taken sufficient steps to mitigate the negative effects on the physical and mental health of young Europeans who use Facebook and Instagram.

Facebook and Instagram are already under separate investigation over their handling of foreign disinformation ahead of the upcoming EU elections.