EU Investigates Meta Over Child Safety Concerns on Facebook and Instagram

Highlights

The European Union launches a formal investigation into Meta's Facebook and Instagram, probing potential breaches of child safety rules.

The European Union has initiated a formal investigation into Meta, the parent company of Facebook and Instagram, citing concerns regarding the protection of children's mental and physical well-being on these social media platforms. The investigation, announced by the European Commission, focuses on assessing whether Meta has violated rules outlined in the EU's Digital Services Act (DSA).

Specifically, the probe will scrutinize Facebook and Instagram's user interface (UI) and algorithms, examining their potential to induce "behavioural addictions" and create "rabbit-hole effects" among children. Additionally, the EU raises concerns about Meta's efforts to prevent minors from accessing inappropriate content and questions the effectiveness of its age-verification tools.

Furthermore, the investigation will evaluate the adequacy of Meta's content recommendation systems and default privacy settings in ensuring the privacy, safety, and security of minors. Despite Meta's recent initiatives to enhance child safety on its platforms, such as restricting access to harmful topics and limiting interactions with suspicious adult accounts, the EU remains vigilant in its assessment of the company's compliance with DSA regulations.

The Commission will continue gathering evidence as the investigation proceeds, and may take interim enforcement action against Meta in the meantime. If found in violation of DSA rules, Meta could face fines of up to six per cent of its global annual revenue. EU Commissioner Thierry Breton underscored the importance of protecting young people, affirming the EU's commitment to safeguarding children online.
