Meta Faces European Commission Investigation over Minors’ Safety
- Martina
- 11 June 2024, Tuesday
The European Commission is turning its attention to Meta's platforms Facebook and Instagram over the potential risks they pose to children's safety.
Investigation over children’s safety risks
Following formal proceedings against TikTok and a separate probe into its TikTok Lite app, the European Commission has now opened formal proceedings against another social media giant, Meta. The platforms Facebook and Instagram will be investigated for potential breaches of EU online content rules regarding minors' safety.
In its statement, the EU executive expressed concern that "the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called 'rabbit-hole effects.'" The Commission is also examining Meta's age-assurance and verification methods and its privacy settings for minors, questioning whether they work well enough or whether they allow children to access inappropriate content.
The in-depth investigation was opened under the European Union's landmark Digital Services Act (DSA), which took effect for large platforms in 2023. Under the Act, tech companies are required to take greater initiative in tackling illegal and harmful content on their platforms. Should Meta be found in breach of its obligations to protect children and minors, it could face hefty fines.