EU investigates the addictive nature of Facebook and Instagram
The EU is looking at Facebook and Instagram for potential health risks to children
The European Union has opened an investigation into Facebook and Instagram to examine their addictiveness and possible adverse effects on the physical and mental well-being of children.
This probe not only scrutinises the platforms’ potential to foster behavioural addiction but also questions the efficacy of their content recommendation algorithms and age verification processes.
Given the severity of the concerns, the investigation falls under the EU’s Digital Services Act (DSA), which could result in fines of up to 6% of the annual global turnover of Meta, the owner of Facebook and Instagram, if infringements are confirmed.
Meta’s response and regulatory challenges
In response to growing regulatory pressures, Meta reported that over the past decade, it has developed more than 50 tools and policies aimed at safeguarding children’s use of its platforms.
In September, as per the requirements of the DSA, Meta submitted a comprehensive report detailing the risks associated with its platforms to the regulators.
This was Meta’s attempt to showcase its commitment to adhering to the new regulations imposed by the EU.
Broader implications for online safety
This is not the European Union’s only investigation into Meta.
Meta is concurrently under scrutiny for its handling of political misinformation and the broader impact of its platforms on public discourse.
The issues faced by Meta underscore a significant challenge that extends across the tech industry, reflecting urgent calls for enhanced measures to protect young users from content that could lead to addiction and other negative outcomes.
The “Rabbit-Hole” effect and age verification concerns
The term “rabbit-hole effect” is used by the EU to describe the phenomenon where users, particularly children, are exposed to harmful content, which is then perpetuated by automated recommendations leading to deeper engagement with similar content.
Furthermore, the EU has expressed concerns over how effectively Meta verifies the ages of its users. The minimum age for using these social networks is 13, yet reports by Ofcom suggest that many users are younger, often with parental knowledge and consent.
Comment by Allen Carr’s Easyway
At Allen Carr’s Easyway, we understand the importance of safeguarding mental health at every age. It is especially important for youngsters, who are increasingly exposed to online content designed for adults.
This investigation by the EU represents a crucial step toward holding tech giants accountable and ensuring they prioritise the welfare of all users, particularly children.
It’s a reminder of the need for constant vigilance and proactive measures to prevent addiction and protect mental health in the digital age.
What are your thoughts on the measures taken by social media platforms to protect young users? Are current regulations sufficient, or is more action necessary? Share your views with us.
If you are looking to improve your relationship with technology and social media, find out more with our How to quit social media & tech addiction program.