Time for better regulation of Social Media

Instagram knows it harms teenage girls

September 24, 2021 Anna Beard

Facebook officials had internal research in March 2020 showing that Instagram – the social media platform most used by adolescents – is harmful to teen girls’ body image and well-being but swept those findings under the rug to continue conducting business as usual, according to a Sept. 14, 2021, Wall Street Journal report.

Those who study social media use in teens didn’t need a suppressed internal study to know that Instagram can harm them. Plenty of peer-reviewed research papers show the same thing. Our Insights reports on Young Women and Anxiety and Body Image both identify social media as a factor that exacerbates negative outcomes for young people.

Teens are more likely to log on to Instagram than any other social media site. It is a ubiquitous part of adolescent life. Yet studies consistently show that the more often teens use Instagram, the worse their overall well-being, self-esteem, life satisfaction, mood and body image. One study found that the more college students used Instagram on a given day, the worse their mood and life satisfaction were that day.

Understanding the impact of social media on teens is important because almost all teens go online daily. A recent survey from Netsafe showed 40 percent of Kiwi teens use five or more social media platforms, while a third of them spend four or more hours online in an average day.

So what needs to be done?

There is much that can be done at the regulatory and industry level:

Regulation of Social Media

The Helen Clark Foundation put out their report Anti-social Media: Reducing The Spread Of Harmful Content On Social Networks in 2019 following the Christchurch terror attack. Two of their recommendations, in particular, have the potential to make a big impact:

  1. Social media companies should no longer be left to monitor and remove harmful content themselves but should be regulated by an independent body.

    If the New Zealand government establishes an independent regulator, it should consider addressing harms relating to the promotion of unhelpful or idealised body image online, beyond content related to eating disorders. New codes of practice should set an expectation that social media companies improve how their platforms are used to propagate unhealthy body image through advertising and algorithmic promotion, and should commit those companies to ensuring that the content they promote to users does not exacerbate body image concerns.
  2. The New Zealand Government could consider imposing a statutory duty of care on social media companies, requiring them to take more responsibility for the safety of their users and to tackle harm caused by content or activity on their services. Given that social media companies are driven by commercial imperatives, we recommend that penalties be set at a level that will effectively incentivise them to combat harmful content online.

    At the same time, appropriate checks and balances must also be put in place to ensure the right to freedom of expression is not impacted by social media companies being subject to high penalties for non-compliance. This could be achieved by having a wide variety of interests represented on any enforcement body.

Discrimination and stigma

Social media companies should be expected to have clear systems for users to report bullying and discrimination and effective means to take down offending content. Users should have greater control over the content they see and should be able to hide likes and comments, as well as filter content that they consider undesirable.
