Regulating Social Media Platforms in the Battle Against the Mental Health Crisis

Updated: Mar 15

Concerns that social media aggravates mental illness in youth have been reignited by whistleblower Frances Haugen’s testimony about Facebook before Congress. Haugen testified that Facebook’s own internal research showed that Instagram worsened the mental health of teenage users. Those findings have led mental health professionals to consider how the government might monitor and regulate social media.


An investigation into the leaked documents by The Wall Street Journal found that many teens want to spend less time on Instagram but lack the self-control to cut back. Some teens reported feeling significant pressure to be present on the platform, while others described feeling “addicted” to it.


Facebook’s internal research also found that teens were aware that what they were viewing was bad for their mental health. In addition, the American Academy of Pediatrics, the Children's Hospital Association and the American Academy of Child and Adolescent Psychiatry have declared the mental health crisis among children and young adults a national emergency. These statements, made by the nation’s top mental health experts, show just how important it is to monitor the consumer experience on social media platforms.


The federal government already regulates the consumption of products that can physically harm the public, such as tobacco products, addictive drugs and alcohol. The dangers of addictive digital content have many Americans wondering when the government will choose to regulate this area, as well. 


As Facebook’s internal research shows, many young users struggle to exercise self-control over their time and choices on these platforms, and they are often unaware that their feeds are curated by recommendation algorithms designed to maximize engagement. Social media needs to be better regulated by the government.


American policymakers could start by imposing age restrictions and regulating certain advertising and business practices on social media. In recent years, European regulators have made significant efforts to mitigate the risks associated with social media platforms by adopting such policies.


In early 2021, the United Kingdom’s Advertising Standards Authority (ASA) banned the use of misleading beauty filters in posts that promote cosmetic and beauty products. The ruling forces influencers and marketers to present more realistic images of beauty and helps prevent the false advertising of products. Separately, the European Union’s General Data Protection Regulation sets the default age of digital consent at 16; member states are given the option of adopting a lower age, but it may not be lower than 13 years old.


Concerns about social media worsening the mental health of young people have been reignited in the wake of the recently leaked Facebook documents. In the absence of government action, we are left with a growing teen mental health crisis that has been fueled by the over-consumption of social media.


Policymakers could start by drafting legislation that implements digital age restrictions and limits on advertising and business practices. These regulations would be a first step toward returning control over personal data to the public.


The opinions expressed in this article are those of the individual author.


Sources


Wells, Georgia, et al. “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show.” The Wall Street Journal, 14 September 2021, https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739. Accessed 28 May 2023.


Shivaram, Deepa. “Children's mental health is now a national emergency, health leaders say.” NPR, 20 October 2021, https://www.npr.org/2021/10/20/1047624943/pediatricians-call-mental-health-crisis-among-kids-a-national-emergency. Accessed 28 May 2023.

