HARFORD COUNTY, Md. — Several Maryland public school systems have filed lawsuits against social media companies Meta, Google, ByteDance and Snap Inc., alleging that the companies' addictive products have worsened the youth mental health crisis among their students.
Harford, Howard, Montgomery and Prince George's counties are joining 500 school systems across the country in alleging that students face a mental health crisis driven by social media products designed to target and addict children.
The lawsuits also say these products are harming students, damaging mental health and increasing burdens for school districts.
The companies own and operate the platforms at issue: Instagram (Meta), YouTube (Google), Snapchat (Snap Inc.) and TikTok (ByteDance).
The lawsuits claim the algorithms driving these platforms are designed to exploit young users' brains in a way comparable to nicotine, manipulating users into staying on the platforms as long as possible.
School districts allege these social media companies have known about these negative impacts but have continued to prioritize profit over the well-being of children.
As a result, schools are unable to keep up with the demand for mental health services. Through this lawsuit, the Harford County Board of Education is seeking to change the way "the platforms exploit teens and obtain funds to address this crisis from those responsible, rather than continuing to place that burden on taxpayers."
“This lawsuit seeks two things: force social media companies to make changes to their platforms for the well-being of our kids and hold these mega-social media companies accountable for the high costs associated with addressing the mental health problems impacting our students,” said Dr. Carol Mueller, President of the Board of Education of Harford County. “Schools across the country, just like here in Harford County, are struggling to keep up with student needs while also providing high-quality education and a good learning environment. We need the support and long-term funding to remove the financial burden from taxpayers and instead place it on the companies substantially contributing to and benefiting from this crisis.”
WMAR-2 News reached out to the social media companies for a response. Google, which owns YouTube, sent the following statement:
Protecting kids across our platforms has always been core to our work. In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true.
Meta also gave us a response:
We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online. We’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram, age verification technology, automatically setting accounts belonging to those under 16 to private when they join Instagram, and sending notifications encouraging teens to take regular breaks. We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us. These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.
Meta also listed some of the tools it has created to foster supportive and positive experiences for teens:
- Teens are shown notifications to take regular breaks from Instagram.
- Teens are notified that it might be time to look at something different if they've been scrolling on the same topic for a while.
- Teens are given the option to turn on hidden words for comments and DMs. Once enabled, comments and DMs containing emojis, words or phrases selected by the user will be hidden.
- Meta uses age verification technology to help teens have experiences that are appropriate for their age.