Ten Maryland school systems are suing several social media giants for causing classroom behavior problems like distractions and limited attention.
Maryland school districts have joined together in a lawsuit against some of the world's largest social media platforms: Google, Meta, ByteDance, and Snap Inc. Anne Arundel, Baltimore City, Carroll, Cecil, Charles, Harford, Howard, Montgomery, Prince George's, and Talbot Counties have all joined the lawsuit.
According to the Maryland Reporter, the lawsuit accuses the social media platforms of “targeting and manipulating youth so they stay engaged for excessive amounts of time.” Maryland teachers reported to the local media outlet that “(Social media is) a huge distraction in our schools and in our classrooms” and “personal devices get in the way,” despite teachers doing their part to plan “effective, instructional uses for technology in the classroom.” One instructor said the most significant problem plaguing his students is “self-control and their failure to regulate personal behavior.”
Furthermore, when schools try to regulate personal devices and social media use, “students have gotten more angry and aggressive. … In a developmental part of their lives, they often fail to pause for a moment and look for resources or constructive ways to express their emotions.” Additionally, Maryland educators see more physical altercations because of conflicts that started on or over social media. Two staff members at St. Charles High School in Charles County were recently injured after intervening in a physical altercation between students that started over a social media post.
Not surprisingly, school officials in Charles County and elsewhere are alarmed at how social media affects child development and mental health. That concern is also reflected in the lawsuit, which alleges that companies are aware of the negative impacts the platforms have on children’s mental health, but they choose to prioritize profit.
Charles County educators said social media contributes to the mental health crises that some young people endure. Inappropriate social media interactions continue to create disruptions in the buildings all over the county because children these ages are not developmentally ready for the responsibility…
The isolation of school closures during the COVID-19 pandemic and the deepened use of social media during that time are also top-of-mind for Marylanders:
Moreover, the COVID-19 pandemic further isolated children and left many in a more fragile mental state. According to the Centers for Disease Control and Prevention, in 2021, suicide was the second leading cause of death for people ages 10 to 14.
The debate on privacy vs. safety
Many public health specialists have connected the youth mental health crisis to social media, with research showing a correlation with increased levels of anxiety and depression in children. The safety concerns center on protecting children's personal data from exposure and shielding minors from predatory adults. However, the "social media problem" also raises a litany of privacy and data concerns of its own. According to a Route Fifty article, questions about privacy, civil liberties, and technological capability persist whenever lawmakers try to put new protections in place.
Essentially, industry oversight can't get off the starting block because there is not yet any standard or regulation governing the technology and process for safely and effectively verifying the age of a minor who voluntarily shares personal data with a platform. Many companies rely on uncertified third-party agencies that use a patchwork of verification strategies. In effect, the existing process hinges on collecting even more data from the underage user and sharing it with an unregulated, uncertified entity that can use whatever methods it chooses. That process understandably arouses skepticism, but a researcher quoted in the article points out that there might be a need to accept some uncertainty.
A recent Maryland bill illustrates the tension: trying to protect a child's data can eliminate an option for protecting the child's experience on the platform. Lawmakers in Annapolis took up the issue this past session with House Bill 901, introduced by Delegate Jared Solomon. The bill would have required companies offering an online product likely to be used by minors to assess and disclose their procedures for protecting children's personal data. Had it passed, it would likely have complicated safety provisions like those adopted in other states, which as a standard attempt to verify a minor's age through an unregulated third party.
A recent policy paper cited in the article summed up the situation by simply stating that “no perfect solution exists,” but new insights on the issue continue to pile up.