A pair of recent landmark court decisions are at the center of the national conversation surrounding social media companies, the addictive nature of their products, and their impacts on children’s mental health.
Last week, a New Mexico jury found that Meta, the company that owns Facebook and Instagram, knew about the harmful effects of its products on children.
In California, a federal jury awarded millions of dollars in damages to a woman who claimed that using the platforms Instagram and YouTube — which is owned by Google — as a child led to her mental health problems.
A day after the decision in California, WAMC’s Lucas Willard spoke with Sarah Domoff, a professor of psychology at the University at Albany and a leading researcher on children’s and adolescent social media use, about the significance of the decisions.
In my work with children and teens and families, there's so much overwhelm and stress regarding how to navigate the digital age, specifically managing social media platforms and the experiences that youth have on social media. So, it's very overwhelming for a lot of families to navigate this. And in addition to the work that we do to build interventions to help teens navigate social media safely, we recognize that it can't just fall on the teens or the parents themselves, that tech companies also have a responsibility. And so, these cases reiterate that premise: that we think there are multiple avenues of intervention that must occur to keep children safe online.
And the gist of it is that the people who were harmed were exposed to social media at a young age and also used these platforms compulsively.
Yeah. So, the piece that sticks out regarding the case that happened yesterday in Los Angeles is about the design features, so that there are features of the apps themselves that make it really hard to turn them off, or, you know, turn away from them. And as you can imagine, for adolescents, for developing minds, that's extra reinforcing. And so, there are these design features that include, you know, those notifications that you receive, as well as unending feeds. But also, when we look at the links between social media and mental health, it's the content. And so, when we look at the algorithms and the content that's pushed to teens and to adults, frankly, if it's harmful, those with mental health vulnerabilities could be at greater risk, and that's kind of what the lawsuit was showing.
What are some of the negative impacts that you might have seen in your professional life from social media addiction?
So, there are multiple ways that social media can create risks for youth, and it really comes down to this content piece and what we also call the context of use. So, content meaning seeing content that may be triggering or exacerbating underlying mental health concerns is one piece. There's been some really interesting research showing that individuals who have histories of disordered eating, or have eating disorders, are more likely to be shown content that promotes that. And so it may actually be more harmful for some individuals because of these preexisting concerns, specifically content that may exacerbate underlying mental health concerns. But then also, we know that it's not just the content that one sees. It's also how hard it is to put it away, you know? And so that's where that compulsive or problematic use, what some folks call addictive-like use, comes in: even though we want to cut back, we can't, and it's interfering with functioning. And for teens, especially, we see that link around nighttime use interfering with sleep, really getting in the way of their sleep health and then their next-day functioning.
Do you agree with the assessment that social media has taken away people's ability to become bored? Or to accept and deal with boredom?
That's a really great question. I think that some of the short-form videos and the features of these platforms, whether it's social media or other streaming services, make it so that there's seemingly endless content to consume. With that being said, when we look at the implications for regulation or changes that we can make to our own social media use, a lot of times we also want to bring up the fact that there are also changes happening at the community level that make it really hard to do activities that aren't screen-based.
So specifically, what can we do in communities or in schools so that there are other opportunities or activities for children to engage in? So, I don't know if it's boredom per se. Rather, there may not be as much available to teens or children to do outside of school to keep them engaged, and so we're really missing an element there that I think has also been included in some policy recommendations: that it's not just regulating tech, but also, can we create environments or opportunities in communities where children and teens can be engaging with each other, such that it's not so contingent on screen time to keep them occupied? So, I think it's a complicated question.
New York has, as of this year, a statewide ban on smartphones in schools. How do you feel about that? Have you heard from any educators about how that's going now, you know, in the springtime?
Anecdotally, I have heard from educators and folks who work in the schools that they're seeing some benefits. And I think it's really important, though, that we evaluate how this impacts education and socializing, as well as potential mental health benefits. It's not 100% clear yet what the impacts are. So, that's something that I'm really looking forward to investigating and learning more about.
Importantly, while there may be no use during the school day, there are still risks that arise when teens use social media outside of school. And when we think about what's been mentioned about, you know, the lawsuit and so forth, we still want to help them learn how to regulate their use and help them cope with experiences that may happen online that can be really hard for them. So, online victimization, cyberbullying, those still can happen even when we have bans or some restrictions in place. So, in addition to regulation around content and these design features, we also have to really equip teens to use social media in healthy ways and feel resilient when they experience things on social media.
Do you think that is something that could come about as part of a school curriculum on social media awareness?
Definitely. So, some of the work that I've done, in research and in partnership with schools, has been testing some of these curricula where we ask, 'Can we improve digital literacy, and can we improve understandings about how we use social media so that we can get more of the benefits and less of the risks?' And so part of that is increasing our awareness of how we feel when we're using social media. This can really go for anyone of any age. Is there content that we're seeing that's bringing us down more than lifting us up? How are we engaging with the content that we're seeing? And part of all this is helping teens and other users feel equipped to get social media to work better for them, versus the other way around.
And part of what is exciting about coverage of this lawsuit is that it opens a door for more conversations. And I think teens and parents can talk more about, well, 'What's going on behind the scenes in designing these apps? Why is it so hard for us to, you know, put them away or not be pulled into them?' And I think it affords opportunities for that conversation to kind of question, you know, 'Why?' You know, we're not paying for this, right? But we are paying for this in our attention, in our energy, and in the impacts that it has. So, I think conversations regarding the outcome of this, and how we want to engage with technology in different ways, are going to be really important.
And, not that you have a crystal ball, but do you think that because of these high-profile lawsuits, most recently involving YouTube and Meta platforms like Facebook and Instagram, this is going to spur more action in Washington to regulate either the content that's on apps, which we talked about, or how that content is delivered and how it becomes addictive?
That's a really great question. Folks have called this a bellwether case. In my conversations with people in the legal profession, the route going towards content moderation has not been as fruitful because of protections of, you know, free speech and so forth. So, I do imagine that more will come down the line related to product liability and these design features. But also, I'm hoping that these will spur interest from other folks who are designing apps to put children front and center, and ask how we keep them safe. And can we create content that's positive or prosocial? You know, we know in our research, looking at media effects, that there can be benefits, right? Depending on what the content is and who designs it. Is it designed to help children and to help them learn about things in the world? Why can't we take a similar approach when it comes to teens, so that they have safe and enjoyable spaces online, just like we'd expect in real life? And so, I feel like we'll see more coming down the line, and I hope that there will be more creative endeavors related to, 'How do we flip this so that risk is not something we think about later, but the first thing in designing these products?'
Well, Sarah, thank you so much for coming in. I appreciate it.
Yeah, thank you for having me.