Let’s talk about trials, baby…
Breaking down the Meta and YouTube trial with kids
As many of you have likely read, Meta and YouTube were found negligent in landmark social media cases this week. I was able to squeeze in a moment to have a good conversation with each of my kids about the news, and wanted to share what I found unique about those conversations.
Despite their age differences, one thing was clear: court cases really landed with my kids. And, if yours are like mine, these cases in particular offer a genuine opening to talk, listen and connect around our relationship with technology.
Trials? What trials?
In case you haven’t had time to catch up on the details, I found the following coverage helpful: NPR, the NY Times, The Guardian, and the Center for Humane Technology.
Why trials?
I had a hunch this news could get my kids talking. Boy, was I right.
Starting with a court case felt materially different than starting with one of mom’s ideas based on a book or study I just read (something I subject my kids to more often than I realize). Studies never seem to sink in the way I hope. Maybe that’s because we humans tend to think, “that study might apply to others, but not to me.” My ideas can also land as mom’s worries, and sometimes even as mom’s lack of trust, which can limit the conversation.
Whatever the reason, the 11-, 13-, and 15-year-olds all responded with curiosity and seemed genuinely moved. A jury decided the companies were liable. There seemed to be power in the right vs. wrong of it all. This wasn’t “mom trying to sway me” with social science. It was the law. It was justice, and we could start from there.
My goals for chatting
In each of the conversations, I had three goals in mind for the kids:
Know and understand what happened, as it is unfolding (makes it feel real).
Consider the trial and outcome with curiosity and be able to ask clarifying questions.
Continue to build agency around what they, themselves, can do to stay healthy and safe as they navigate technology and their world in general.
Questions & prompts
What follows are the prompts I loosely followed. If you plan to speak with your kids, my advice would be to focus on one of the cases (I chose the Los Angeles case), and try to balance providing context with giving plenty of space for kids to react, wonder, and explore their thoughts.
Did you know two major court cases were just decided — both about social media and young people? (If yes) What have you heard?
One of the cases was decided in Los Angeles. A woman identified as KGM, now 20, sued several social media companies, claiming their products were deliberately designed in ways that became addictive and damaged her mental health. She’d been using some of the products since age 6, 9, or 10, depending on the app.
Here’s some background before we dig in:
These products have an impact on all of us, but experts believe they have an especially strong impact on young people. Your minds, especially the parts dedicated to social feedback and rewards, are extra powerful at this age, and extra vulnerable to the way these products are designed to work.
The mental health issues she developed included (check for understanding on each):
Body dysmorphia — uncontrollable worry about flaws in your appearance or body (not just ordinary worry; it becomes unhealthy or gets in the way of daily life)
Anxiety — persistent, uncontrollable fear that interferes with daily life
Depression — lasting sadness and loss of interest in things you used to enjoy
Can you make any guesses about which apps she blamed for that impact? [Pause for ideas] → Snapchat, TikTok, Instagram, YouTube.
Which features do you think she said were most responsible for causing harm? [Features are the different ways the product is designed to work. An example might be a like button that someone can push when they see and like something you post.]
Infinite scrolling and autoplay
Algorithms that decide what you see next
Notifications
Camera filters
Can you make any connection between these features and mental health struggles? Any reflections on your own experience with some of these features?
More updates:
Snap and TikTok settled quietly out of court. Meta (maker of Instagram) and Google (maker of YouTube) decided to fight it.
Meta’s argument: Mental health is too complicated to blame on one app. Google’s argument: YouTube isn’t even social media — it’s just videos.
What do you think? Are these products designed in ways that could cause real harm? Are the companies responsible?
Here’s a twist: The jury saw internal emails showing that people at Meta and Google knew about these harms — and they continued actively encouraging young people to use their products anyway.
What do you think about that?
The result? Ten out of twelve jurors found Meta and Google liable. This is being called a bellwether case — meaning it may signal how thousands of similar cases waiting to go to trial across the country could go. Both companies disagree with the decision.
One more connection: When I was just out of college (or whatever the late ’90s were for you), similar trials were happening against tobacco companies, who also knew their products (cigarettes) harmed young people’s health and marketed to them anyway. Those cases led to major changes in how cigarettes could be advertised and sold, especially to young people. People are wondering whether these cases will lead companies like Meta and Google to make changes to their products to keep kids safer.
What changes do you think social media companies could make to keep the people who use their products safer?
And (no matter what companies decide to do) what do you think you can do to keep yourself safe?
A sample of perspectives from my kids
Even though my 11-year-old has neither a phone nor social media, she’s aware of both the products and the issues. She sees how people she cares about use them. Talking with her, it didn’t feel too early to dig into the dangers and support her sense of agency around persuasive technologies.
She shared this about YouTube:
“For me, there’s like a transition when I’m looking up something like a slime recipe, basketball move, or hair style, and then I see a link for a cool video, and I press on it, and it becomes a weird short for some reason. Or maybe I get a few cool videos, and then it just goes to something weird. I try to click out, then go back in. And it still does the same thing, which just isn’t cool.”
This gave us a great chance to talk about what an algorithmic feed is and how it works.
My 13-year-old got an iPhone less than a year ago, and we are in regular conversation and negotiations about its power, its limitations, and how to balance it all. It’s real work, y’all. This conversation, though, felt great. It wasn’t me having a point of view or trying to make a point about a particular feature. It felt more open, generative, and kid-led.
It was notable that, to this kid, it felt important to acknowledge that these products, though dangerous, offer real enjoyment alongside their harms. No simple story would work for them.
That said, even though the apps are cool, brazen deception ain’t. The biggest turning point in this conversation was the moment my kiddo learned about the internal emails. They were aware that the companies were motivated to make money, but it hadn’t really sunk in that the companies knew, and that they leaned in on young people anyway.
Just like for my youngest, my 13-year-old also shared concerns about YouTube.
“You guys don’t let me have TikTok, but YouTube is just as bad. I mean, kids are creating videos. Kids are watching a LOT, and the stuff they show you can get whack.”
This led to some good conversation about how we can set better limits on YouTube.
My favorite insight? “It’s not good to stay on anything too long. But I think the most important thing isn’t even how long you’re on; it’s what else you do. You’ve got to do other stuff.” This inspired me to reread Jenny Anderson’s recent piece, Teens should be doing stuff. Here’s how to help them, and to share some of the ideas with my kids.
My nearly 16-year-old was immediately pleased by the accountability and shared the most informed perspective on the companies, the legal process, and the impact these apps can have. Though she feels she’s struck a healthy balance with them, she was the most excited about the verdict, felt it was a long time coming, and was eager to keep talking about how to make these products safer for everyone. She also felt that age restrictions were easy to get around, and would rather see people move away from harmful tech altogether.
“I hope people keep coming forward. Like, the more individuals who win cases, the more these platforms just become less cool and people will start pushing back on the companies and using things (apps) less. I kind of hope it’s already starting to happen.”
I, personally, love the idea of a culture shift and a greater shared awareness like she describes, as it could serve all of us.
If you’ve been looking for an “in” to talk about platforms and social media apps with your kids, I highly recommend finding 10 minutes this weekend to chat about a recent ruling. If you do talk, let us know how your conversations go!



