
Is social media addictive? Is it at least partially responsible for mental health issues in teenagers? That’s what one California court must determine as it hears the case of a young woman who has been using social media since before she turned ten. The hearing started last week and is expected to last about six weeks. A recent survey found that most Americans want companies such as Meta and Google, which own Instagram and YouTube, to be held accountable for addiction, but Mark Zuckerberg and others claim otherwise.
Social Media on Trial
Zuckerberg and his team didn’t win any brownie points when they showed up for the trial in Los Angeles, CA, wearing their Ray-Ban Meta AI glasses. These pricey accessories are equipped with cameras and can record whatever is in view. Many Los Angeles courtrooms, however, prohibit taking pictures or recordings during proceedings.
“The judge upbraided the Meta team and said if you guys have recorded anything, you have to dispose of it or I will hold you in contempt,” Jacob Ward, a technology journalist and the host of the Rip Current Podcast, told CBS News, calling the incident “an extraordinary misstep” by Meta.
The trial centers on the accusation that Meta, Instagram, and YouTube use algorithms that deliberately addict and harm children. This is the first time the companies will argue their defense before a jury. One of the main plaintiffs, Kaley, is 20 years old and has been using social media platforms since grammar school, starting with Instagram when she was just nine years old; at that time, the platforms had no age verification. She claims she became addicted to the platforms and that the addiction exacerbated her suicidal thoughts.
“Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue,” the lawsuit says. “Plaintiffs are not merely the collateral damage of Defendants’ products. They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.”
Kaley, who suffers from social anxiety, had yet to appear in court as of this writing, though she is expected to testify at a later date. Meta argued that her difficult childhood, not the platform or its products, is the reason behind her mental health challenges.
However, specific features have come into question, such as Instagram’s beauty filters, which allow users to alter their photographs to mimic the effects of cosmetic procedures, including plastic surgery. Kaley’s lawyer, Mark Lanier, argued that these filters can harm young people’s perceptions of themselves, and claimed that experts consulted by Meta agreed. Zuckerberg said the company decided to allow the filters anyway, though it does not recommend them, in the name of free expression; denying users the ability to use the tools would have been “paternalistic,” he opined.
Nick Clegg, Meta’s former head of global affairs and a former member of the British Parliament, raised concerns about the image filters. He said Meta would end up “rightly accused of putting growth over responsibility,” which would ultimately have a “regressive” impact on the company’s reputation, the BBC reported.
But does all of this mean social media platforms are addictive and companies are stacking algorithms to draw in teens?
Adam Mosseri, who has led Instagram for eight years and is one of Meta’s top executives, doesn’t think so. In testimony to the court, he argued that there is no way to tell how much Instagram use is too much. Using social media is “a personal thing,” Mosseri said; one person could use Instagram “more than you and feel good about it.” He added, “It’s important to differentiate between clinical addiction and problematic use. I’m sure I’ve said that I’ve been addicted to a Netflix show when I binged it really late one night, but I don’t think it’s the same thing as clinical addiction.”
Kaley’s lawyer brought up online bullying and cited an internal Meta survey in which the company asked 269,000 Instagram users about their experiences; 60% had seen or experienced bullying in the previous week. Lanier then informed the court that Kaley had made more than 300 bullying reports to Instagram, and Mosseri admitted he had not been aware of them.
Kaley’s longest single day of use on Instagram was 16 hours. Mosseri said, “That sounds like problematic use,” declining to call it an addiction.
Americans Think Platforms Should Be Held Responsible
A YouGov survey, commissioned by the Tech Oversight Project, from earlier this month revealed that 86% of Americans want Meta and Google to be held accountable for their role in creating an addiction crisis. Furthermore, 67% said they were more likely to vote for lawmakers who supported legislation that would crack down on “dangerous social media features like infinite scroll, near-constant notifications, and predatory algorithms,” the survey revealed.
The Tech Oversight Project called the social media trial a “watershed moment.” As the New York Post explained, it is seen as a bellwether that could shape how similar cases are decided for years to come.
“This trial has already proven that there is a direct link between Big Tech’s dangerous product designs and real-world harms, and it should come as no surprise that voters are mad as hell and want Congress to do something about it,” Sacha Haworth, executive director of the Tech Oversight Project, told The Post.
The survey noted that internal documents revealed “companies buried research showing their products had damaging effects,” and respondents agreed the social media giants should be held accountable. The trial, which will continue for weeks, may set the standard for future lawsuits brought against social platforms. If successful, the plaintiffs’ argument could change the perception of companies’ First Amendment protections and of Section 230, which was designed to shield tech companies from liability for what their users post on their platforms.
