Undercover accounts posing as children under 14, all flooded with pornography… U.S. "fishing enforcement" sting costs Meta RMB 2.5 billion | Silicon Valley Watch
From the "Silicon Valley Watcher" column by Zheng Jun
Investigators created multiple fake accounts posing as children under 14. The results showed that these accounts were quickly flooded with large amounts of explicit sexual content, and they also received a huge number of sexually suggestive messages from adults…
In just two days, Zuckerberg suffered losses in two key lawsuits.
This is a landmark ruling. For the first time, a social media giant has been held liable for harm to users' mental health, and after losing this benchmark case, Meta may face similar accountability lawsuits across the United States and around the world. The protective umbrella that has long shielded internet giants from responsibility may be about to fold.
Giant loses landmark lawsuit
Last Wednesday, a jury in the Los Angeles County Superior Court in California returned a verdict: Meta and Google's YouTube are legally responsible for harm to the mental health of a young woman, and must together pay her $6 million (about RMB 41.43 million), apportioned 70-to-30. This is the first time in U.S. history that a jury has found that a social media platform must take responsibility for a user's mental health.
The central plaintiff in the case is a woman in California. Court documents refer to her as “KGM” or “Kaley.” Now 20 years old, she claims in her complaint that she started using YouTube at age 6, and started using Instagram at age 11. After long immersion in these two platforms, she developed a serious addiction, which in turn led to depression, body dysmorphic disorder, and suicidal thoughts.
Her mother said that Kaley sometimes used Instagram for hours each day; records showed her usage exceeding 16 hours in a single day, despite her mother's repeated attempts to limit her use of the app.
It should be added that she originally sued four social media giants at once. On the eve of trial, TikTok and Snapchat chose to settle with her for undisclosed amounts and exited the suit; Meta and Google's YouTube refused to settle and decided to fight on.
Platform knew the risks but ignored them
After nine days of deliberation totaling more than 44 hours, the jury found for the plaintiff on all seven counts. It concluded that Meta and YouTube were negligent in their platform design and operations, that this negligence was a "substantial factor" in causing Kaley's harm, and that both companies knew their platforms could harm minors yet failed to adequately warn users.
The most shocking part of the trial was when the plaintiff’s lawyer made Meta and Google’s internal research documents public. These documents show that the research teams of both companies had long known that their products had negative effects on teenagers’ mental health, yet shelved these findings.
Several professionals who had provided therapy to Kaley testified in court. One therapist, Victoria Burke, said that social media use was “closely related” to Kaley’s self-perception, and that interaction activities on the platform could even “influence the ups and downs of her emotions.” This directly contradicts Meta’s claim that “no mental health professional has identified social media as a cause.”
The chain of evidence presented in court indicates that the product design of Instagram and YouTube—personalized recommendation algorithms built on behavioral data, notification mechanisms that manufacture a sense of continuous reward, and infinite-scroll features that remove natural stopping points—was no accident. It was carefully engineered to maximize the time users spend on the platform.
In addition, the jury also found that the conduct of the two companies constituted “malice, oppression, or fraud,” and therefore, in addition to $3 million in compensatory damages, another $3 million in punitive damages was added.
As for the apportionment of liability, the jury found that Meta bears 70% of the responsibility and YouTube 30%. The reason for this breakdown was that the jury believed Instagram’s algorithmic recommendations, infinite scrolling, and continuously pushed notifications were the main causes; while YouTube’s defense attorneys insisted that their platform is fundamentally a video streaming service rather than a social media platform, closer to television than to Instagram.
Zuckerberg forced to appear in court personally
The trial lasted seven weeks. Both Meta founder Mark Zuckerberg and Instagram head Adam Mosseri testified in court in person.
This is extremely rare in the tech industry, and it is also Zuckerberg’s first time testifying in court about his own products.
The courtroom was nearly full—dozens of parents traveled from across the country to attend, and some even slept overnight on the steps of the courthouse just to ensure they could get a seat. The plaintiff, Kaley herself, also sat in the audience and watched all of this firsthand.
Outside the Los Angeles County Superior Court building, dozens more parents gathered hand in hand, holding a vigil for children who had died or been harmed. Two parents interviewed by the media described their tragedies: Julianna Arnold's daughter allegedly died after buying fentanyl on Instagram; Joann Bogard's son died after imitating a "choking challenge" video he saw on YouTube.
Plaintiff's attorney Mark Lanier compared the two tech giants to "lions preying on small gazelles," accusing them of using their resource and technical advantages to systematically exploit underage users. Matthew Bergman, founder of the Social Media Victims Law Center, said the trial would for the first time let the public see "everything social media companies have done—at the expense of our children's safety—for profit."
The ruling in this lawsuit is of great significance and has been designated a "benchmark case": its outcome will directly shape the direction of hundreds of similar lawsuits across the United States. Hundreds of families and school districts with similar claims of harm are waiting to press their own suits against Meta.
Therefore, both Meta and YouTube said they strongly oppose the ruling and announced they will appeal. A Meta spokesperson said in a statement: "Teenage mental health issues are extremely complex and cannot be blamed on a single application." Google spokesperson José Castañeda argued: "This case rests on a fundamental misunderstanding of YouTube—YouTube is a responsible video streaming platform built for users, not a social media platform."
"Fishing enforcement" snares predators
The day before the Los Angeles ruling, a jury in New Mexico returned a verdict in a parallel case, finding that Meta knowingly violated the state's consumer protection law by failing to adequately protect child users who were harassed by online predators on its platforms, and ordering it to pay a civil penalty of $375 million.
This is not only a victory for New Mexico but also the first jury verdict in the United States to hold a social media giant liable over child safety. How was the fine calculated? The statute provides a maximum penalty of $5,000 per violation, and the jury found that Meta's violations affected 75,000 minor users. It therefore imposed the statutory maximum: $5,000 multiplied by 75,000, totaling $375 million (about RMB 2.5 billion).
New Mexico Attorney General Raúl Torrez brought the lawsuit. In the complaint, he accused Meta of knowing that its platforms—especially Instagram and Facebook—were being used by criminals to sexually exploit children, yet deliberately concealing that information and refusing to take effective protective measures. The jury accepted this claim, finding that Meta's conduct violated New Mexico's law on unfair business practices.
The most startling evidence in the case came from an undercover operation by New Mexico's Department of Justice. Investigators created several fake accounts posing as children under 14. These accounts were quickly flooded with large amounts of explicit sexual content and received a huge number of sexually suggestive messages from adults.
This evidence directly punctured Meta's so-called "algorithmic safety filters." Police also arrested several suspects who tried to meet the "minors" in person; three of them showed up as arranged at the agreed-upon motel, intending to have sex with the supposed minors.
A key basis for the court’s decision was Meta’s internal documents. The documents showed that company leadership (including Zuckerberg and Mosseri) had long known that its algorithms would connect predators with minors, but in order to maintain high daily active users and profits, it refused to take effective age verification and safety protection measures.
Unlike the Los Angeles case, which focuses on product design defects, the core of the New Mexico case is fraud and concealment. The two cases approach the issue from different angles, yet point to the same conclusion: social media platforms systematically place commercial interests ahead of minors’ safety.
It is worth noting that the case is not yet fully over. On May 4 this year, New Mexico will begin the second phase, a bench trial, in which the attorney general will ask the judge to determine whether Meta constitutes a "public nuisance," a finding that could force Meta to change its platform algorithms and pay additional damages.
Breaking the “Section 230 protection umbrella”
Before understanding the historical significance of these two rulings, it is necessary to clarify a key legal background: Section 230 of the Communications Decency Act of 1996. For many years, this statute has been the most important legal shield for tech companies, granting internet platforms an exemption from liability for user-posted content.
Over the past several years, most U.S. user lawsuits against social media platforms have been dismissed by courts due to the protection of Section 230. However, the plaintiff’s attorneys in the Los Angeles case adopted a completely different legal strategy—they shifted the focus of the lawsuit from content on the platform to the platform’s product design itself.
Features such as infinite scrolling, algorithmic recommendations, and continuously pushed notifications in social media platforms are essentially product design decisions of the platform, not user-generated content. It was this change in strategy that allowed the plaintiffs to bypass the protection of Section 230 and go directly after the platform’s liability for infringement.
Catherine Sharkey, a law professor at New York University, calls this a "redefinition for a new era." The core issue, she points out, is information asymmetry: platform engineers know how these designs foster addiction, and the platforms' internal research documents the risks to teen users. That asymmetry gives courts grounds to hold the platforms responsible.
The Los Angeles case is also a "test case" under California's judicial coordination program, and its ruling will carry precedential weight for more than 1,600 similar lawsuits in the state, including suits filed by more than 350 families and 250 school districts. Having lost this test case, Meta enters the coming wave of litigation at a clear disadvantage.
Another federal class action lawsuit is also expected to go to trial in the Northern District of California this summer, at which time TikTok and Snap will be on the same stage in court. Jessica Nall, an attorney in San Francisco, put it more directly: “The gates have already opened.”
Global alarm over minors' social media addiction
The risk of minors becoming addicted to social media has become a global problem. Over the past two years, the scientific, political, and judicial communities in many countries have sounded the alarm from multiple angles and begun, almost in unison, to take corresponding regulatory measures.
In 2023, U.S. Surgeon General Vivek Murthy issued a rare call, recommending mandatory health warnings on social media products, likening them to tobacco products. In the report, he cited a large body of research showing a significant association between social media use and depression, anxiety, sleep disorders, and suicidal ideation among adolescents.
U.S. psychologist Jonathan Haidt, in his bestselling book “The Anxious Generation,” further directly linked the widespread adoption of smartphones in 2012 to the outbreak of the subsequent mental health crisis among teenagers, resonating strongly with the public.
At the legislative level, Australia is leading the world. In November 2024, the Australian Parliament passed the “Enhancing Online Safety Amendment (Social Media Minimum Age) Act,” and it officially took effect on December 10, 2025.
This first nationwide ban on minors’ social media in the world requires major platforms—including Instagram, YouTube, TikTok, Facebook, X, Snapchat, and Reddit—to take reasonable measures to prevent users under 16 from creating accounts. Platforms that violate the rules will face fines of up to about $33 million.
Australia’s legislation directly stems from a letter—one mother in Sydney wrote to Prime Minister Albanese, describing how her 12-year-old daughter, Charlotte, died by suicide after being bullied on social media. The letter moved lawmakers and also ignited public opinion.
In Europe, France has stipulated that minors under 15 may not use social media without parental consent; Denmark plans to set its ban at age 15; and in the UK, regulators are assessing proposals for age limits or caps on daily usage time.
Within the United States, legislative actions by states have also clearly accelerated. Florida has already passed a law prohibiting minors under 14 from using social media, while those aged 14 to 15 need parental consent; Tennessee and Mississippi, starting in 2025, require users under 18 to undergo age verification and obtain parental authorization; Virginia states that without parental consent, minors’ daily use of social media may not exceed one hour; similar regulations in California and Minnesota will take effect in stages from 2026 to 2027.
At the federal level, the U.S. regulatory environment is uncertain. Since taking office, the Trump administration has actively embraced the tech industry, and its regulatory approach tilts clearly toward market freedom. The proposed Kids Online Safety Act has passed the Senate but stalled in the House of Representatives amid sharp disagreement between the two parties.
Only once federal legislation is formally passed can the Section 230 umbrella shielding social media giants be fully removed, and only then will platforms have to answer for every addicted teenager.
Editor in charge: Song Yafang