
Meta CEO Mark Zuckerberg arrives at Los Angeles Superior Court on Feb. 18, 2026.
Jill Connelly | Getty Images
Mark Zuckerberg took the witness stand in a closely watched social media safety trial, offering rare insight into how Meta approaches teen protection, platform design, and corporate responsibility. During his testimony, Zuckerberg disclosed that he personally contacted Apple CEO Tim Cook to explore ways the tech industry could better support the wellbeing of young users.
The Los Angeles proceedings are part of a broader wave of litigation that analysts say could reshape how social platforms are regulated, a reckoning often compared to the tobacco industry's legal battles of decades past.
In court, Zuckerberg referenced a 2018 email exchange with Cook, explaining that he saw potential collaboration opportunities between Meta and Apple. He emphasized that the outreach reflected a personal and corporate priority to address how teens interact with digital products.
The communication was introduced to demonstrate that Meta had taken proactive steps on youth safety, even engaging with a major competitor to discuss solutions. Zuckerberg reiterated that protecting younger users has been an ongoing focus inside the company.
The case centers on claims that social platforms contributed to harmful usage patterns and mental health challenges, particularly among adolescents. The lawsuit stems from a young woman’s allegation that she developed compulsive habits tied to apps such as Instagram and YouTube.
Meta disputes these claims, arguing that while problematic use can occur, it does not equate to clinical addiction. The company maintains that the key legal question is whether its products were a substantial factor in the plaintiff’s struggles.
A central point of contention involved internal discussions about engagement metrics. Lawyers pointed to historical communications suggesting the company prioritized increasing time spent on Instagram. Zuckerberg pushed back, saying such references reflected aspirations or benchmarking rather than formal corporate objectives.
He acknowledged that internal milestones are sometimes used to measure product performance against competitors but stressed that Meta’s stated mission remains centered on connection and communication rather than maximizing screen time.
Another major topic involved augmented reality filters associated with cosmetic enhancements. Plaintiffs argued these features could negatively affect body image, especially among teenage girls. Zuckerberg said Meta reviewed expert feedback and consulted stakeholders but ultimately weighed those concerns against principles of user expression.
He testified that while the company temporarily restricted certain filters, it later allowed them again without actively promoting them, citing insufficient causal evidence linking the features to measurable harm.
The court also examined how effectively Meta prevents children under 13 from accessing its services. Documents presented in court suggested that millions of underage users in the U.S. may have been on Instagram at various times.
Zuckerberg said the company removes accounts it identifies as underage and requires users to confirm their age at sign-up. He added that more robust verification tools may ultimately depend on operating-system providers such as Google and Apple, which control app ecosystems.
The hearing included an unusual warning from the judge after reports that attendees might be recording proceedings using AI-enabled smart glasses. The court threatened contempt sanctions, underscoring the heightened scrutiny surrounding the case.
Zuckerberg also addressed questions about his control over Meta, reiterating that his voting power gives him significant influence over board decisions, a governance structure that has long drawn investor attention.
This trial is one of several major cases unfolding across the U.S. involving platforms including TikTok and Snap, both of which settled with the same plaintiff before proceedings began. Additional lawsuits, including actions brought by state authorities, allege failures to adequately protect minors from online harms.
Collectively, these cases could shape future standards for platform design, age verification, and transparency around algorithmic recommendations.
The testimony highlights the growing pressure on social media companies to demonstrate concrete safety safeguards while balancing user autonomy and free expression. For investors and policymakers, the outcomes of these trials could influence regulatory frameworks, compliance costs, and product strategies across the sector.
Zuckerberg’s appearance in court offered a detailed look at how Meta defends its approach to teen safety and product development. His disclosure of outreach to Apple underscores an industry-wide recognition that protecting young users may require cross-platform cooperation.
As litigation continues and regulatory scrutiny intensifies, the case stands to become a defining moment in how digital platforms are held accountable for their societal impact.