
The European Commission has issued a major warning to Meta and TikTok, accusing the two social media giants of violating transparency obligations under the Digital Services Act (DSA) — a sweeping piece of EU legislation designed to rein in the influence of Big Tech and enforce greater accountability across online platforms.
The Commission’s preliminary findings, announced on Friday, claim that both companies failed to offer researchers “sufficient access” to public data — a requirement intended to ensure independent scrutiny of online content, algorithmic influence, and potential harm to users, especially minors.
If confirmed, the violations could lead to fines of up to 6% of each company’s global annual turnover, a penalty that could amount to billions of dollars for both Meta and TikTok’s parent company, ByteDance.
The DSA, which entered into force in 2022 and began applying to the largest platforms in 2023, represents one of the most ambitious digital regulatory frameworks in the world. It demands that very large online platforms — defined as those with more than 45 million monthly active users in the EU — maintain strict transparency standards, provide access to researchers studying social impacts, and establish clear mechanisms for users to report and appeal content decisions.
According to the Commission, Meta breached multiple provisions of the act on both Facebook and Instagram. Officials say the company failed to provide simple and effective tools for users to report illegal content and challenge moderation decisions.
Meanwhile, TikTok was accused of making it overly complicated for independent researchers to access necessary public data, leaving them with “partial or unreliable” information, which the EU says hampers legitimate studies on issues such as misinformation, online radicalization, and youth mental health.
Both companies have strongly denied wrongdoing.
A spokesperson for Meta, Ben Walters, said the company “disagrees with any suggestion” that it has breached the DSA, emphasizing that Meta has already implemented new reporting and appeals processes in compliance with the law. “We’re confident that our systems align with EU standards,” he said.
TikTok, in its response, highlighted its “commitment to transparency” and noted that nearly 1,000 research teams have been granted access to its internal data tools so far. However, the company also argued that the EU’s demands may conflict with privacy protections outlined in the General Data Protection Regulation (GDPR).
“If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” the TikTok spokesperson added.
The European Commission’s Vice President for Digital Policy has emphasized that data access for independent researchers is critical for evaluating the social and psychological impacts of online platforms. Regulators say companies that withhold or manipulate access are undermining public oversight and preventing meaningful research into online harms.
Experts note that the DSA represents a turning point in how the EU manages digital ecosystems, with a focus on accountability, user safety, and algorithmic transparency.
The Commission has already opened multiple DSA and Digital Markets Act (DMA) investigations this year, signaling a broader crackdown on platforms that fail to meet new European standards.
This is not the first time either company has faced European penalties.
Earlier this year, Meta was fined €200 million ($228 million) for violations related to user consent under the Digital Markets Act, while TikTok was hit with a €530 million fine by Ireland’s Data Protection Commission for transferring user data to China and failing to safeguard minors’ privacy.
If the current accusations lead to formal rulings, the financial implications could be enormous. A 6% fine on Meta’s 2023 global revenue would total more than $7 billion, while ByteDance could face penalties exceeding $5 billion, depending on final assessments.
Meta and TikTok now have the opportunity to review the Commission’s findings and respond in writing before any final decision is made.
If the EU upholds its preliminary conclusions, it can issue a non-compliance decision, triggering substantial fines and potentially forcing both companies to revise their data-sharing frameworks and content moderation policies across Europe.
For Brussels, this case underscores a clear message: no tech giant is above European law. For Meta and TikTok, it’s another reminder that the era of self-regulation in digital media is over, and the age of strict transparency and accountability has fully arrived.