The European Commission has issued preliminary findings that both TikTok and Meta have breached transparency obligations under the Digital Services Act (“DSA”) by failing to grant researchers adequate access to public data (i.e. data that is available on a platform’s public interface, such as the number of likes, comments or reposts for a piece of content). This access requirement is intended to facilitate independent scrutiny of the impact of the largest online platforms in the EU. The Commission has also made a preliminary finding that Meta has not provided users with DSA-compliant mechanisms to report illegal content and to challenge content moderation decisions.
Data access for researchers
The DSA requires “very large online platforms” in the EU to provide certain researchers with adequate access to their public data for the purposes of conducting research into the systemic risks posed by such platforms (for example, whether users are exposed to illegal content). The Commission’s preliminary findings suggest that TikTok and Meta have put procedures and tools in place which add barriers when those researchers request access to such data, leaving them with partial or unreliable data and impairing their ability to conduct such research.
Meta’s “notice-and-action” and appeal mechanisms
In addition, the Commission preliminarily found that Meta’s Facebook and Instagram platforms are in breach of the DSA’s obligations to provide users with a simple, user-friendly “notice-and-action” mechanism to report illegal content, and to allow users to effectively challenge content moderation decisions. The Commission points to unnecessary steps in Meta’s reporting flows, additional demands on users, and the use of deceptive interface designs (so‑called “dark patterns”) that may confuse or dissuade users. The Commission also highlights that the decision appeal mechanisms of Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate appeals, making it difficult for users in the EU to further explain why they disagree with Meta’s content decisions, and so limiting the effectiveness of the appeals mechanism.
The Commission’s investigations into Meta’s reporting tool, dark patterns and complaint mechanism have been conducted in cooperation with Coimisiún na Meán, the national regulator in Ireland, where Meta has its European headquarters. Coimisiún na Meán referred dozens of user complaints into the Commission’s investigation process.
Meta and TikTok’s responses
Meta has stated that it is not breaching the DSA. It notes that it has introduced changes to its EU reporting, appeals, and data access tools since the DSA came into force and is confident these comply with the relevant legal requirements. TikTok says it is reviewing the findings but has argued that easing data safeguards to meet researcher access obligations could place the DSA in tension with the GDPR, and so has called for regulators to clarify how to reconcile the requirements.
What’s next?
The Commission’s findings are preliminary views within ongoing formal proceedings. TikTok and Meta now have the opportunity to examine the case files, respond in writing, and propose remedies. If the Commission ultimately determines that Meta and TikTok have breached the DSA, it can impose penalties of up to 6% of a company’s total worldwide annual turnover, alongside potential periodic penalty payments to compel compliance.
Related developments
While these investigations focus on access to public data, the DSA also gives certain vetted researchers the right to access certain non-public data for the purpose of conducting research into systemic risks. A new framework for such access has been put in place by the Delegated Act on data access under the DSA, which came into force on 29 October 2025.
For more information on the DSA generally, see our previous blogs here.