Regulator Cites Potential Violations of EU Digital Services Act
Ireland’s media regulator, Coimisiún na Meán, has launched formal investigations into TikTok and LinkedIn, citing potential violations of the European Union’s Digital Services Act (DSA).
This development follows a similar investigation initiated against Elon Musk’s X last month, marking these as the first DSA enforcement actions undertaken by the Irish regulator.
Focus on Content Reporting Mechanisms
According to a Bloomberg report, the investigations into TikTok and LinkedIn are centered on potential flaws within their content reporting mechanisms. The probes aim to determine whether the platforms' systems for users to report suspected illegal content are sufficiently accessible, user-friendly, and anonymous, as mandated by the DSA.
John Evans, Digital Services Commissioner at the regulator, stated: "Providers need to have reporting mechanisms that are easy to access and user-friendly, to report content considered to be illegal."
While the European Commission serves as the primary enforcer against very large online platforms within the EU, certain aspects of the DSA, including reporting mechanisms, fall under the jurisdiction of the national regulator in the EU country where a platform is headquartered.
Companies found by Ireland's media regulator to be in violation of Europe's digital rules face potential fines of up to 6% of their annual global turnover.
This is not the first time Irish regulators have targeted social media platforms. TikTok received a €530 million penalty in May 2025 for violating the EU's General Data Protection Regulation (GDPR), and LinkedIn was fined approximately €310 million for separate regulatory breaches.
Previous Investigation into X
The current investigations into TikTok and LinkedIn come weeks after the same regulator opened an investigation into Elon Musk’s social media platform X, alleging that the company is failing to remove content that users report as illegal.
Henna Virkkunen, Executive Vice-President of the European Commission for Technological Sovereignty, Security, and Democracy, explained that while the DSA permits automated moderation, it also requires platforms to maintain effective internal complaint-handling systems and grants users the right to appeal content moderation decisions.
"While automated moderation is allowed, online platforms must be transparent about its use and accuracy," Virkkunen added.
Coimisiún na Meán's investigation is designed to ascertain whether X's internal complaint-handling system complies with the DSA's regulatory standards. The probe draws on evidence from several sources, including the nonprofit HateAid, which had previously taken legal action against X on behalf of a researcher who was repeatedly banned from the platform.
That investigation was the first opened under the DSA by Coimisiún na Meán, and violations could lead to fines of up to 6% of the company's annual global turnover. X has faced scrutiny from European authorities before; earlier this year, the European Commission investigated the platform over potential breaches of the bloc's content rules.
Experts suggest that this scrutiny may lead to broader changes in X’s operational and content moderation policies. The recent investigations underscore the EU's commitment to holding social media platforms accountable and ensuring that user protection measures are robust and enforceable.