November 18, 2024
EU seeks information from X on content moderation amid first major probe under new tech rules
The European Commission has requested information from X under the Digital Services Act, a groundbreaking law that aims to crack down on illegal and harmful content.


The European Union is seeking information from social media platform X about cuts to its content moderation resources as part of its first major investigation into the company under its tough new laws governing online content.

The European Commission, the EU executive arm, said in a statement Wednesday that it has requested information from X under the Digital Services Act, its groundbreaking tech law that requires online platforms to take a far stricter approach to policing illegal and harmful content.

The Commission said it was concerned about X’s transparency report submitted to the regulator in March 2024, which showed the company had cut its team of content moderators by nearly 20% compared with the number of moderators it reported in an early October 2023 transparency report.

X reduced linguistic coverage within the EU from 11 languages to seven, the Commission said, again citing X’s transparency report.

The Commission said it’s seeking further details from X on risk assessments and mitigation measures linked to the impact of generative artificial intelligence on electoral processes, dissemination of illegal material, and protection of fundamental rights.

X, which was formerly known as Twitter, was not immediately available for comment when contacted by CNBC.

X must provide the information requested by the EU on its content moderation resources and generative AI by May 17, the Commission said. Remaining answers to the Commission’s questions must be provided no later than May 27, the agency said.

The Commission said its request for information was a further step in a formal probe into suspected breaches of the EU’s recently introduced Digital Services Act.

The Commission initiated formal infringement proceedings against X in December last year after concerns were raised over its approach to tackling illegal content surrounding the Israel-Hamas war.

The Commission at the time said its investigation would focus on X’s compliance with its duties to counter the dissemination of illegal content in the EU, the effectiveness of the social media platform’s steps to combat information manipulation and its measures to increase transparency.

EU officials said the requests for information aim to build on evidence gathered so far in the DSA investigation into X. That evidence includes X’s March transparency report, as well as replies to previous requests for information addressing what X is doing to tackle disinformation risks linked to generative AI.

The DSA, which entered into force in November 2022, requires large online platforms such as X to mitigate the risk of disinformation and institute rigorous procedures to remove hate speech, while balancing this with freedom-of-expression concerns.

Companies found to have breached the rules face fines as high as 6% of their global annual revenues.