The European Commission has rejected claims by Meta CEO Mark Zuckerberg that European Union data laws amount to censorship of social media platforms, asserting that EU regulations require large platforms only to remove illegal content, not content deemed lawful.
Zuckerberg, in a recent statement, accused the EU of having an increasing number of laws that institutionalise censorship, making it difficult for innovative platforms to operate. He expressed concern over the impact of these regulations, saying, “Europe has an ever-increasing number of laws institutionalising censorship and making it difficult to build anything innovative there.”
This criticism followed Meta’s decision to end its U.S.-based fact-checking programs as the company shifted to a new approach to moderating content. Zuckerberg also indicated that he planned to work with then-President-elect Donald Trump in an effort to combat what he perceived as global censorship.
In response, the European Commission, which oversees EU data and digital laws, strongly denied any allegations of censorship. A Commission spokesperson clarified that the Digital Services Act (DSA) does not require platforms to remove lawful content. Rather, the law mandates the removal of content deemed illegal, such as content that could endanger children or threaten the integrity of democratic processes within the EU.
“We absolutely refute any claims of censorship,” the spokesperson stated. “Our regulations are aimed at ensuring the safety and security of users, not at controlling or restricting free expression.”
The Digital Services Act, which came into effect in November 2022, imposes stricter regulations on large online platforms, compelling them to take more responsibility in moderating content. These measures were introduced to curb the spread of harmful content online, with a particular focus on ensuring the protection of vulnerable groups, such as children, and preserving the democratic values within the region.
Meta’s recent move to scrap its U.S. fact-checking programs raised concerns about the company’s commitment to responsible content moderation. The social media giant has also introduced a new content moderation model, dubbed “community notes,” which will replace its fact-checking teams on platforms such as Facebook, Instagram, and Threads. This system, already used by the platform X, allows contributors to add a note to posts they believe may be misleading, with the note becoming public if a sufficient number of users from different perspectives deem it helpful.
In response to Zuckerberg’s plans, the European Commission made it clear that for platforms to adopt a similar community-based approach in the EU, they must first conduct a thorough risk assessment and submit it to the EU executive. The Commission pointed out that while it does not mandate specific content moderation techniques, any approach adopted by platforms must be effective.
“Whatever model a platform chooses needs to be effective, and this is what we’re looking at,” the Commission spokesperson said. “We are monitoring the effectiveness of the content moderation measures adopted and implemented by platforms here in the EU.”
The EU emphasized that independent fact-checking would continue to be available to European users. Even as Meta replaces its fact-checking programs in the U.S., EU users will still benefit from the oversight of independent fact-checkers verifying the accuracy and reliability of content across online platforms.
The disagreement between Meta and the European Commission reflects the broader global debate over content moderation and regulation. While Meta and other tech companies argue that some regulations infringe on free speech, regulators in the EU maintain that their laws are designed to ensure digital spaces are safe and respectful for all users.
With the rise of misinformation and harmful online content, particularly in the lead-up to elections and major events, the European Commission insists that content moderation remains a crucial tool for safeguarding the democratic values of the region. The ongoing tension between tech giants and regulators is set to continue shaping the future of digital content governance globally.