08/20/2025 / By Laura Harris
The Dutch Authority for Consumers and Markets (ACM) is once again putting pressure on major tech platforms to step up enforcement of content policies under the guise of protecting “electoral integrity” ahead of the Netherlands’ national elections on Oct. 29.
According to several reports, the ACM summoned a dozen digital platforms, including X (formerly Twitter), Facebook and TikTok, to a closed-door meeting scheduled for Sept. 15, seeking to ensure the platforms are prepared to tackle “disinformation” and “illegal hate content” during the final stretch of the election campaign. (Related: EU pushing through with enforcement of censorship tool Digital Services Act.)
The September session will be held in coordination with EU regulators and civil society groups, many of whom are strong advocates for stricter online moderation. The gathering reflects a wider European shift, driven in part by the new Digital Services Act (DSA), toward centralized control over online discourse.
Under the DSA, which took full effect in early 2024, governments can demand the removal of content deemed “harmful” or “illegal” – terms critics say are broad and easily politicized. In the Netherlands, the ACM is tasked with enforcing the law and has taken an aggressive approach.
ACM director Manon Leijten even claimed that Very Large Online Platforms (VLOPs) must “take effective measures against illegal content” and ensure their content policies are both “transparent and diligent.”
“Large online platforms have many users. Under the Digital Services Act (DSA), they must implement a transparent and careful policy regarding content on their platforms and take effective action against illegal content. This is especially important in elections. Taking these measures is not only a legal but also a social duty of large online platforms,” Leijten said, defending the action.
This is not the ACM’s first move. In July, the ACM sent a letter to the same major tech platforms to lay out a detailed set of expectations for how digital giants must manage content and mitigate potential risks to the upcoming national elections. Platforms classified under the DSA, including social media networks and search engines, are now officially obligated to identify and address any design flaws or operational mechanisms that could negatively impact civic discourse or the electoral process.
The ACM also announced that it will be sending out a questionnaire to each platform, inquiring about their current efforts to protect electoral integrity. If responses are deemed insufficient, follow-up conversations will be requested.
At that time, the ACM had demanded that platforms submit all relevant contact details, including those of the individuals ultimately responsible for Trust and Safety and election-related policies, within one week of receiving the letter. The letter cited the DSA Election Toolkit, the Guidelines for VLOPs and Very Large Online Search Engines (VLOSEs) on mitigating systemic risks, and the Code of Practice on Disinformation – all documents that outline how platforms are expected to handle information during critical democratic events.
The September meeting will focus on how these platforms intend to implement the regulator’s demands, with particular attention to the moderation of user-generated content during the election campaign.
Visit BigTech.news for more stories like this.
Watch this video discussing the expansion of the EU’s censorship state.
This video is from the channel The Prisoner on Brighteon.com.
U.S. State Department slams EU’s Digital Services Act as “Orwellian censorship.”
EU takes aim at Elon Musk’s X with potential $1 billion fine under Digital Services Act.
COPYRIGHT © 2017 OBEY NEWS