Switzerland: Copycatting the DSA

Though the proposed legislation reflects core features of the EU law, the Swiss Government has not incorporated measures to protect children online, bucking global trends

The Federal Council has unveiled its proposed online safety law

On 29 October 2025, the Swiss Federal Council opened a consultation on the Federal Act on Communication Platforms and Search Engines, its proposed online safety law. In its announcement, the Federal Council named Alphabet, Meta, TikTok and X as large platforms that are used to spread illegal content and that answer only to self-defined rules for content moderation. The bill aims to hold these platforms accountable for the spread of illegal content online and to empower users to exercise their rights in relation to content moderation policies. The proposed legislation follows on from the Federal Council’s April 2023 direction to the Federal Department of the Environment, Transport, Energy and Communications (DETEC) to develop draft legislation based, where appropriate, on the EU’s Digital Services Act (DSA). The consultation will remain open for comment until 16 February 2026.

The bill directly lifts many of the key governance features of the DSA

If adopted, the legislation would apply only to platforms with a Swiss user base greater than 10% of the population (approximately 900,000 people) that either host user-generated content or operate as search engines. These platforms would be subject to a series of transparency and governance obligations broadly aligned with the DSA, including the development of an alternative dispute resolution scheme, the maintenance of a publicly accessible ads archive, and annual transparency reporting and risk assessments. Most significantly, and again in line with the DSA, regulated platforms would be required to offer a procedure for users to report potentially illegal content as defined under the Criminal Code, including: depictions of violence, defamation, slander, insults, threats, coercion, sexual harassment, incitement to violence and discrimination, and incitement of hatred. If a user’s content is found to be illegal or otherwise in violation of a platform’s terms of service, the platform must inform that user of the action taken to moderate access to that content, including removal from the service, downgrading within a recommender system, demonetisation and the suspension or termination of that user’s account. The Federal Office of Communications (OFCOM) would serve as the primary regulatory authority for the law, collecting supervisory fees from regulated platforms using the same cap (0.05% of worldwide revenue) as the DSA. OFCOM would also be empowered to issue fines of up to 6% of a platform’s worldwide revenue in the event of non-compliance, again in line with the DSA.

The tightly limited scope of the law would restrict response to emerging harms

Despite the many elements of the DSA that DETEC appears to have mirrored in its proposed legislation, the Swiss Government has nonetheless tightly limited the scope of the bill to preventing the spread of illegal content online. Bucking the European and wider global trend of concern for the safety of minors online, the bill makes no mention of moderating content harmful to young users, implementing age assurance methods or adopting other common interventions taken up by governments in recent months to make online spaces safer for children. The legislation also does not provide additional scope for the Federal Council or OFCOM to develop codes of conduct. Under the DSA, the European Commission has used its ability to develop codes of conduct under Article 45 to incorporate additional, albeit voluntary, measures on disinformation and illegal hate speech into the wider regulatory framework. While some countries, including the UK, have amended criminal statutes so that emerging harms are captured by online safety laws, the largely procedural focus of the proposed Swiss law, combined with the apparent inability of the Government or regulator to address new concerns, makes the proposed legislation both a less interventionist framework and potentially static in the face of quickly evolving platform markets.