Digital Fairness Act: protecting consumers from unethical techniques and commercial practices

The way consumers interact with online platforms is changing rapidly. Recognising this shift, the European Commission has already introduced key legislation like the Digital Markets Act (DMA) and Digital Services Act (DSA) to promote fair competition and secure digital spaces for users. However, the rise of manipulative digital tactics and growing threats to consumer autonomy make stronger protections necessary. Building upon existing EU consumer laws, the European Commission is expected to present a new Digital Fairness Act. This piece of legislation aims to combat a range of harmful online tactics, from dark patterns and addictive design features to exploitative targeted advertising practices.

While the Act is yet to be formally introduced, insights from the recent Digital Fairness Fitness Check offer a preview of what is to come. Curious about the latest details on this upcoming legislation? Read on to learn more!

What is the upcoming Digital Fairness Act?

The expected Digital Fairness Act is likely to strengthen European online consumer protection rules by tackling unethical techniques and commercial practices and reinforcing consumer rights.

More specifically, the Act will likely focus on:

  • Dark patterns in online interfaces, which can influence consumer decision-making (more on dark patterns below);
  • Addictive digital service designs, which lead consumers to spend more time and money on services;
  • Exploitative targeted ads and online profiling, which could take advantage of consumer vulnerabilities for commercial purposes;
  • Difficult subscription cancellation mechanisms;
  • Problematic commercial practices and marketing by social media influencers.

 

The beginning: the New Consumer Agenda

The story begins on 13 November 2020, when the European Commission published its New Consumer Agenda, outlining a vision for EU consumer policy from 2020 to 2025. It tackled, among other things, the digital transformation and the enforcement of consumer rights. For example, the Agenda notes that “commercial practices that disregard consumers’ right to make an informed choice, abuse their behavioural biases, or distort their decision-making processes, must be tackled”.

Such practices include the use of ‘dark patterns’, which are user-interface designs aimed at manipulating consumers. They also cover certain personalisation practices based on profiling, hidden advertising, fraud, false or misleading information, and manipulated consumer reviews. The document highlights that, after updating the guidance documents on the applicability of consumer law instruments such as the Unfair Commercial Practices Directive and the Consumer Rights Directive, the Commission plans to analyse whether additional legislation or other action is needed.

Within the Council Conclusions on the New Consumer Agenda, Member States called for a clear responsibility and liability framework for online platforms. This includes the adoption of appropriate measures to efficiently address fraudulent, unfair and misleading commercial practices and the sale of non-compliant and dangerous goods and services through online platforms. Additionally, they underlined the importance of identifying illegal online commercial practices.

 

Parliament’s resolution on addictive design of online services

On 12 December 2023, the European Parliament adopted the ‘Resolution on addictive design of online services and consumer protection in the EU Single Market’. This resolution further shaped the legislative and political debate around manipulative online designs and practices.

The resolution highlighted the risks posed by manipulative and addictive digital practices, such as infinite scrolling, autoplay, and hyper-personalised recommendations. These designs exploit psychological vulnerabilities to maximise user engagement, which could lead to significant mental health and societal impacts. Young adults and children are particularly vulnerable. Among other things, the resolution called for additional consumer protections against addictive design on digital platforms.

 

The Digital Fairness Fitness Check

The European Commission launched a public consultation in November 2022 to review the digital fairness of existing EU consumer laws: the Unfair Commercial Practices Directive, the Consumer Rights Directive, and the Unfair Contract Terms Directive.

The Commission subsequently consolidated this input into the final Digital Fairness Fitness Check report, published on 3 October 2024.

The report highlights the emergence of new challenges such as dark patterns, data-driven manipulation, and personalised advertising, which undermine consumer choice and trust. While the current framework provides a foundational level of protection, gaps in enforcement, legal uncertainty, and the inadequacy of existing laws to address newer technologies like AI and algorithmic decision-making are significant concerns. The evaluation suggests targeted updates to ensure effective protection and legal coherence across Member States.

The report also notes that the fragmentation of national laws, insufficient enforcement, legal uncertainty, and a lack of incentives all act as barriers for businesses seeking to adopt higher standards of protection. The report was accompanied by studies supporting the Fitness Check.

The Commission states that it will propose specific measures, with “reflections towards a Digital Fairness Act”, to tackle the challenges identified in the Fitness Check. It is important to note that the report is an evaluation, not a legislative proposal: it does not establish recommendations on the format or content of future Commission action.

What is the status of the Digital Fairness Act?

In his mission letter, Commission President Ursula von der Leyen tasked the Commissioner for Democracy, Justice and the Rule of Law, Michael McGrath (Ireland), with working on a Digital Fairness Act. This legislation should “tackle unethical techniques and commercial practices related to dark patterns, marketing by social media influencers, the addictive design of digital products and online profiling, especially when consumer vulnerabilities are exploited for commercial purposes.”

McGrath is also responsible for developing the Consumer Agenda for 2025-2030, which focuses on protecting consumers, assessing the need to update current legislation, and ensuring enforcement. Additionally, he will work on a new Action Plan on consumers in the Single Market to tackle unfair discrimination against consumers, ensuring they benefit when crossing borders and are protected when buying goods or services.

When asked about the upcoming Digital Fairness Act during the Commissioners-designate hearings, Executive Vice-President for Tech Sovereignty, Security and Democracy Henna Virkkunen (Finland) replied that the Digital Services Act (DSA) is central to the protection of minors and the fight against the addictive nature of online services. She underlined that the need for a new text would therefore have to be proven. She defended the line taken by DG Connect, which addresses these issues by applying the DSA, whereas DG Justice wants to approach them from the angle of consumer protection, via the Digital Fairness Act.

What are the next steps?

On 27 November 2024, the European Parliament approved the new College of Commissioners led by President von der Leyen with a simple majority, and the Commission took office on 1 December. The Commission’s Work Programme, which outlines the (non-)legislative proposals for the year ahead, is expected to be published around mid-February. Major challenges lie ahead for Commissioner McGrath in complementing the Digital Services Act and Digital Markets Act to protect EU consumers from unethical techniques and commercial practices.

The Digital Fairness Act was front and centre in McGrath’s answers to the written questions of the parliamentary committees and during the hearing itself. However, the European Commission is not expected to publish the Digital Fairness Act as early as next year. Instead, 2025 will bring a 12-week public consultation, alongside the development of an impact assessment of specific policy options. The process will likely extend into 2026 before a fully-fledged proposal is presented. This deliberate approach is intended to ensure the new Act does not duplicate existing legal frameworks.

The focus areas for the Digital Fairness Act include the role of influencers, dark patterns, addictive design, and unfair personalisation. Current practices that are being scrutinised include those impacting online shoppers, such as the disproportionate ease of signing up for subscriptions compared to the difficulty of cancelling them. There is also a growing emphasis on enhancing the EU’s ability to enforce consumer law. Proposals suggest granting the European Commission centralised investigative and enforcement powers in certain areas to strengthen consumer protections.

Do you want to know more?

Do you need help getting a better understanding of the Digital Fairness Act and its impact on your organisation?

Fill out the form below and our team of experts will get in touch with you.