EU Warns TikTok Over Potential Breach of Digital Services Act Through Addictive Design Practices

The European Commission has taken a significant step in enforcing the Digital Services Act (DSA), issuing preliminary findings that TikTok may be in violation of the regulation due to what it describes as “addictive design” features. The assessment, published on the European Commission’s Digital Strategy website under the title “Commission preliminarily finds TikTok’s ‘addictive’ design may breach the Digital Services Act,” marks a notable escalation in the EU’s efforts to hold major tech platforms accountable for user safety and content moderation.

According to the Commission’s preliminary view, TikTok’s design mechanisms, such as its algorithmically driven endless-scroll feed and auto-play functionality, may exploit behavioral vulnerabilities among young users, encouraging excessive screen time without adequate safeguards. These features, the Commission argues, could reinforce addictive usage patterns, particularly among minors, potentially infringing the obligations set out in Articles 34 and 35 of the DSA, which require very large online platforms to assess and mitigate systemic risks stemming from their services.

The preliminary findings were issued as part of formal proceedings the Commission has opened against TikTok, and the company now has the opportunity to respond before any final decision is taken. In particular, authorities have asked TikTok’s parent company, ByteDance, to provide further information on its risk assessments, design choices, and proposed risk mitigation measures. TikTok has until August 2024 to respond, after which the Commission could impose corrective measures or fines if it concludes the company is in breach of the regulation.

The DSA aims to create a safer and more accountable online environment, and its obligations for designated “Very Large Online Platforms” (VLOPs) such as TikTok, services with more than 45 million monthly active users in the EU, have applied since August 2023. These platforms are subject to heightened transparency and duty-of-care obligations, including protecting minors, limiting the dissemination of illegal content, and safeguarding fundamental rights online.

Commissioner Thierry Breton has emphasized the importance of protecting users—particularly young and vulnerable audiences—from what regulators describe as potentially manipulative design choices. “Designing platforms to keep users addicted is not acceptable in the EU,” he said in a public statement, noting the DSA’s mandate to address such systemic risks.

TikTok has previously asserted that it takes significant measures to ensure user safety and promote screen-time management, including default screen-time limits for minors and privacy-protective default settings. However, the Commission maintains that these features may not sufficiently mitigate the platform’s potential to foster compulsive use.

This latest development follows similar regulatory scrutiny of other major platforms and signals a new chapter in the EU’s digital governance. Should the Commission’s findings be upheld at the end of the proceedings, TikTok could face significant penalties, including fines of up to 6 percent of its global annual turnover.

The outcome of the case will likely serve as a key test of the DSA’s efficacy and enforcement capabilities, setting precedents not only for TikTok but for the broader technology sector operating within the European Union.