
Digital Services Act

The Digital Services Act (Regulation (EU) 2022/2065, "DSA") is a regulation in EU law that updates the Electronic Commerce Directive 2000 regarding illegal content, transparent advertising, and disinformation.[1][2] It was submitted along with the Digital Markets Act (DMA) by the European Commission to the European Parliament and the Council on 15 December 2020.[3][4] The DSA was prepared by Margrethe Vestager, Executive Vice President of the European Commission for A Europe Fit for the Digital Age, and by Thierry Breton, European Commissioner for Internal Market, as members of the Von der Leyen Commission.[5]

Title: Regulation on a Single Market For Digital Services
Date: 19 October 2022
Commission proposal: COM/2020/825 final

On 22 April 2022, European policymakers reached an agreement on the Digital Services Act.[6][7] The European Parliament approved the DSA along with the Digital Markets Act on 5 July 2022.[8] On 4 October 2022, the Council of the European Union gave its final approval to the regulation.[9] It was published in the Official Journal of the European Union on 19 October 2022. Affected service providers had until 1 January 2024 to comply with its provisions. Very large online platforms and search engines must comply with their obligations four months after being designated as such by the European Commission.[8]

Objectives of the DSA

Ursula von der Leyen proposed a "new Digital Services Act" in her 2019 bid for the European Commission's presidency.[10]


The expressed purpose of the DSA is to update the European Union's legal framework for illegal content on intermediaries, in particular by modernising the e-Commerce Directive adopted in 2000. In doing so, the DSA aims to harmonise the different laws that have emerged at national level in the European Union to address illegal content.[1] Most prominent amongst these laws has been the German NetzDG, and similar laws in Austria ("Kommunikationsplattformen-Gesetz") and France ("Loi Avia"). With the adoption of the Digital Services Act at European level, those national laws would be superseded and would have to be repealed.[11]


In practice, this will mean new legislation regarding illegal content, transparent advertising and disinformation.[2]

New obligations on platform companies

The DSA is meant to "govern the content moderation practices of social media platforms" and address illegal content.[12] It is organised in five chapters, with the most important chapters regulating the liability exemption of intermediaries (Chapter 2), the obligations on intermediaries (Chapter 3), and the cooperation and enforcement framework between the Commission and national authorities (Chapter 4).


The DSA proposal maintains the current rule according to which companies that host others' data become liable when informed that this data is illegal.[12] This so-called "conditional liability exemption" is fundamentally different[13][14] from the broad immunities given to intermediaries under the equivalent rule ("Section 230 CDA") in the United States.


The DSA applies to intermediary service providers that offer their services to users based in the European Union, irrespective of whether the intermediary service provider is established in the European Union.[15]


In addition to the liability exemptions, the DSA would introduce a wide-ranging set of new obligations on platforms, including some that aim to disclose to regulators how their algorithms work, while other obligations would create transparency on how decisions to remove content are taken and on the way advertisers target users. The European Centre for Algorithmic Transparency was created to aid the enforcement of this.[16]


A 16 November 2021 Internet Policy Review article listed some of the new obligations, including mandatory "notice-and-action" requirements that must respect fundamental rights, mandatory redress for content removal decisions, and a comprehensive risk management and audit framework.[17]


A December 2020 Time article said that while many of its provisions only apply to platforms which have more than 45 million users in the European Union, the Act could have repercussions beyond Europe. Platforms including Facebook, Twitter, TikTok, and Google's subsidiary YouTube would meet that threshold and be subjected to the new obligations.[18]


Companies that do not comply with the new obligations risk fines of up to 6% of their global annual turnover. In addition, the Commission can impose periodic penalties of up to 5% of the average daily worldwide turnover for each day of delay in complying with remedies, interim measures, and commitments. As a last resort, if the infringement persists, causes serious harm to users, and entails criminal offences involving a threat to persons' life or safety, the Commission can request the temporary suspension of the service.[19]
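To illustrate how the two sanction caps above scale, here is a minimal arithmetic sketch. The percentages come from the text; the turnover figure and function names are hypothetical, chosen purely for illustration:

```python
# Sketch of the DSA's two monetary sanction caps, using hypothetical
# turnover figures. The 6% and 5% rates are those stated above.

def max_fine(annual_turnover: float) -> float:
    """Cap on a non-compliance fine: 6% of global annual turnover."""
    return 0.06 * annual_turnover

def periodic_penalty_cap(annual_turnover: float, days_late: int) -> float:
    """Cap on cumulative periodic penalties: 5% of the average daily
    worldwide turnover for each day of delay in complying with remedies."""
    average_daily_turnover = annual_turnover / 365
    return 0.05 * average_daily_turnover * days_late

# Hypothetical platform with EUR 100 billion annual turnover:
turnover = 100e9
print(f"Maximum fine: EUR {max_fine(turnover):,.0f}")  # ~EUR 6 billion
print(f"30-day periodic penalty cap: EUR {periodic_penalty_cap(turnover, 30):,.0f}")
```

For such a platform, the one-off fine cap dwarfs a month of periodic penalties (roughly EUR 411 million for 30 days), which is why the periodic penalty is framed as a compliance incentive rather than the main sanction.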

On 25 April 2023, the European Commission named a first list of 19 online platforms that will be required to comply starting 25 August 2023.[20] They include the following very large online platforms (VLOPs) with more than 45 million monthly active users in the EU as of 17 February 2023:[21]

Alibaba AliExpress
Amazon Store
Apple AppStore
Booking.com
Facebook
Google Play
Google Maps
Google Shopping
Instagram
LinkedIn
Pinterest
PornHub (added 20 December 2023)
Snapchat
Stripchat (added 20 December 2023)
TikTok
X (formerly Twitter)
Wikipedia
XVideos (added 20 December 2023)
YouTube
Zalando


Very Large Online Search Engines (VLOSEs):

Bing
Google Search


Amazon and Zalando both initiated proceedings before the General Court challenging their designations, claiming unequal treatment compared with other large retailers and arguing that their core business model is retail, not the distribution of third-party content. Zalando argued that the criteria and methodology lack transparency, for instance in how active users are counted, while Amazon said the VLOP rules are disproportionate for its business model and asked to be exempted from transparency requirements around targeted advertising.[22][23]


As of December 2023, 13 VLOPs had received a request for information (RFI),[19] the procedure used to verify compliance with the DSA, and one was subject to formal proceedings.[24] Three further platforms, all of them providing adult content, were added on 20 December 2023.[25]

Reactions

Media reactions to the Digital Services Act have been mixed. In January 2022, the editorial board of The Washington Post stated that the U.S. could learn from these rules,[49] while whistleblower Frances Haugen stated that it could set a "gold standard" of regulation worldwide.[50] Tech journalist Casey Newton has argued that the DSA will shape US tech policy.[51] Mike Masnick of Techdirt praised the DSA for ensuring the right to pay for digital services anonymously, but criticised the act for not including provisions that would have required a court order for the removal of illegal content.[52]


Scholars have begun critically examining the Digital Services Act.[53][54] Some academics have expressed concerns that it might be too rigid and prescriptive,[55] excessively focused on individual content decisions or on vague risk assessments.[56]


Civil society organisations such as the Electronic Frontier Foundation have called for stronger privacy protections.[57] Human Rights Watch has welcomed the transparency and user remedies but called for an end to abusive surveillance and profiling.[58] Amnesty International has welcomed many aspects of the proposal in terms of fundamental rights balance, but also asked for further restrictions on advertising.[59] Advocacy organisation Avaaz has compared the Digital Services Act to the Paris Agreement for climate change.[60]


Following the 2023 Hamas-led attack on Israel, Thierry Breton wrote public letters to X, Meta Platforms, TikTok, and YouTube on how their platforms complied with the DSA regarding content related to the conflict and upcoming elections. The Atlantic Council's Digital Forensic Research Lab reported that Breton's letters did not follow DSA processes, and digital rights group Access Now criticised Breton's letters for drawing a "false equivalence" between illegal content and disinformation.[61]


Tech companies have repeatedly criticised the heavy burden of the rules and the alleged lack of clarity of the Digital Services Act,[62] and have been accused of lobbying to undermine some of the more far-reaching demands by lawmakers, notably on bans for targeted advertising;[63] leaked plans by Google to lobby against the Digital Services Act prompted a high-profile apology from Sundar Pichai to Breton.[64]


A bipartisan group of US senators has called the DSA and DMA discriminatory, claiming that the legislation would "focus on regulations on a handful of American companies while failing to regulate similar companies based in Europe, China, Russia and elsewhere."[65][66]


The DSA was mostly welcomed by the European media sector.[67] Due to the influence gatekeepers have in selecting and controlling the visibility of certain journalistic articles over others through their online platforms, the European Federation of Journalists encouraged EU legislators to further increase the transparency of platforms' recommendation systems via the DSA.[68]


Nevertheless, the DSA's later stage inter-institutional negotiations, or Trilogues, have been criticized as lacking transparency and equitable participation.[69] These criticisms mirror past experiences with the drafting of the EU Regulation on Preventing the Dissemination of Terrorist Content Online as well as the General Data Protection Regulation (GDPR).[70]


Swedish MEP Jessica Stegrud argued that the DSA's focus on preventing the spread of disinformation and "harmful content" would undermine freedom of speech.[71]

See also

Digital Markets Act

Trade and Technology Council

Big Tech

Platform economy

Online Streaming Act

WeChat

External links

European Commission: The Digital Services Act

Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act)

Procedure 2020/0361(COD) on ŒIL