5 things you need to know about the DSA in less than 10 minutes!


October ended with the long-awaited adoption and publication (on the 27th) of the Digital Services Act (DSA). Allow us to demystify its key points for you.

The DSA is a new legal framework for digital services and amends the E-Commerce Directive, which dated back to 2000 and had become out of step with the pace of digital transformation. It applies across the EU.

Here is a brief review of the 5 most important aspects of the new DSA:

1. Big players: very large online platforms and search engines

The DSA contains EU-wide rules for online intermediaries that operate in the EU market (regardless of whether or not they are established in the EU), and distinguishes between three kinds of services:

  • Mere conduit (transmission) services, such as local wireless networks, DNS services or domain name registries.
  • Caching (automatic, intermediate and temporary storage) services, like content delivery networks or proxies.
  • Hosting services, including cloud services or web hosting.

Very large online platforms and online search engines, with 45 million active monthly users or more, receive special attention as a separate category in the DSA. The list of very large online platforms is published in the Official Journal of the EU.

2. Measures to counter illegal content online, including illegal goods and services: Notice and action mechanisms and content moderation practices

The DSA does not itself define illegal content; instead, it refers to illegal content as defined in other laws at EU or national level.

The DSA sets out EU-wide rules that cover the detection, flagging and removal of illegal content, such as: (i) notice and action mechanisms; (ii) internal complaint-handling systems, enabling users to lodge complaints against a decision of the provider; and (iii) cooperation with specialised trusted flaggers (recognised independent entities dedicated to detecting, identifying and notifying such content, whose notices benefit from a ‘fast track’) to fight illegal content.

The big players will have to manage systemic risks (e.g. disinformation, manipulation of electoral processes, cyber violence, or harm to minors) and take action to prevent abuse, such as carrying out independent audits of their risk management measures.

The rules on content moderation, advertising, algorithmic processes, and risk mitigation assessments aim to ensure that intermediary services are more accountable.

3. Liability of intermediary services: new version of safe harbour

Providers of intermediary services will not be liable for content uploaded by users. However, upon obtaining actual knowledge or awareness of illegal activities or content, the provider must act expeditiously to remove or disable access to that content.

The DSA therefore preserves the E-Commerce Directive's liability exemption regime for intermediary services (the ‘safe harbour’) and expressly excludes a general monitoring obligation. The difference now is that the new regulation imposes greater transparency and diligence obligations on the notice and action mechanisms. Similar to the current DMCA system in the U.S., the individual or entity reporting illegal content must provide a sufficient explanation of why the content might be illegal, a clear indication of the location of that content, and a statement that the notice is submitted in good faith.

4. Online advertising, profiling and targeting

The DSA introduces two new restrictions: (i) it bans advertising targeted at minors based on profiling; and (ii) it bans targeted advertising based on profiling that uses special categories of personal data, such as sexual orientation, ethnicity, political views or religious beliefs.

Ads on online platforms must meet the following requirements: (i) ads must be clearly identifiable as such; (ii) the ad must state on whose behalf it is presented and who paid for it; and (iii) the main parameters used to determine who the ad is targeted at must be accessible. Users will receive more information about the ads addressed to them on online platforms, in particular whether and why an ad targets them specifically, and whether content is sponsored or posted organically on a platform. Users can modify the criteria the platform uses and choose not to receive customised recommendations.

Very large online platforms and search engines will be subject to additional obligations, such as maintaining public ad repositories and assessing whether and how their advertising systems are manipulated or contribute to societal risks. They will need to take measures to mitigate these risks.

5. Competent authorities and fines

The EU and national authorities will work together, supervising how online intermediaries adapt their systems to the new requirements. Each Member State will appoint a Digital Services Coordinator to supervise compliance and impose the penalties that will be established under national law. For the big players, the Commission will have direct supervision and enforcement powers, imposing fines of up to 6% of global annual turnover in the most serious cases.

Additionally, immediate actions can be adopted, if necessary, to address very serious harms, namely:

  • a restriction on the visibility of specific information, including removing content, disabling access to it, or demoting it;
  • a suspension, termination or restriction of payments;
  • a suspension or termination of the service; or
  • a suspension or termination of the account of the (alleged) infringer.

Reach out if you need help

The DSA will enter into force on the twentieth day following its publication, but most of its provisions will only apply from 17 February 2024. The Technology, Media and Communications team at CMS includes a specialist Digital Business subgroup, which advises on the impact of digital regulation, including the DSA. Please reach out to your usual CMS contact for an introduction.

Article co-authored by Grace Ang-Lygate, Senior Business Development Manager at CMS.

#digihub