The rising tide of platform regulation

Covid-19 has brought many things to the fore, not least the role that online platforms play in everyday life. However, regulation has struggled to keep pace with the speed at which some platforms have grown. The calls for greater accountability and regulation of online platforms heard in recent years have been accelerated by the pandemic. It is the scale of the major platforms and their prevalence in everyday life (for both individuals and businesses) that have led governments worldwide to look more closely at how they are regulated and, in particular, to reassess the level of responsibility they have for the material made available to their users. Recent political events have also brought into focus the role that online platforms play in curating news and other material, the impact this can have on wider society, and the effectiveness – or otherwise – of some of the existing tools available to counter harmful material and fake news.

From a legal perspective, calls for increased regulation are coming from different angles – we are seeing competition law, IP, consumer protection and data protection laws converge on online platforms. In this article we set out some of the specific reforms (both proposed and already in force) that will impact online platforms. The picture is complex and evolving – and Brexit doesn’t help. However, the direction of travel is clear – more regulation, more transparency and greater accountability are inevitable.

Current UK Regulatory Framework for Online Platforms

Whilst a lot of media regulation is focussed on those with editorial control, online platforms are, of course, also already subject to some level of regulation.

Platforms generally have no editorial control; rather, they provide the means by which content providers can reach their customers. Therefore, to date, regulation and controls have focussed not on the editorial content itself but on the manner in which content is displayed. For example, EPGs are regulated in the UK under the Communications Act 2003 (and elsewhere) and must be licensed by OFCOM.

More recently, the updated AVMS Directive, which has been implemented in the UK, extends regulation to video sharing platforms (VSPs) from November 2020. From 6 May 2021, all VSP providers under the UK’s jurisdiction will need to notify OFCOM. A VSP is a service (or a dissociable part of a service) the principal purpose of which is the sharing of videos with the general public, but without the service exercising editorial control over the content itself. Instead, VSPs control the manner in which the content is presented.

However, the specific regulation to which VSPs are subject is relatively light touch. For example, VSP providers are obliged to take “appropriate steps” to achieve specified consumer protection goals, such as: (i) protecting minors from content and advertising which might impair their development; and (ii) protecting the public from content and advertising that is criminal or might incite violence or hatred. The “appropriate steps” that VSPs are expected to introduce include parental control systems, mechanisms for users to report harmful content (and for that content then to be considered and potentially removed), terms and conditions, etc. This amounts to a relatively low level of regulation, with no direct regulation of the content itself as far as VSPs are concerned. OFCOM will have the power to impose fines for non-compliance of up to £250,000.

However, OFCOM has confirmed that it expects very few platforms to be subject to this regime in the UK under the jurisdiction test (historically defined by reference to the AVMS Directive but, since Brexit, determined for the UK by reference to a new test set out in the Audiovisual Media Services (Amendment) (EU Exit) Regulations 2020). OFCOM has named only two VSPs that it believes are likely to fall within scope. Having said that, the VSP regime looks to be only a temporary fix. As detailed below, the UK’s Online Harms Bill will include much broader obligations and will also apply to services based outside the UK but accessible by UK audiences. This will essentially replace the VSP regime.

As regards EPG regulation, changes are also afoot. Post-Brexit, regulation of editorial content services will actually be reduced: from 1 January 2021, linear channels will only require a licence where they are available on a regulated EPG (and not if they are only made available online). This was a surprising outcome given that the list of “regulated EPGs” for these purposes is fairly limited – there are ten, all traditional TV platform EPGs – whereas the number of access points for viewers of audio-visual content is increasing. However, the Government is also sitting on OFCOM’s summer 2019 report on the future regulation of EPGs, which encouraged an expansion in the scope of EPG regulation to cover some of these new access points.

Digital Advertising and Online Platforms

One area of specific regulatory scrutiny in the UK has come from the Competition and Markets Authority (CMA), which has been looking closely at online platforms. In particular, it has focussed on platforms funded by digital advertising in the context of a market study, with a clear eye to consumer protection and competition issues.

The market study began in July 2019 and concluded in July 2020. Whilst the CMA recognised that online platforms have delivered valuable services for users, it found that limited choice and a lack of competition generate harms both for consumers, such as higher prices and limited control over the use of their data, and for society at large, for example in terms of fake news and the decline of the press.

The CMA’s response to these findings has been to propose more regulation in this area, including the implementation of a new regulatory regime targeted at certain online platforms, on the basis that ex-post competition law enforcement is not sufficient. As a result, last year the Government set up the Digital Markets Taskforce, led by the CMA, to come up with more precise proposals for reform. These proposals were published in December 2020, with two main areas of focus: first, the introduction of a new regulatory regime; and second, reform of consumer protection and competition law.

Underlying all of this is a new regulator, the Digital Markets Unit (DMU), which will sit within the CMA. It will enforce this new regime and have a role in monitoring the wider market as a specialist in digital markets. Digital markets, such as the ad tech market, can be complex from a legal, technical and commercial perspective and the DMU will build up expert knowledge from engaging closely with the market and will aim to anticipate competition issues before they arise.

The DMU will also be responsible for enforcing the new “strategic market status” regime, which will apply to certain players in the market that have significant and entrenched market power and influence. So far it appears that only a small number of firms will qualify as having “strategic market status” (SMS), with the DMU proposing to prioritise those with high turnover (e.g. over £1bn in the UK and over £25bn globally). Once designated as having SMS, firms will be subject to three limbs of regulation:

1. A Code of Conduct – legally binding, principles-based and tailored to each entity with SMS, designed to tackle the effects of their market power, prevent consumers and businesses from being exploited and protect competitors from practices which could undermine fair competition. To be enforced by the DMU.

2. Pro-competitive interventions – DMU powers to undertake pro-competitive interventions and implement remedies to address the root of SMS firms’ market power and promote competition, e.g. data mobility, interoperability, data access and also functional separation.

3. Specific merger rules – closer scrutiny of transactions involving firms with SMS, including a mandatory (pre-closing) notification requirement and mandatory reporting of all deals entered into by SMS firms.

These proposals for reform have been submitted to the Government, which has indicated an intention to consult publicly in ‘early 2021’ on how it wishes to move forward, and to establish the DMU in April 2021.

Alongside this, enforcement action is also something to look out for in the coming months, given the information gleaned by the CMA through its market study. The CMA has already announced that it will work closely with the ICO and OFCOM under the umbrella of the “Digital Regulation Cooperation Forum”, whose priorities for its first year, published in March 2021, include looking at the interrelation between data protection and competition regulation, as well as research into service design frameworks, artificial intelligence and digital advertising technologies.

Protecting IP Online

The monitoring and enforcement of IP rights, and the responsibility of platforms in this respect, has been an ongoing debate for many years, and there are a number of current developments which impact this debate. One of these is the Copyright Directive, which is of course now finalised but will not be implemented in the UK. See here for our discussion of the key implications of this legislation. Despite Brexit and the fact that the UK has not implemented the Directive into national law, the Directive is still relevant – online platforms by their nature are available across multiple markets and jurisdictions, so compliance is still important (unless content is geo-blocked entirely).

In terms of monitoring and enforcement of infringements relating to online content, the Copyright Directive represents a fundamental shift in the power balance. Under Article 17 the burden is now placed on the operators of “online content sharing platforms”, which can no longer rely on the E-Commerce Directive’s hosting defence. For these purposes, we are talking about services that provide the public with access to a large amount of copyright-protected works or other protected subject matter, which they organise and promote for profit-making purposes.

A platform within the scope of the legislation will now be liable if a third party uploads copyright material owned by somebody else, unless the platform is able to demonstrate that:

  • it has made best efforts to get authorisation from the rights owner to provide the content;
  • it has applied high standards of professional diligence and best efforts to ensure the unavailability of the specific works which a rights owner has provided details of; and
  • it has acted expeditiously once it has received a notice from the rights owner to take the notified works down, and has made best efforts to prevent their future upload – often referred to as a “take down and keep down” provision, and a major change. This final limb is creating some interesting tensions, particularly around the technical means of ensuring content is not re-uploaded and the use of upload filters.

The Directive must be implemented in EU member states by 7 June 2021; some member states have already implemented elements of it, while others are in the process of doing so. There will likely be some disharmony in how the Directive is implemented, and although the UK will not implement these exact provisions, we will likely see harmonisation by market forces if not by law.

Another issue to consider relates to the economic value of IP and revenue sharing in the online content distribution chain. The EU has been looking at the balance between the original author and those parties who commercialise content by acquiring it, with the debate focussed on fair remuneration for authors (Articles 18-22 of the Copyright Directive). Essentially, what is in the contract between the original author and the distributor may be unpicked to achieve “appropriate and proportionate” remuneration, and authors may be able to reacquire rights that are no longer being exploited. Although the UK has decided not to implement the Copyright Directive, it is likely to stay engaged in this debate – not least in the context of the music streaming industry, and given the strength of its creative economy.

A final issue concerns IP and copyright more generally. With greater judicial competition, we will likely see further developments in the use of legal tools to take down and prevent access to unauthorised content, such as blocking injunctions. Brexit might be expected to add a measure of creative judicial tension, with judges within the EU not wanting to be seen to lag behind the protections afforded to the creative economy by the English courts. Further, it should be remembered that the UK Court of Appeal now has the power to depart from CJEU jurisprudence, for example on issues relating to “communication to the public”. Whilst IP is outside the scope of the Online Harms Bill, and so does not appear to be a legislative priority, it may be an area ripe for UK judicial intervention. How this plays out will be key to understanding the position of online platforms as regards IP enforcement and should be on platforms’ radar.

Consumer Protection and Online Harms

Consumer protection is another important area relevant to the regulation of online platforms, where we are seeing developments in two key areas – a strengthening of consumer law enforcement measures and specific regulation of online harms.

In terms of consumer law enforcement, the key pieces of UK legislation are the Consumer Contracts Regulations, the Consumer Rights Act and the Consumer Protection from Unfair Trading Regulations. The purpose of these rules is to ensure that B2C contract terms are fair, that the right information is provided to consumers and that consumers are treated fairly. All of this legislation is derived from EU Directives, but these have been implemented into UK law, so despite Brexit the UK currently remains very much aligned with the EU. However, there are big changes on the horizon at the EU level, where the focus is on strengthening consumer protection legislation – changes which, of course, the UK will not necessarily follow.

The most significant change at the EU level is the Omnibus Directive, which will take effect from May 2022 and introduces fines of at least 4% of annual turnover for serious breaches of consumer protection laws (including those pieces of EU legislation on which the UK regime is based). One of the drivers for the Omnibus Directive is that there is currently quite a lot of disparity in consumer law enforcement across the EU – some member states already have fining regimes for consumer law breaches, while others have no fining system at all. In the UK, the CMA (wearing its consumer protection hat, rather than its competition law hat) does not currently have the power to fine directly for consumer law breaches. It does, though, have wide investigatory powers and, more recently, has been given powers to impose “enhanced consumer measures”, which can be anything from requiring a trader to put up a notice on its website admitting to its sins, to establishing compensation schemes.

Although the Omnibus Directive will not have to be implemented in the UK, the CMA has been pushing for direct fining powers for some time. The UK Government talked about fines of 10% of turnover as part of a Green Paper published in 2018, but the White Paper for this has not yet materialised. It is not inconceivable that the UK could align itself with the EU position and introduce similar turnover-based fines.

As regards so-called online harms, in December 2020, the UK Government published its full response to the consultation on the 2019 Online Harms White Paper. On the same day the draft EU regulations for the Digital Services Act (“DSA”) and Digital Markets Act were also published. Both the UK Online Harms proposals and the DSA represent a significant move towards making platforms more responsible for the content on their services and an update to the “mere conduit” regime established 20 years ago under the E-Commerce Directive.

In the UK, what is proposed is a new law which imposes a duty of care on platforms towards the users of their services, with OFCOM named as the regulator. Similarly to the DSA, the UK proposals would apply to platforms which host user-generated content, those that facilitate online user interaction, and search engines, regardless of where the platform operator is based. In practical terms, social media platforms, cloud storage sites, video sharing platforms, online forums, dating services, online instant messaging, video games where you can talk to other users and online marketplaces will all be caught. However, there are some exemptions, for example for content published on a news publisher’s site and user comments relating to that content, as well as reviews and comments by users on a company’s website.

Obligations under the proposed law include a requirement for platform operators to assess the risk of harm to their users and to put in place wider systems and processes to improve user safety; it is not about liability for individual pieces of content. The exact detail of the obligations will be set out in codes of practice drawn up by OFCOM (which is controversial in itself – the regulator defining the scope of legal duties rather than Parliament).

But, similarly to the DSA, there is likely to be an obligation on companies to provide mechanisms allowing consumers to report harmful content and to appeal the take down of their own content. Companies will also be required to complete regular risk assessments, outlining the risks associated with their services, and be seen to be taking reasonable steps to reduce the risks of the harms they have identified. User tools, content moderation and recommendation procedures have also been mentioned. “Priority categories” of harms will be set out in secondary legislation, and more robust action will be needed to tackle those types of harm, including more pro-active steps to identify and block the content in question.

There are differences between the UK’s and the EU’s approach in this area. At an EU level, companies will have to carry out due diligence on traders that offer goods and services through their platforms, and transparency requirements will apply in relation to targeted advertising (for example, telling people why they have been targeted).

Interestingly, and mirroring the two-tier regime proposed by the CMA in relation to platforms with “strategic market status”, both the UK and the EU have proposed additional requirements for the largest platforms. Under the DSA this applies to companies with over 45 million users; under the UK’s Online Harms regime, they are referred to as “category 1 services” (with no test specified as yet, although they will be “high reach, high risk” services). The obligations on larger platforms under the Online Harms regime include spelling out in their T&Cs what content is and is not acceptable, a requirement to enforce those T&Cs consistently and transparently, and a requirement to publish transparency reports outlining what steps the platform is taking. The rationale is both to ensure that effective action is taken and to ensure that controversial viewpoints are not arbitrarily removed.

But what kind of content are we talking about here when we say “harmful”? Here there is a significant divergence between the two regimes. The online harms provisions of the DSA apply only to illegal content, unlike the UK regime, which also applies to harmful but legal content. Under the proposed Online Harms law:

Who is caught – what content is caught:

1. All companies within scope – illegal content.

2. All companies within scope – a requirement to assess the likelihood of children accessing their platforms and, if so, to ensure children are not exposed to harmful content (e.g. cyberbullying, age-inappropriate content, etc.).

3. Category 1 services – legal but harmful content, i.e. content that gives rise to a risk of harm to individuals or of a significant adverse physical or psychological impact, e.g. fake news, anti-vax content.

But what is not caught:

  • IP infringement
  • data protection
  • fraud
  • consumer law
  • cybersecurity

The Online Harms regime does not set out a route for individuals to be compensated directly (although OFCOM will accept super-complaints), but non-compliance can be punished by fines of up to 10% of global annual turnover (under the DSA, it is 6%). The UK is also proposing business disruption measures, which could include (as a last resort) switching off the service. The Government has also reserved the right to introduce criminal sanctions for senior managers who fail to comply with the regulator’s information requests.

We are expecting the draft Online Harms legislation (the Online Safety Bill) sometime this year, while the draft DSA will have to be debated by the European Parliament and the Council before becoming law. There is therefore a chance the UK legislation could be in place first.

Both the UK’s and the EU’s legislative proposals demonstrate a decisive shift in the regulation of online platforms from liability to responsibility, with a real focus on procedures and accountability and the threat of serious penalties for those that do not comply.