Personal Data Protection: GDPR implementation in France


France has amended its data protection law to comply with GDPR requirements. First adopted by the National Assembly on 14 May 2018, the text was then endorsed almost in full by the Constitutional Council (Decision 2018-765 DC of 12 June 2018) before being definitively adopted on 20 June 2018 (Law 2018-493 of 20 June 2018).

Its purpose is clear: to adapt the existing law on information technology, data files and civil liberties (Law 78-17 of 6 January 1978, the Law) to the new European legal framework, so that its rules are fully applied at domestic level. The necessary changes have now been implemented, together with the national transposition of Directive 2016/680 of 27 April 2016, which harmonises the criminal law aspects of personal data processing.

With some exceptions, the new provisions have retroactive effect from 25 May 2018, the date on which the GDPR became applicable, and most of the implementing decrees are scheduled for publication in late 2018.

Why do we need a French law?

The GDPR became directly applicable at domestic level on the date it became applicable across the EU, and the adoption of a national law does not affect its validity. That goes without saying. The GDPR nevertheless allows Member States a degree of flexibility in applying several of its provisions, and the new law sets out France's position on these points.

This has created a complicated situation within French law, whereby the amended Law now has to be interpreted cumulatively with the provisions of the GDPR. Therefore, the fact that certain aspects of the GDPR are not mentioned in the new French law (in particular the rules concerning Data Protection Officers (DPOs) and certain information obligations under the duty of transparency) does not mean that they can be ignored.

As a result, the changes made to the Law in connection with the GDPR may appear numerous and patchy. Their primary purpose is to ensure the Law's compliance with the GDPR wherever direct application of the regulation is not enough, or wherever appropriate safeguards are required to protect the rights of data subjects.

One such example is the new possibility for data processors to continue storing data after their initial processing, if required for archiving purposes in the public interest. Likewise, for automated processing resulting in a decision against a data subject, the new law adapts French legislation in order to offer the sufficient and appropriate safeguards required by the GDPR, especially when the processing results in individual administrative decisions. The same also applies to the processing of data relating to criminal convictions, offences and related safety measures, which may now be performed by people working for the public justice system (victim support charities, offender rehabilitation charities etc.), as well as by victims, suspects and those who re-use information from court rulings, subject to certain conditions.

More specifically:

Sensitive data and the continuation of certain formalities

Processing of social security numbers kept in the RNIPP (National Population Register) (Articles 22, 23, 24, 25 and 27 of the Law)

One of the major changes introduced by the GDPR is the principle of accountability of data controllers and processors, in the interests of making the regulation more effective. It has resulted in the removal of most prior formalities (notification and authorisation procedures) with the national supervisory authority, which in France is the CNIL. However, some exceptions remain. In particular, as regards social security numbers (RNIPP registration numbers), which are a particularly identifying type of data, the Law maintains a prior authorisation regime: a Council of State framework decree, enacted following consultation with the CNIL, will define the categories of data controllers authorised to process social security numbers and the permitted purposes of such processing. However, no authorisation is required for the processing of social security numbers performed solely for public interest purposes, for scientific or historical research purposes or statistical purposes, or for supplying users with one or more online government services, under certain conditions. These purposes do not in fact require such strict regulation, provided that additional safeguards are in place.

Processing of sensitive data (Article 8 of the Law)

Sensitive data may only be processed in certain specific cases permitted by law. The Law has extended the scope of these exceptions. It authorises the processing of data that have been rendered anonymous using a CNIL-approved process. It also permits, in continuation of Law 2016-1321 of 7 October 2016 on the Digital Republic, processing that involves the re-use of public information contained in court rulings and decisions, provided that neither the purpose nor the outcome of such processing is the re-identification of the data subjects. Another exception is the processing by employers and administrative bodies of biometric data strictly necessary for controlling access to workplaces and to equipment and software used by employees for their work. However, this processing must comply with standard rules to be adopted by the CNIL.

Processing of health-related data (Chapter IX of the Law)

Data concerning health may, in principle, be processed for reasons of public interest. The CNIL, in collaboration with the INDS (French National Health Data Institute), has issued standard rules and reference documents for this type of processing. Processing may take place if it complies with these requirements, provided the data controller first submits a declaration of compliance to the CNIL. Any non-compliant processing still requires prior authorisation, in which case the CNIL has two months from receiving the request in which to respond; after this time the request is deemed approved, subject to exceptions.

These same principles apply to automated processing for the purposes of health-related research or studies and for evaluating or analysing healthcare or prevention practices or activities. Additional safeguards have nevertheless been established. Here, standard rules and reference documents are replaced by approved reference methodologies published by the CNIL and designed in collaboration with, in particular, the INDS. Whether compliant or not, the processing requires the consent of the data subjects whenever it involves genetic data. Finally, if the processing does not comply with the methodology, it must be authorised by the CNIL after first being approved by the competent ethics committee (if the research involves human subjects) or by the expert review board for health-related research, studies and assessments (if the request relates to studies, assessments or research not involving human subjects).

Consent: clarification and implications

Child consent (Article 7-1 of the Law)

In the interests of balancing online child safety requirements with the need to encourage minors to learn how to use digital technologies, the Law has lowered the age of consent to processing defined in the GDPR from 16 to 15. To collect data from a child below the age of 15, the data controller must obtain consent jointly from the child and the holder(s) of parental authority.

Freedom of choice for users of electronic devices (Article 28 of Law 2018-493)

Consent may only serve as a lawful basis for processing if it was given freely. There could be doubt as to the freely-given nature of consent if a data controller, such as a smartphone manufacturer, tried to force the data subject to use a particular app that offered a lower level of data protection.

Article 28 of Law 2018-493 therefore prohibits any restriction on the end user's options without legitimate technical or safety grounds, especially during the initial configuration of the electronic device. A contract between a software publisher and a hardware manufacturer designed to feature the publisher's software as a priority may not squeeze out the publisher's competitors and thereby reduce data protection levels for end users. The publisher may still ask for its app to be set as the default, but may not prevent competitors from also offering their software to the user.

Rights of data subjects

Communication of data breaches in cases where reporting the unauthorised disclosure of or access to data could pose a risk to national security, defence or public security (Article 40 of the Law)

One of the rights of data subjects whose data have been processed is to be informed by the data controller of any breach of their personal data, where such breach is likely to result in a high risk to the rights and freedoms of natural persons. However, as permitted by the GDPR (Article 23), the French legislator has overridden this principle in cases where the reporting of unauthorised disclosure or access is likely to pose a risk to national security, defence or public security. Nevertheless, this exception only applies where the processing is required to comply with a legal obligation or where it is necessary to perform a task carried out in the public interest vested in the controller. The aim is essentially to improve ways of tackling people who commit such breaches.

Changes concerning legal remedies

Class action (Article 43-ter of the Law)

A class action under Article 43-ter, as introduced by Law 2016-1547 of 18 November 2016, was until now only available for claims seeking to put an end to an infringement, unlike remedies under consumer and competition law, which could potentially lead to an award of compensation. This type of remedy may now be used by data subjects claiming compensation for material or moral prejudice. Such a class action must be brought within the framework of the individual compensation procedure established by Law 2016-1547.

Claimant associations (Article 43-quater of the Law)

As well as making an individual claim before the CNIL or the competent jurisdictions, data subjects now have the right to instruct a claimant association, as described in Article 43-ter(IV), to bring a claim on their behalf against the data controller for a breach. Although not specifically stated in the Law, this measure also applies to criminal cases.

New legal remedy in cases of cross-border transfers of data (Article 43-quinquies of the Law)

Law 2018-493 has introduced a new remedy for cases involving cross-border transfers of data, incorporating the Schrems ruling (ECJ, 6 October 2015, C-362/14) into French law and thus addressing the risks posed by infringement claims. If the CNIL believes that a data subject's allegations concerning a personal data breach are well founded, it may now ask the French Council of State to suspend the transfer of data, with a penalty payment if necessary, and to refer the matter to the ECJ for a preliminary ruling on the validity of the European Commission's decision authorising or approving the appropriate safeguards in question (adequacy decision or other). However, this remedy may not be used if the data transfer at issue was part of a processing operation carried out by a court acting in its judicial capacity.

Now what?

All work thus far has concentrated on resolving points of conflict between national and European law. The focus will now turn to "recodifying" the whole issue in the form of an executive order, in order to produce a clearer picture. Two main goals are pursued: first, to rewrite the whole of the Law in the interests of coherence and simplification for data subjects; second, to ensure consistency across all laws and rules governing personal data protection, so that they are not only applied in the correct order but also written in a consistent manner. In addition, the forthcoming e-Privacy regulation will introduce further changes in certain areas (notably Article 32-II of the Law on cookies). So watch this space.

Alexandre Ghanty, Anne-Laure Villedieu, CMS Francis Lefebvre Avocats