
Google Subject to the Biggest Fine Under GDPR

In what is to date the biggest fine under the GDPR, the French Data Protection Authority, the CNIL, investigated the transparency of Google's data processing on Android. The investigation followed two collective complaints filed on 25 May 2018: one by the Austrian NGO None Of Your Business (NOYB), the other by the French association La Quadrature du Net (LQDN). The complaints have so far been only partly answered, as the decision is limited to the Android mobile operating system; YouTube, Gmail and Google Search remain outstanding. The decision is worth a close look.

In its complaint, NOYB stated in particular that users of Android mobile devices are required to accept Google’s privacy policy and general terms and conditions of service and that, failing such acceptance, they could not use their terminal.

The association LQDN complained that, regardless of the terminal used, Google does not have valid legal bases to implement the processing of personal data for purposes of behaviour analysis and advertising targeting.

First of all, the CNIL handled the complaints with remarkable speed.

  • On 1 June 2018, the CNIL submitted the complaints to its European counterparts via the European information exchange system, with a view to designating a possible lead authority in accordance with Article 56 of the GDPR.
  • On 20 September 2018, the decision to investigate the processing of data on Android was taken.
  • On 21 September, an online inspection was carried out to verify the compliance of the processing relating to the use of the Android operating system on mobile equipment, including the creation of a Google account.
  • On 24 and 25 September, the inspection report was notified to Google LLC and Google France SARL.
  • On 28 September, the two companies also received the aforementioned complaints by courier from the CNIL.
  • On 2 October, Mr François PELLEGRINI was appointed rapporteur pursuant to Article 47 of the French Data Protection Act of 6 January 1978.
  • On 22 October, the rapporteur notified Google LLC and Google France of a report detailing shortcomings relating to Articles 6, 12 and 13 of the GDPR. The report proposed a fine of 50 million euros, to be made public.
  • On 7 November 2018, Google requested a hearing with the rapporteur.
  • On 13 November, the request was rejected.
  • On the same date, the company also requested that the meeting be postponed and held behind closed doors.
  • On 15 November, that request was rejected.
  • On 22 November, Google filed written observations on the report.
  • On 7 December, the rapporteur replied to those observations.
  • On 11 December, Google, which had fifteen days from receipt of the rapporteur's reply to respond, requested a postponement of the meeting and an extension of the deadline for producing new observations.
  • On 13 December, the request was granted: the deadline for filing those observations was pushed back by two weeks, to 7 January, and the meeting was postponed to 15 January 2019.
  • On 4 January, Google filed new observations in response to those of the rapporteur.
  • All observations were reiterated orally during the meeting of 15 January 2019.
  • On 21 January, the decision was made public.

On the decision itself, Google first argued that the French CNIL was not the competent Data Protection Authority.

Article 55(1) of the GDPR states: Each supervisory authority shall be competent for the performance of the tasks assigned to and the exercise of the powers conferred on it in accordance with this Regulation on the territory of its own Member State.

Article 56(1) of the GDPR provides: Without prejudice to Article 55, the supervisory authority of the main establishment or of the single establishment of the controller or processor shall be competent to act as lead supervisory authority for the cross-border processing carried out by that controller or processor, in accordance with the procedure provided in Article 60.

Google argued that the CNIL was not competent to conduct this procedure and that it should have transmitted the complaints received to the Irish Data Protection Commission (DPC), in its view the lead authority for dealing with these cross-border complaints in accordance with Article 60 GDPR, Google Ireland Limited being its main establishment in the EU for some of its cross-border processing, including the processing targeted by the complaints.

The initial discussion therefore turned on the competent authority, as determined by the company's main establishment. Google argued that Ireland was its main establishment; the CNIL responded that the designated administrative establishment is not the criterion. Based on Article 4(16) and Recital 36 of the GDPR, the main establishment must be identified on objective criteria and implies the effective and real exercise of management activities determining the main decisions as to the purposes and means of processing, through stable arrangements. The CNIL added that this determination must be analysed in concreto, according to objective criteria, so the main establishment does not necessarily correspond to the designated administrative headquarters. The CNIL based its position on the Article 29 Working Party guidance WP244 of 5 April 2017: the central administration is the place where the decisions on the purposes and means of processing are made.

Deciding otherwise would amount to forum shopping, says the CNIL. Google Ireland had no decision-making power. Google Ireland Limited is not mentioned in the company's Privacy Policy dated 25 May 2018 as the entity where the main decisions are made about the purposes and means of the processing covered by the privacy policy presented to the user when creating an account during the configuration of a mobile phone on Android. Nor had Google named any DPO in Ireland who would be in charge of the processing of personal data it might implement in the European Union. The CNIL further noted that the Android operating system is developed solely by Google LLC. Additionally, the company itself had indicated, by a letter of 3 December 2018 addressed to the DPC, that the transfer of responsibility from Google LLC to Google Ireland Limited for certain processing of personal data concerning European citizens would only be finalised on 31 January 2019. It later specified that it would update its privacy policy with effect from 22 January 2019.

Google's argument that an operational and organisational reorganisation was under way to make Google Ireland Limited the controller for certain processing of personal data concerning European nationals was, in my view, a mistake, as it only proved that at the time of the complaint Google Ireland had no decision-making power.

Google also argued that the definition of main establishment should be distinguished from that of controller, and that if the European legislator had intended the concept of main establishment to be interpreted as the place where the decisions on the processing are made, it would have said so expressly. That argument simply ignores the text of Recital 36.

In the absence of a main establishment allowing the identification of a lead authority, the one-stop-shop mechanism did not apply, and the CNIL was competent to initiate this procedure and to exercise all of its powers under Article 58 of the GDPR.

This is a very interesting analysis, determining the main establishment and the lead authority on objective criteria rather than on a subjective choice that would favour forum shopping. The decision dismisses the application of the GDPR's one-stop-shop by holding that Google Ireland Limited is not Google's main establishment in the EU, a rather strict interpretation of the concept of "main establishment". With no main establishment in the EU, Google LLC could potentially be subject to enforcement by any supervisory authority in the EU where Google has an establishment, including France.

Once the CNIL had confirmed its competence, it went into a detailed analysis of the transparency requirement and the legal basis of the processing. Hereafter, a free translation is offered:

2. On the Procedure

Google challenged the validity of the complaints. The CNIL responded that the question of the admissibility of the complaints could not affect the legality of the proceedings, as it could in any event act on its own initiative on the basis of the findings made by its services. The CNIL has a mission to supervise the application of the Regulation and to ensure compliance with it, and it has, for this purpose, the power to carry out investigations, in accordance with Article 57(1)(a) and (h) of the GDPR. In any case, the two associations had been mandated to lodge the complaints on behalf of data subjects under Article 80 of the GDPR.

Google's second argument, that the proceedings breached its right to a fair trial under Article 6 of the Convention for the Protection of Human Rights and Fundamental Freedoms because the report and the reply to its observations were addressed to it in French, sounded rather unlikely to succeed. The CNIL responded that any notification of a penalty report has to be in French, based on the legal obligation laid down in Article 111-1 of the Code of Relations between the Public and the Administration, which prescribes the use of the French language in exchanges between the public and the administration, in accordance with Law No. 94-665 of 4 August 1994 on the use of the French language. Google has an establishment on French territory with several hundred employees.

In view of these elements, the restricted committee considered that the company had, in any event, sufficient material and human resources to have the documents translated into English in time to become acquainted with them and to make observations within the deadline set for it.

This point shows why it is important that an organisation's representatives speak the language of the DPA.

Google also argued that the refusal to extend the time limit it had been given to produce its first observations limited the time available to prepare its defence, and that the adjournment of the meeting, as well as the additional time it was finally granted to produce its second observations, were still not sufficient.

3. On the scope of the investigations

Google argued first of all that the rapporteur had confused the Android operating system and the Google account, when they are separate services implementing different processing activities. It then argued that the scope of the inspection chosen by the CNIL – namely the creation of a Google account when setting up a new device using the Android operating system – is limited, in that it only represents a scenario affecting 7% of users. Finally, it indicated that the findings were made on an older version of the Android operating system.

The CNIL did not dispute the existence of separate services, relating respectively to the Android operating system and the Google Account, implementing different processing activities.

It observed, however, that the facts covered by the investigations corresponded to the scenario chosen for the online check, namely the journey of a user and the documents he could access during the initial configuration of his mobile equipment using the Android operating system. This journey included the creation of an account. The facts therefore related to the processing covered by the privacy policy presented to the user when creating an account during the setting-up of an Android mobile phone.

It then added that, while it is true that the user has the choice whether to create an account and can use some of the services without creating one, when setting up an Android device the option to create a Google account or to connect to an existing account naturally appears at the beginning of the set-up process, without any specific action by the user.

In any event, under Article 11.I.2 of the French Data Protection Act, the CNIL has broad discretion as to the scope of the inspections it may undertake. A specific inspection scenario, such as the one used in this case, can yield findings that reflect the privacy policy more generally. The share of the processing of personal data actually covered is therefore irrelevant, as is the version of the Android operating system, since it appears from the documents provided by Google that the user's journey is similar in the newer version.

4. On the breach of transparency and information obligations

Google infringed the GDPR for three main reasons: (1) lack of transparency (art. 5 GDPR); (2) insufficient information (art. 12 and 13 GDPR); and (3) invalid collection of consent as a lawful basis (art. 7 GDPR). The CNIL found that the information Google provides when an account is created is not easily accessible, so that consent is neither 'informed' nor explicitly obtained, and it criticised pre-ticked boxes, which the GDPR explicitly prohibits.

The transparency obligations follow from Article 12(1) GDPR, which provides: “1. The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. The information shall be provided in writing, or by other means, including, where appropriate, by electronic means. When requested by the data subject, the information may be provided orally, provided that the identity of the data subject is proven by other means.”

Article 13 (1) provides that: “1. Where personal data relating to a data subject are collected from the data subject, the controller shall, at the time when personal data are obtained, provide the data subject with all of the following information:

(a)    the identity and the contact details of the controller and, where applicable, of the controller’s representative;

(b)    the contact details of the data protection officer, where applicable;

(c)    the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;

(d)    where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party;

(e)    the recipients or categories of recipients of the personal data, if any;

(f)    where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.

Google first argued that the Privacy Policy and Terms of Use document, accessible when creating an account, constitutes first-level information in accordance with the Article 29 Working Party (WP29) Guidelines on Transparency under Regulation 2016/679 (WP260). It stated that this document provides a good overview of the processing implemented and that the legal basis of that processing does not have to be included in this first level of information. Information about how long data is kept appears in the section Exporting and Deleting Your Information in the Privacy Policy.

Google went on to argue that the information given to individuals must, in the light of Articles 12 and 13 of the Regulation, be assessed globally. In this regard, it stated that, in addition to the documents entitled Privacy Policy and Terms of Use, the Privacy Policy and the Terms of Use, information is also delivered through several other means. Additional information messages may appear when creating an account under each of the privacy settings. In addition, an e-mail is sent to the user when an account is created, stating that he can change the privacy and security settings of his Google account at any time and create reminders to check those settings. This e-mail contains clickable links to various settings tools.

These other control tools, made available to the user after the creation of his account from the account's management interface, include for example a Privacy Check-up tool that allows users to choose the privacy settings that suit them, covering among other things personalised ads, location history, and web and app activity.

Google also puts forward a Dashboard tool that allows users to have an overview of the use they make of the services offered by Google such as Gmail or Youtube.

Finally, Google recalled that when a user clicks Create Account without having disabled the personalised-ads settings, a confirmation pop-up appears, reminding him that the account is set to include personalisation features. The company indicated that the user journey is thus configured to slow down the progression of users who would not have spontaneously chosen more privacy-friendly settings.

The CNIL took note of the progress made in recent years by the company in its policy of informing users, towards the greater transparency and control over their data that users expect. For the reasons that follow, however, it considered that the requirements of the GDPR, whose implementation must be assessed in the light of the concrete scope of the processing of personal data in question, were not met.

Firstly, the CNIL recalled that pursuant to Article 12 of the Regulation, information must be provided in an easily accessible form. This accessibility requirement is clarified by the transparency guidelines, in which the Article 29 Working Party (WP29) considered that a key aspect of the transparency principle highlighted in these provisions is that the data subject should be able to determine in advance what the scope and consequences of the processing entail, so as not to be caught off guard at a later stage about how his personal data were used. […] In particular, with regard to complex, technical or unexpected data processing, the WP29's position is that […] controllers should separately and clearly spell out the main consequences of the processing: in other words, what the effect of the specific processing described in a privacy statement or notice will actually be for the data subject. The CNIL also recalled that the accessibility requirement of Article 12 depends in part on the ergonomic choices made by the controller.

In this case, the CNIL found that the general information architecture chosen by the company does not meet the requirements of the Regulation. The information that must be disclosed to individuals pursuant to Article 13 is excessively scattered across several documents: the Privacy Policy and Terms of Use displayed during the creation of the account, then the Terms of Use and the Privacy Policy, which are only accessible in a second step through clickable links in the first document. These documents in turn contain buttons and links that must be activated to learn additional information. Such an ergonomic choice leads to a fragmentation of information, forcing the user to multiply the clicks needed to access the different documents. The user must then carefully work through a large amount of information before being able to identify the relevant paragraph(s). The user's work does not stop there, since he will still have to cross-check and compare the information collected in order to understand which data are processed under the various settings he may have chosen.

The CNIL noted that, given this architecture, some information is difficult to find. For example, with regard to targeted advertising, to know what information is collected a user must perform many actions and combine several documentary resources. As a first step, he must read the general Privacy Policy and Terms of Use, then click the More Options button and then the Read More link to display the Customise Ads page. He thus reaches a first description of the processing relating to ad personalisation, which proves to be incomplete; completing the information relating to the data processed for this purpose requires yet further steps.

Similarly, for geolocation data, the CNIL noted that the same unintuitive journey is required of the user. He must complete the following steps: review the Privacy Policy and Terms of Use, click More Options, and then click the Learn More link to view the Location History page and read the displayed text. As this text is only a short description of the processing, however, the user must then go to the Privacy Policy document and open the section Information About Your Location to access the rest of the information.

In the two cases described, five actions are necessary for the user to access the information relating to the targeted advertisements and six for the geolocation.

The CNIL stated that if the user wishes to obtain information on the retention periods of his personal data, he must first consult the main Privacy Policy document, then go to the section titled Exporting and Deleting Your Information, and finally click on the click here hyperlink contained in a general paragraph on storage periods. It is therefore only after four clicks that the user reaches this information. The restricted committee further noted that the title chosen by the company, Exporting and Deleting Your Information, does not make it easy for the user to understand that this is where information about retention periods can be found.

As a result of all these factors, there is an overall lack of accessibility of the information provided by the company in the context of the processing in question.

Secondly, the CNIL considered that the clear and comprehensible nature of the information provided, as required by Article 12 of the GDPR, must be assessed taking into account the nature of each processing operation in question and its concrete impact on the persons concerned.

As a preliminary point, the CNIL considered it essential to emphasise that the data processing implemented by the controller is particularly massive and intrusive.

The data Google collects comes from a wide variety of sources. It is collected both from the use of the phone and from the use of the company's services, such as the Gmail e-mail service or the YouTube video platform, but also from the data generated by users' activity when they visit third-party sites that use Google services, such as Google Analytics cookies placed on those sites.

As such, the Privacy Policy reveals that at least twenty services offered by the company are likely to be involved in the processing, which may concern data such as web browsing history, history from applications, data stored locally on the equipment (such as address books), geolocation of equipment, etc. Therefore, a lot of data is processed as part of these services via or in connection with the Android operating system.

The record shows that, in addition to data from external sources, the company processes at least three categories of data:

  • data produced by the user (for example, name, password, phone number, e-mail address, payment method, and content created, imported or received, such as writings, photos or videos);
  • data generated by the user's activity (for example, IP address, unique user identifiers, mobile network data, data relating to wireless networks and Bluetooth devices, timestamps of actions performed, geolocation data, technical data about the devices used including sensor data (accelerometer, etc.), the videos viewed, the searches made, browsing history, purchases, applications used, etc.);
  • data derived or inferred from data provided by the user or generated by his activities. In this category, the Privacy Policy lists a number of purposes that can only be fulfilled by generating data from the other two categories. Thus, the personalisation of ads that the company carries out requires inferring users' interests from their activity in order to offer them to advertisers. In the same way, the purposes of providing personalised content, search results and recommendations require inferring new information from that declared, produced or generated by the user's activity.

Moreover, while the very large number of data items processed is in itself enough to characterise the massive and intrusive nature of the processing performed, the very nature of some of the data described, such as geolocation data or content consulted, reinforces this finding. Taken in isolation, the collection of each of these data items is likely to reveal with a high degree of precision many of the most intimate aspects of people's lives, including their lifestyle, their tastes, their contacts, their opinions or their movements. The combination of these data greatly reinforces the massive and intrusive nature of the processing.

Consequently, it is in the light of these particular characteristics of the processing of personal data, which have just been recalled, that the clear and comprehensible nature, within the meaning of Article 12 of the GDPR, of the information provided for in Article 13 must be assessed. The CNIL considered that these requirements were not met in this case.

Specifically, the CNIL noted that the information provided by Google did not allow the users to understand sufficiently the particular consequences of the processing.

In fact, the purposes announced in the various documents are described as follows: offer personalised services in terms of content and advertisements, ensure the safety of products and services, provide and develop services, etc. They are too generic in view of the scope of the processing implemented and its consequences. This is also the case when users are told, too loosely: The information we collect is used to improve the services offered to all our users. […] The information we collect and how we use it depends on how you use our services and how you manage your privacy settings.

Therefore, the CNIL noted that the description of the purposes pursued does not allow users to measure the extent of the processing and the degree of intrusion into their private lives that it is likely to entail. It considered, in particular, that such information is not provided in a clear manner, either at the first level of information provided to users, here the document entitled Privacy Policy and Terms of Use, or in the other levels of information proposed by the company.

The CNIL further noted that the description of the data collected, which could be of such a nature as to clarify the scope of these purposes and to prevent the user from being subsequently caught off guard as to the way in which his data were used and combined, is particularly imprecise and incomplete, both in the analysis of the first level of information and that of the other documents provided.

For example, the Privacy Policy and Terms of Use document and the Privacy Policy document specify: This may be more complex (…) information, such as the ads you find most useful, the people who interest you most on the web, or the YouTube videos that are likely to interest you.

In view of the foregoing, the CNIL held that the user is not able, in particular by reading the first level of information presented to him in the Privacy Policy and Terms of Use, to measure the scope of the main processing operations on his private life. While it noted that exhaustive information at the first level would be counterproductive and would not meet the transparency requirement, it considered that this first level should contain wording that gives an objective sense of the number and scope of the processing operations implemented. It further considered that other types of presentation, adapted to services that combine data, would make this possible.

The finding of a lack of clarity and comprehensibility must also be made with regard to the legal basis stated for targeted advertising. The company first states in the Privacy Policy: We ask you for permission to process your information for specific purposes, and you are free to revoke your consent at any time. For example, we ask you for permission to provide personalised services, such as ads. The legal basis chosen here therefore appears to be consent. However, further on, the company adds that it also relies on legitimate interest.

The CNIL stressed that, while Google had indicated beforehand that the only legal basis for the processing for targeted advertising was consent, the investigation showed that this clarification is not brought to the attention of users. The formulations recalled above do not allow users to clearly grasp the distinction between targeted advertising proper, based on the combination of multiple user data and resting, according to the company, on consent, and other forms of targeting, using for example the browsing context, which rest on legitimate interest. The CNIL stressed the particular importance of the need for clarity for this processing.

With respect to information on the retention period, the CNIL noted that the page describing how the information collected by Google is retained distinguishes four categories:

  • information retained until the user deletes it;
  • information that expires after a set period;
  • information retained until the user's Google Account is deleted;
  • information kept for longer periods for specific reasons.

It noted, however, that as regards the latter category, only very general explanations of the purpose of this retention are provided, and neither a precise duration nor the criteria used to determine it are indicated. This information is among that which must be provided to users pursuant to Article 13(2)(a) of the Regulation.

Finally, although the company claims that multiple information tools are made available to users both at and after the creation of their account, the restricted committee noted that these means do not make it possible to meet the transparency and information requirements of Articles 12 and 13 of the GDPR.

First, the CNIL noted that the tools to which the company refers do contribute, to a certain extent, to the goal of transparency throughout the life of the account and the use of Google's services. However, it considered that they do not contribute sufficiently to the information provided for in Article 13, which must be provided at the time the data are obtained. As mentioned in the WP29 Transparency Guidelines, Article 13 specifies the information to be provided to data subjects at the start of the processing cycle.

Although data other than those strictly necessary to create the account, such as browsing history or purchases, are collected throughout the life of the account, the moment of its creation marks the user's entry into the ecosystem of Google services, whose particularly massive and intrusive processing has been described above. This step marks the beginning of a multitude of processing operations: collection, combination, analysis, etc. Therefore, since the account-creation process is essential to understanding the processing and its impact, and since the proposed user journey itself invites the data subject to focus his attention at this stage, the information provided for in Article 13 must be delivered at that moment.

Moreover, both the pop-up window appearing at the time of the creation of the account and the e-mail sent as soon as the account is created contain only a summary or very specific information on the processing implemented, and cannot allow the prior information to be regarded as sufficient.

In fact, the text of the pop-up window indicates This Google Account is configured to include customisation features (such as recommendations and custom ads) that are based on the information stored in your account. The e-mail indicates the main features of the Google Account and the existence of control tools.

The Privacy Check-up tool essentially allows the user to configure the information collected, such as browsing history or places visited. Finally, the Dashboard consists of an information panel giving, for each Google service, an overview of the account holder's use.

Nevertheless, these Privacy Check-up and Dashboard tools, just like the e-mail mentioned above, can only be used after the account-creation step, which, as explained above, is the essential moment for informing users. In addition, although their existence and usefulness are brought to users' attention, they require an active approach and initiative on the users' part. For these reasons, these tools do not allow it to be considered that sufficient information is provided for the purposes of Article 13 of the Regulation.

In the light of all these elements, the CNIL's restricted committee considered that a breach of the transparency and information obligations provided for in Articles 12 and 13 of the Regulation is established.

5. Failure to provide a legal basis for the processing implemented

Article 6 of the GDPR states that: 1. Processing shall be lawful only if and to the extent that at least one of the following applies:

(a)    the data subject has given consent to the processing of his or her personal data for one or more specific purposes;

(b)    processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

(c)    processing is necessary for compliance with a legal obligation to which the controller is subject;

(d)    processing is necessary in order to protect the vital interests of the data subject or of another natural person;

(e)    processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

(f)    processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

Google was criticised for failing to validly collect individuals' consent for targeted advertising. The CNIL also considered that the company could not rely on legitimate interest for this processing.

In its defence, Google specified that it relied solely on consent for targeted advertising.

Article 4(11) of the Regulation defines consent as: any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.

Article 7 of the same Regulation provides the conditions that apply to it:

  • 1.   Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data.
  • 2.   If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.
  • 3.   The data subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. It shall be as easy to withdraw as to give consent.
  • 4.   When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia , the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.

Firstly, Google said that the users’ consent is informed.

It argued that simple and clear information is presented to the user when creating an account and allows him to understand how the company uses his data for personalisation purposes. In particular, the company referred to the summary entitled Privacy Policy and Terms of Use, the sections dedicated to targeted advertising in the Privacy Policy, and the additional information message entitled Targeted Advertising in the More Options menu of the account-creation settings.

Secondly, Google claimed that the user’s consent is specific and unambiguous.

In particular, it stated that when setting up the account, the user has the option of making a choice regarding the display of targeted advertising. It considered that this possibility enables the user to express consent to the use of his data independently of the choices he may express with regard to the other purposes of processing associated with the Google Account (e.g. YouTube search history).

It also considered that the procedures it has set up for obtaining consent for targeted advertising are in line with the CNIL's recommendation of 5 December 2013 on cookies. In particular, it noted that short information on targeted ads is available, followed by an "I accept" button (in the Privacy Policy and Terms of Use), preceded by a "More Options" button that gives users the ability to disable multiple processing operations, including those for targeted-advertising purposes.

It also argued that the approach presented in the public formal notice of the President of the CNIL No. MED-2018-023 of 29 November 2018 allows the user to consent to all purposes via an "accept all" button.

Lastly, it considered that explicit consent for the processing of data for the purposes of targeted advertising, within the meaning of Art 9 (2) (a) of the GDPR, could not be required in the absence of sensitive data.

Regarding informed consent and the dissemination of information

In the first place, the CNIL specified that this informed character must be examined in the light of the preceding developments concerning the lack of transparency and information of the users during the setting up of their account. It considered that the deficiencies previously identified necessarily have an impact on the information delivered to the users to ensure the informed nature of the consent.

The CNIL referred to the Article 29 Working Party Guidelines of 10 April 2018 on consent under Regulation 2016/679 (WP250), which state that the controller must ensure that consent is provided on the basis of information that enables the persons concerned to easily identify who the controller is and to understand what they are agreeing to. [It] must clearly describe the purpose of the data processing for which consent is sought.

These guidelines also state that: In order for consent to be informed, it is necessary to inform the person concerned of certain crucial elements in making a choice. [..] At least the following information is required in order to obtain valid consent:

  • the identity of the controller,
  • the purpose of each processing operation for which consent is sought,
  • the (types of) data collected and used,
  • the existence of the right to withdraw consent,
  • information on the use of data for automated decision-making [..] and
  • information on the possible risks associated with the transmission of data due to the lack of a decision on adequacy and appropriate safeguards […].

As it had already found in relation to the breach of the transparency and information requirements, the CNIL considered that the information on targeted advertising is excessively scattered across separate documents and is not, as such, easily accessible. In this respect, the CNIL referred to the earlier developments on the multiple actions that must be performed by a user who wishes to be informed about the processing relating to targeted advertising.

In addition, as the CNIL also noted under the breach of transparency obligations, the information provided was not sufficiently clear and understandable, in that it was difficult for a user to gain a global understanding of the processing operations in question, their object and their scope.

As an illustration, the information in the Targeted Ads section, accessible from the Privacy Policy and Terms of Use document through the More Options button, reads: Google can show you ads based on your activity in Google services (for example, in Search or on YouTube, as well as on Google's websites and partner applications). The CNIL noted that it is not possible to know, for example through clickable links, which Google services, sites and applications the company is referring to. The user is therefore unable to understand which targeted advertising he is subject to, or its scope, even though this processing involves a plurality of services (for example: Google Search, YouTube, Google Home, Google Maps, Play Store, Google Photos, Google Play, Google Analytics, Google Translate, Play Books) and the processing of a large amount of personal data. Users are not able to form a proper perception of the nature and volume of the data collected.

In view of these elements, the CNIL's restricted committee considered that users' consent to targeted advertising is not sufficiently informed.

With regard to the specific and unequivocal nature of the consent

Recital 32 of the Regulation provides that: Consent must be given by a clear positive act by which the data subject expresses in a free, specific, informed and unambiguous manner his agreement to the processing of his personal data […]. There can therefore be no consent in the event of silence, boxes checked by default or inactivity.

Recital 43 of the GDPR states that: Consent is presumed not to have been freely given if separate consent cannot be given to different personal data processing operations, although this is appropriate in the individual case.

The above-mentioned WP29 guidance on consent states that: In order to comply with the specific nature of consent, the controller must ensure: […] (ii) the granularity of consent requests […] This means that a controller who seeks consent for various specific purposes should provide a separate opt-in for each purpose, so that users can give specific consent for specific purposes.

In this case, the CNIL noted that when the user creates an account, he has the possibility to modify some of the parameters associated with it. To access these settings, the user must click the More Options button, which appears before the Create Account button. The CNIL also noted that the personalisation settings of the account, which contain the choice about the display of personalised ads, are pre-ticked by default, which implies, unless he indicates otherwise, the user's agreement to the processing of his data for the purposes mentioned (e.g. YouTube search history, display of personalised ads, etc.). The user has the option to untick these parameters if he does not want this processing to be implemented.

The CNIL observed that, at the time of the creation of the account, if the user does not click the More Options button to configure his account, he must tick the boxes I accept the terms of use of Google and I agree that my information will be used as described above and detailed in the Privacy Policy. He must then press the Create Account button. A pop-up window then appears, titled Simple Confirmation, containing the following text: This Google Account is set up to include personalisation features (such as recommendations and personalised ads), which are based on information stored in your account. To change your personalisation settings and the information stored in your account, click More Options.

If he does not click More Options, the user must select the Confirm button to complete the account creation.

In view of the foregoing, the CNIL noted that, while the user has the possibility to modify the configuration of his account's parameters prior to its creation, a positive action on his part is necessary to access those settings. Thus, the user can complete the creation of his account, and accept the related processing, including targeted advertising, without ever clicking More Options. The user's consent is therefore not validly collected in this case, since it is not given through a positive act by which the person consents specifically and distinctly to the processing of his data for targeted-advertising purposes as opposed to the other purposes of processing.

The CNIL also considered that the actions by which the user proceeds to the creation of his account – ticking the boxes I accept the terms of use of Google and I agree that my information will be used as described above and detailed in the Privacy Policy, and then selecting Create Account – cannot be regarded as the expression of valid consent. The specific nature of consent is not respected, because by these actions the user accepts in one go all the processing of personal data implemented by the company, including that for targeted advertising.

In addition, the CNIL noted that multiple technological processes are used by the company to combine and analyse data from different services, applications and external sources. These undeniably have a multiplying effect on the precise knowledge the company has of its users.

As a result, the restricted committee considered that the company carries out combination operations with almost unlimited potential, allowing massive and intrusive processing of users' data.

Given the scope of the processing in question – in particular that of targeted advertising – and the number of people concerned, the CNIL underlined that the deficiencies identified above are particularly serious. A lack of transparency regarding such wide-ranging processing, as well as the absence of valid user consent to targeted advertising, constitute substantial infringements of people's privacy and run counter to the legitimate aspirations of those who want to keep control of their data.

In this respect, strengthening the rights of individuals is one of the main thrusts of the Regulation. The European legislator recalls that the rapid evolution of technologies and globalisation have created new challenges for the protection of personal data, that the scale of the collection and sharing of personal data has increased significantly, and that technology allows both private companies and public authorities to make use of personal data on an unprecedented scale, transforming both the economy and social life (Recital 6). It stresses that these developments require a strong data protection framework, backed by rigorous enforcement of the rules, because it is important to create the trust that will allow the digital economy to develop across the internal market, and that individuals should have control over their personal data (Recital 7). The European legislator regrets, finally, that Directive 95/46/EC has not prevented the widespread public perception that significant risks to the protection of individuals remain, in particular with regard to the online environment (Recital 9).

The CNIL thus considered, in view of the scale of the processing carried out and the compelling need for users to keep control of their data, that users must be put in a position to be sufficiently informed of the scope of the processing implemented and to validly consent to it, failing which basic trust in the digital ecosystem is undermined.

Finally, the CNIL emphasised that the shortcomings must be viewed in light of the company's business model, in particular the place that the processing of users' data for advertising purposes via the Android operating system occupies in it. In view of the benefits it derives from this processing, Google must pay particular attention to its responsibility under the GDPR in its implementation.

It follows from all the foregoing, and from the criteria duly taken into account by the committee, in view of the maximum amount incurred on the basis of 4% of the turnover referred to in point 2 of the decision, that a financial penalty of 50 million euros is justified, together with an additional sanction of publication for the same reasons.

In determining the duration of that publication, account is also taken of the company's prominent place in the operating-system market, the seriousness of the deficiencies and the public-information value of this decision.

FOR THESE REASONS

The restricted committee of the CNIL, after having deliberated, decides:

to impose on Google LLC a financial penalty in the amount of 50 (fifty) million euros;

to send this decision to Google France Sarl for the execution of this decision;

Google has already publicly announced that it has appealed the decision. This is of course important, as the decision creates a precedent and establishes the rules.

This is only the start. La Quadrature du Net published: “Today, the CNIL sanctioned Google to a 50 million Euros fine, stating that the targeted advertising it engages in with its operating system Android is in breach of the GDPR, the new European regulation that came into effect on May 25. However, this sanction is really only the beginning of an answer to our complaint against Google, which denounced especially the targeted advertising imposed on Youtube, Gmail and Google Search in violation of our consent.

In this respect, the CNIL explains the low amount of its sanction, considering Google’s nearly 110 billion US dollars revenue, by the fact it limited the scope of its examination to “the data processing covered by the privacy policy presented to a user when creating their account on their Android mobile phone” (our translation). We therefore expect the CNIL to quickly answer the rest of our complaint, which concerns Youtube, Gmail and Google Search, by issuing a fine commensurate with this company and the extent and the duration of the violation of our rights (the maximum amount possible is 4 billion euros – 4% of global revenue –, which we hope for).”

This publication is licensed under a Creative Commons Attribution 4.0 International
More information on the licence can be found here.



 
