
Data Protection update – February 2022



Welcome to our data protection bulletin, covering the key developments in data protection law from February 2022.

Stephenson Harwood’s data protection hub is now live. The hub is a ‘go-to’ online resource where you can find all our data protection content in one place, including useful materials such as our handy pdf overview of the UK GDPR, guidance on data protection hot topics and our regular insights.

Data protection

Cyber security

Enforcement

Data protection

New Data Act proposed by EU

On 23 February 2022, the European Commission (“EC”) published its proposal for a Data Act (the “Act”) which aims to improve trust in data sharing and facilitate the sharing of industrial data between connected devices and devices on the Internet of Things (“IoT”). The EC hopes that the Act will help unlock the growth potential of the data economy (estimated by the EC to be worth €270 billion by 2028). As we reported in our January bulletin, the Act is part of a suite of measures within the European Strategy for Data following the political agreement on the European Data Governance Act. The Act is not focussed on personal data, but on the data generated by devices on the IoT and other connected devices, which currently generally passes to the manufacturer. If adopted by EU lawmakers, the Act will apply not only to manufacturers, providers and users of connected products and services placed on the market in the European Union, but also to data holders making data available to data recipients, public bodies, and data processors, where relevant.

The key proposals of interest in the Act are:

  • Granting owners and users of connected devices greater access to the data generated by those devices. That includes permitting the sharing of that data with other services, including analytics.
  • Certain contractual terms will be automatically deemed unfair when unilaterally imposed on micro businesses or SMEs, meaning that such terms will not be binding. As to whether this imposes unreasonably upon B2B contractual freedom, recital (52) of the Act explains that only terms unilaterally imposed on certain businesses (micro businesses and SMEs) will be subject to this unfairness test, and then only where the receiving party has attempted to negotiate them. This may mean manufacturers will have to consider the fairness of their standard terms and conditions where purchasers request negotiations.
  • The Act provides for interoperability standards to enable the re-use of data, including the introduction of the “FRAND” standard: where data holders are obliged to make data available, they must do so on fair, reasonable and non-discriminatory (FRAND) terms. The intention is to address the lack of harmonised standards by setting minimum essential requirements for smart contracts, with common specifications to be implemented through further legislation.
  • The Act extends to non-personal data the requirement, found in the General Data Protection Regulation (“GDPR”) in relation to personal data, to use safeguards against unlawful transfers of data outside the European Economic Area.

The Act will be monitored by a competent authority within each member state and shall apply from 12 months after the date of its entry into force, provided it passes the legislative processes within the EU.

ICO publishes third chapter of anonymisation and pseudonymisation guidance for consultation

The ICO has released the third chapter of its extended consultation into draft guidance on anonymisation, pseudonymisation and privacy enhancing technologies (the “Draft Guidance”). In our June 2021 bulletin we discussed the first chapter of the Draft Guidance (which introduces anonymisation) and in our October 2021 bulletin we discussed the second chapter (which discusses the effectiveness of anonymisation). The third chapter of the Draft Guidance focusses on pseudonymisation and explains the key differences compared to anonymisation.

“Pseudonymisation” is defined in UK data protection legislation as processing personal data in such a way that it can no longer be attributed to a specific data subject without additional information. The two pieces of information (the processed data and the additional information) can reconstruct the data when combined, but each has meaning only in combination with the other. The legislation adds that “unauthorised” reversal (i.e. the recombination of the two pieces of information) can result in harm, and so the risk of that harm must be mitigated appropriately.

The Draft Guidance also confirms that pseudonymised data is still personal data as it can identify a living individual, albeit indirectly. However, it does suggest that the pseudonymised data may no longer be personal data once transferred to another organisation without the key to re-identifying the individuals involved.

According to the Draft Guidance, the benefits of pseudonymisation are:

  • Risk reduction regarding individuals’ rights, and enhanced security. The technique limits the level of identifiability in the data to what is necessary and, in turn, reduces the amount of personal data shared. The Draft Guidance mentions the exemption from reporting breaches to affected data subjects under Article 34 UK GDPR, and says that pseudonymisation can form part of the broader technical and organisational measures which, if in place, may permit a data controller to avoid reporting a breach to affected individuals;
  • Supporting re-use of personal data, as a safeguard for the rights and freedoms of the data subjects;
  • Supporting overall compliance; and
  • Building trust and confidence in an organisation’s data processing.

The Draft Guidance also explains how an organisation should approach pseudonymisation: from defining goals and risks to techniques and evaluating outcomes. The consultation is open until 16 September and can be accessed here.

EDPB reviews use of cloud by public sector

The European Data Protection Board (“EDPB”) has begun its first action under the Coordinated Enforcement Framework by launching a review into the use of cloud-based services by the public sector (the “Review”). The Review will cover over 80 public bodies, which will be contacted by their local Supervisory Authority to assess compliance with data protection legislation. The Review does not displace individual investigations, and ongoing probes are not necessarily brought within the scope of the action; rather, it supplements targeted investigations currently being carried out by Supervisory Authorities into affected areas.

One of the key concerns of the EDPB is data transfers out of the EU, in particular to large cloud suppliers in the US, following the ruling in Data Protection Commissioner v Facebook Ireland Limited & Maximillian Schrems (Case C-311/18), as we reported here. The French data protection authority, the CNIL, has echoed this concern, suggesting that these cloud-based services have become essential technologies and so warrant additional attention.

The results, as well as any supervision and enforcement actions, will, although aggregated, give deep insight into the topic and allow follow-up at EU level. That insight is intended to streamline enforcement and cooperation among supervisory authorities. It also aims to “foster best practices to ensure adequate protection of personal data” by public sector bodies across the EU. The EDPB is expected to publish a state of play report on the Review before the end of this year. The EDPB’s press release can be found here.

EDPB issues guidance on breach notifications

The EDPB has published guidelines on “Examples regarding Personal Data Breach Notification” (the “Guidelines”). The Guidelines set out a number of example scenarios where it would be necessary for data controllers to provide a notification to a supervisory authority under Article 33(1) of the GDPR and, where relevant, to data subjects under Article 34(1) of the GDPR. The examples in the Guidelines are drawn from practice and grouped under common categories of breach (e.g. ransomware attacks, human error and lost or stolen devices), with associated mitigation and preventative steps for each scenario, along with notification obligations.

The Guidelines categorise data breaches according to the three key information security principles of confidentiality, integrity and availability of data, and explore how a breach can occur in each category:

  • “Confidentiality breach”: where there is an unauthorised or accidental disclosure of, or access to, personal data.
  • “Integrity breach”: where there is an unauthorised or accidental alteration of personal data.
  • “Availability breach”: where there is an accidental or unauthorised loss of access to, or destruction of, personal data.

Although the last category is typically the least harmful to data subjects, the Guidelines identify examples where it could result in a notification to a supervisory authority, for instance, where a health authority no longer has access to patient notes leading to a delay in treatment.

The Guidelines note that a variety of factors can be relevant to establishing when a risk is “high” to individuals but do not repeat the guidance on “likely to result in high risk” processing operations (further to the Article 29 Working Party Guidelines on Data Protection Impact Assessments here). Instead, additional risk factors are considered such as: (i) personal data is exfiltrated but not fully backed up, rendering data not recoverable, and therefore unavailable; (ii) personal data is not secured using state-of-the-art encryption and is therefore readily available; and (iii) personal data is not maintained and compromised data cannot be effectively recovered.

A key emphasis in the Guidelines is on accountability, encouraging every controller and processor to have plans and procedures in place for handling potential data breaches. This includes recommendations for regular training and awareness, and for ensuring that organisations have clear reporting lines and persons responsible for breach notification and data recovery processes.

The Guidelines are available to review here.

ABPI consults on health data use principles

The Association of the British Pharmaceutical Industry (“ABPI”) has opened a consultation (the “ABPI Consultation”) on draft principles for the use of health data. The ABPI says that there is “enormous potential” in the data collected and stored by the NHS to improve research and medicine. The ABPI, in its press release, explains that access to data was invaluable to the effectiveness of the NHS’s response to COVID-19, from vaccines to treatment. The ABPI Consultation looks to build upon these lessons to improve the resilience of the NHS’s ecosystem.

The five principles consulted on in the ABPI Consultation are:

  • Transparency of purpose, which aims to improve clarity and openness about how health data is used by researchers, the expected benefits of the research and the management of risk in the research;
  • Clarity of arrangements, ensuring contractual arrangements with custodians are designed to return ‘fair value’ to both parties;
  • Patient and public involvement and engagement, supporting the trend towards involving patient and public representatives in the design and approval of health data projects;
  • Non-exclusivity of arrangements, supporting the idea that datasets should be available for analysis by any bona fide researchers; and
  • Compliance with laws and regulations.

The ABPI Consultation is open until 11 March 2022 and can be completed here.

Cyber security

UK Government publishes cyber-security review

On 19 January 2022, the UK government published, as part of the new National Cyber Strategy which we reported on last month, a review (the “NCS Review“) of: (i) the robustness of UK companies’ measures to defend against cyber-attacks; and (ii) progress made in cyber-resilience since 2016, against the National Cyber Security Strategy (2016-2021).

The NCS Review highlights the implementation of key primary legislation enacted since 2016, including the UK GDPR, the Data Protection Act 2018 and the Security of Network and Information Systems Regulations 2018. The NCS Review says that, despite the overall positive impact these pieces of legislation have had on organisations and on the management of cyber risk, there are areas where improvements are needed. In particular, it notes that the previous approach of the UK Government, and of the marketplace, has not delivered the change needed to keep pace with the risk, and specifically identifies application security, network security, supply chain risk management and continuity planning as areas for improvement. These areas, further complicated by the increase in interconnected and IoT devices, broaden the profile of cyber risk because of companies’, and the wider UK economy’s, increased use of and dependence on digital services.

A key theme throughout the NCS Review is the perceived lack of commercial rationale for investment in cyber security by organisations. Research quoted in the NCS Review found that almost half of businesses did not see cyber security as a priority or thought it was unlikely they would be a target. That is a somewhat surprising conclusion given that, according to the Cyber Security Breaches Survey 2021, two in five businesses reported a cyber breach in the last 12 months, and most cyber breaches have been unsophisticated attacks relying on a lack of employee awareness.

The desired outcomes of the NCS Review are:

  • Organisational awareness and accountability, including understanding why UK Government advice (such as the “Cyber Aware” campaign) is not reaching a sufficient audience;
  • Organisational resilience, including increasing the general uptake of cyber security schemes (such as “Cyber Essentials”);
  • Increased resilience among essential and digital services to avoid disruption to essential services from cyber-attacks; and
  • Clear information about the cyber security profession, to ensure assessment of competence is consistent allowing employers to have more confidence when hiring practitioners.

FCDO subject to significant cyber-attack

On 4 February 2022, the Foreign, Commonwealth and Development Office (“FCDO“) published a public tender document which reportedly confirms that the FCDO was the subject of a significant cyber-attack. The tender document said that BAE Systems Applied Intelligence had been awarded a contract worth almost half a million pounds without a competitive tender process due to the “extreme urgency brought about by events unforeseeable by [FCDO]“. It was reported that when the contract award was announced, details (which have subsequently been removed from the FCDO’s announcement) of the tender said that the contract was for “business analyst and technical architect support to analyse an authority cyber security incident“. Further reporting by the BBC said that the FCDO systems were accessed inappropriately by hackers but such access was detected. There is no indication that any highly sensitive material was breached. Detecting, along with disrupting and deterring, adversaries was one of the five pillars of the UK Government’s new National Cyber Strategy, which we reported on last month.

Enforcement

Further decisions find that websites’ uses of Google Analytics and Google Fonts breach GDPR

Hot on the heels of the decision of the Austrian Data Protection Authority relating to the use of Google Analytics, which we covered in last month’s update, the Commission Nationale de l’Informatique et des Libertés (the “CNIL”) has now found that an unnamed local website’s use of Google Analytics also breached Article 44 GDPR (which governs transfers of personal data to third countries). In short, the CNIL found that:

  • personal data was being transferred to Google as the combination of Google’s Client ID with several other data (e.g. address of the site visited, metadata about the browser and operating system, time of visit, IP address) made the website’s visitors identifiable; and
  • as: (1) the US does not enable non-US citizens to know how their personal data is used or acquired, nor provide appropriate redress where non-US citizens’ personal data is misused; and (2) the standard contractual terms relied on by Google in relation to international data transfers are insufficient to prevent US intelligence services from accessing personal data, there were not “equivalent privacy protections in place” and Article 46 GDPR had not been complied with.

The CNIL has ordered the website to take steps to comply with GDPR forthwith, and, if necessary, will order the website to stop using Google Analytics.

Separately, a regional court in Munich has found that an unnamed website disclosed a data subject’s IP address to Google through its use of Google Fonts, a font embedding service library which allows developers to add fonts to their apps and websites, thereby transferring that data to the US. As the user had not consented to either the processing or the transfer, the website owner was held to be in breach of Article 6(1) GDPR.

Meta faces impending suspension order in relation to data transfers

The Irish Data Protection Commission (the “Irish DPC“) has reached a preliminary decision that data transfers between the EU and the US, that Meta carries out, are in breach of Article 44 GDPR. Meta has been given 28 days to respond to this revised decision before it is sent to the European Data Protection Board pursuant to Article 60. It has been reported that the Irish DPC has “asked for legal submissions” and is therefore “not a final decision”.

This decision, which is somewhat less pro-data controller than many other decisions of the Irish DPC, further adds to the challenges faced by organisations in making compliant transfers of personal data to third countries, particularly the US, including as a result of the decisions relating to Google Analytics and Google Fonts highlighted above.

Interactive Advertising Bureau Europe (“IAB Europe“) fined €250,000 by Belgian DPA for GDPR breaches in relation to the Transparency and Consent Framework

IAB’s Transparency and Consent Framework (the “TCF”) facilitates the management of users’ preferences for online personalised advertising and plays a key role in real time bidding (“RTB”). The TCF operates by a consent management platform popping up when first-time users access a website or application. This pop-up acts as an interface through which users can either consent to the collection and sharing of their personal data, or object to various types of processing based on the legitimate interests of ad tech vendors. The TCF facilitates the capture of these specific preferences in the form of a code which, alongside cookies, is then placed on the user’s device (the “Code”). This Code can be linked to the user’s IP address, and the user’s preferences are identifiable when combined with their IP address, which facilitates the RTB process by allowing advertising to be targeted to users.
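Conceptually, the Code is a compact, machine-readable record of per-purpose preferences that downstream RTB participants decode. The following toy sketch illustrates that idea only; it is not the real IAB TC String format, and the field names are invented for illustration:

```python
import json
import base64

def encode_preferences(consented_purposes: set[int],
                       li_objections: set[int]) -> str:
    """Pack a user's choices into a compact, URL-safe code that can be
    stored on the device alongside cookies (toy format, not the TC String)."""
    payload = {"consent": sorted(consented_purposes),
               "li_objections": sorted(li_objections)}
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()

def decode_preferences(code: str) -> dict:
    """Downstream participants read the code to determine what processing
    the user has permitted or objected to."""
    return json.loads(base64.urlsafe_b64decode(code.encode()))

# User consents to purposes 1, 3 and 4, and objects to legitimate-interest
# processing for purpose 2.
code = encode_preferences({1, 3, 4}, {2})
prefs = decode_preferences(code)
```

The Belgian DPA’s finding that the Code constitutes personal data turns on exactly this kind of mechanism: once such a code is combined with an IP address, the recorded preferences become attributable to an identifiable user.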

Despite IAB Europe’s arguments to the contrary, the Belgian DPA found that:

  • the Code constitutes personal data;
  • IAB Europe acted as a data controller in processing users’ consent, objections and preferences comprised in the Code, notwithstanding that it did not itself access users’ personal data; and
  • IAB Europe breached GDPR in four ways:

– there was a lack of any legal basis for its processing of certain personal data;

– where there was a purported legal basis for processing personal data, this could not be relied on as: (1) there was inadequate information provided to users through its consent management platform to permit it to rely on user consent (in essence, it was determined that the RTB ecosystem is so extensive that it would be impossible for data subjects to give an informed consent to the processing of their personal data, and the transparency requirements at Articles 12-14 GDPR could not be met); and (2) the legitimate interest basis did not apply as users’ interests outweighed those of participants in RTB;

– infringing accountability, security and privacy by design measures (including in the context of international transfers); and

– breaching of various controller obligations such as appointment of a data protection officer, maintaining records of processing and conducting data protection impact assessments.

In addition to imposing a fine of €250,000 on IAB Europe, the Belgian DPA, whose decision was approved by over 20 other Supervisory Authorities, ordered IAB Europe to carry out corrective measures within six months, including: (i) establishing a valid legal basis for processing and sharing personal data; (ii) prohibiting any use of “legitimate interest” as a basis for processing; and (iii) maintaining a strict audit of organisations joining the TCF to ensure that they are in full compliance with the GDPR, in particular, in the context of securing consent to processing users’ personal data.

IAB Europe intends to appeal the Belgian DPA’s decision.

The Information Commissioner’s Office issues an enforcement notice to the Ministry of Justice and a reprimand to the Scottish Government and NHS National Services Scotland

The Information Commissioner’s Office (the “ICO”) has issued an enforcement notice to the Ministry of Justice (“MOJ”) arising out of its failure to comply with a large number of data subject access requests (“SARs”) without undue delay, in breach of Article 15 of the EU and UK GDPR.

After a series of communications and meetings between the ICO and the MOJ, it was concluded on 27 August 2021 that there were 7,753 overdue SARs (25 had received no response and 7,728 had received only a partial response). Whilst the ICO made it clear that it appreciated the difficulties in dealing with SARs, particularly where a response is dependent on the provision of manual and electronic information, and in light of the difficulties caused by the pandemic, the substantial number of outstanding SARs was unacceptable. The ICO required the MOJ to remedy the situation by no later than 31 December 2022.

Separately, the ICO has reprimanded the Scottish Government and NHS National Services Scotland (“NHS Scotland“) in relation to compliance issues relating to the NHS Scotland Covid Status App (the “App“).

The ICO identified that:

  • the privacy notice in respect of the App was not contained within, or linked to from, the App, nor could it be located online; it was only after this point was raised on a call that the link was provided; and
  • although the privacy notice covered the App, the NHS Inform website’s COVID certificate service, and the provision of paper copies by the National Contact Centre and the National Vaccination Scheduling Service, it was “complex, not easily accessible, unnecessarily long and difficult to navigate”.

Accordingly, the Scottish Government and NHS Scotland were in breach of: Article 5(1)(a) of the UK GDPR, which requires that personal data, including special category data, be processed lawfully, fairly and transparently; and Article 12, which pertains to the provision of clear information about the processing of personal data.

The ICO’s reprimand recommended that the privacy notice be redrafted in compliance with Articles 12 and 13, and a copy thereof provided to the ICO by 28 March 2022.

Enel Energia fined €26.5 million for GDPR breaches relating to its telemarketing practices

The Italian data protection authority (“Garante”) has found that, in addition to hundreds of complaints concerning unsolicited sales calls made on Enel Energia’s behalf without customer consent, Enel Energia had persistently targeted users not listed in the phone directory or those who had opted out of sales promotions altogether. It was also found that customers’ requests to access their personal data and/or to object to the processing of their data were not responded to promptly, and in some instances the feedback went missing altogether. Garante found this conduct to be in violation of Articles 5(1)(a), 5(1)(d), 5(2), 6(1), 12, 13, 21, 24, 25(1), 30, and 31 GDPR.

In addition to fining Enel Energia €26.5 million, Garante has required Enel Energia to put in place several measures to ensure compliance with GDPR including ensuring that:

  • all processing of data by its sales network is brought into compliance, with appropriate measures put in place to ensure promotional schemes are activated only with listed numbers; and
  • technical and organisational measures are in place to handle requests by data subjects to exercise their rights, including the right to object to processing for promotional purposes.

Two mobile telecommunications companies fined a total of €9,250,000 by the Hellenic Data Protection Authority

COSMOTE and its parent company OTE were fined €6,000,000 and €3,250,000 respectively by the Hellenic Data Protection Authority (“Hellenic DPA”) arising out of a data breach resulting from unauthorised access to their systems.

The breach concerned a file comprising 30GB of personal data, namely the subscriber data of millions of people, including their phone numbers, IMEI, age, gender, and subscription plan. This file was exfiltrated from COSMOTE’s systems in a hack, which was subsequently notified to the Hellenic DPA. The ensuing investigation identified that:

  • COSMOTE was in violation of Article 35(7) GDPR for failing to carry out a data protection impact assessment, Article 5(1) GDPR for non-compliance with the principle of transparency, and Article 25(1) GDPR for failing to anonymise user data; and
  • OTE was found in violation of Article 32 GDPR for failing to ensure that COSMOTE had in place appropriate technical and organisational measures to prevent the unauthorised access to its systems.

Spanish data protection authority imposes a fine of €3,940,000 on Vodafone for inadequate security measures

The Spanish data protection authority (“AEPD”) received several complaints of fraud arising out of the replication of Vodafone SIM cards. It was alleged that replicas of the complainants’ SIM cards had been obtained through Vodafone, though Vodafone could not verify the identity of the individuals requesting them. Those SIM cards were then used to carry out bank transfers from the complainants’ bank accounts and to access various online banking services.

The AEPD determined that Vodafone was unable to verify the identities of those requesting the SIM cards, nor the invoices issued, and therefore did not have appropriate technical and organisational measures in place to mitigate the risk of identity theft. Consequently, the AEPD concluded that Vodafone was in breach of Articles 5(1)(f) and 5(2) GDPR and fined Vodafone €3,940,000.

Further decision demonstrates that individual claims arising from data breaches ought not to be pursued in the High Court

In a further decision building on the jurisprudence on which we have previously reported here, in Stadler v Currys Group Ltd [2022] EWHC 160 (QB), HHJ Lewis (sitting as a Judge of the High Court) found that a claim seeking damages of up to £5,000 should have been pursued in the Small Claims Court rather than the High Court, which was never an appropriate forum for it. In particular, he held:

“the defendant explained to the claimant’s solicitor that judges of this court have made clear recently that these sorts of modest value claim are not suitable for the High Court, or indeed the multi-track. It also drew the claimant’s attention to the obligation on ensuring cases are justly and proportionately managed in accordance with the overriding objective. It is regrettable that the claimant did not take on board this sound advice.”

This developing line of jurisprudence is likely to spell the end of the cottage industry which had developed around the pursuit of such claims: if they must be pursued in the Small Claims Court, where costs are irrecoverable, the claimant firms who had been the primary drivers of such claims are unlikely to be interested in pursuing them.

The Privacy Collective loses first skirmish in €11 billion class action claim against Salesforce and Oracle

The Privacy Collective (“TPC“) is a Dutch foundation that launched a class action against Salesforce and Oracle alleging that they had unlawfully collected and processed personal data in breach of the GDPR and seeking damages of €11 billion. This was TPC’s first class action under the Dutch Act on Mass Damages Settlement in Class Actions (Wet Afwikkeling Massaschade in Collectieve Actie or “WAMCA“), and the first class action which had been pursued under WAMCA in relation to breaches of GDPR.

WAMCA imposes certain admissibility requirements for class actions. This includes the requirement that the representative claimant, in this case TPC, represents “a sufficiently large proportion of the injured parties”. To meet this requirement, TPC submitted that over 75,000 individuals had opted to support this class action by clicking a “support” button on a Facebook page. This button was next to general text that stated that large amounts of monetary compensation could be claimed from “two large internet companies”, and that the claim concerned mass collection and sale of data belonging to millions of Dutch people who had not consented thereto.

The Court held that the evidence TPC had presented was insufficient to meet the representative requirement as, amongst other things: (i) insufficient information about the case was given to visitors to the Facebook page; (ii) it was insufficiently clear whether those who clicked the “support” button were injured parties; and (iii) TPC had failed to demonstrate sufficient ability to communicate with the class of affected data subjects.
