California Privacy Protection Agency Publishes Draft Cybersecurity Audit and Risk Assessment Regulations, Discusses at Public Meeting | WilmerHale


On Friday, September 8, the California Privacy Protection Agency (CPPA) held a public board meeting. The primary topic of discussion at this meeting was the Agency’s draft regulations on cybersecurity audits and risk assessments. These regulations — which were previewed at the Board’s previous meeting in July — were developed in part based on preliminary public comments that the Board solicited earlier this year.

The requirements imposed by these draft regulations would be substantial, but the Board is still grappling with how broadly they should apply. Much of the September 8 meeting centered on that question, as evidenced by Board members’ discussions of, for example, the applicability thresholds in the cybersecurity audit regulations and the expansive definitions of artificial intelligence and automated decisionmaking technology (ADMT) in the risk assessment regulations. Another takeaway from the meeting and the draft regulations is that the CPPA views AI and ADMT as areas warranting its regulatory oversight. The risk assessment regulations, for instance, impose additional requirements on businesses using or training AI and ADMT models, beyond those applicable to companies engaged in more routine personal information processing. And importantly, the CPPA is still working on a separate set of ADMT-specific regulations, a draft of which is likely to be released later this year.

In this post, we summarize notable features of the Board’s cybersecurity audit and risk assessment regulations and highlight key elements of the discussion surrounding these regulations at the September 8 Board meeting. 

Cybersecurity Audit Regulations

Notable elements of the Board’s draft cybersecurity audit regulations include: 

  • Applicability Thresholds: The cybersecurity audit regulations would apply to businesses whose “processing of consumers’ personal information presents significant risk to consumers’ security.” The draft regulations define the category of businesses “present[ing] significant risk to consumers’ security” to include businesses that derive at least 50% of their annual revenues from the sale or sharing of consumers’ personal information (essentially, data brokers). However, the draft regulations also propose several additional options for thresholds that would trigger the audit requirements, namely: (1) a revenue threshold paired with a personal information processing threshold (e.g., a company with annual revenues exceeding $25 million that processes the sensitive personal information of at least 100,000 consumers); (2) a standalone revenue threshold; or (3) a number-of-employees threshold. (An illustrative sketch of how these thresholds might operate appears after this list.)
  • Qualified and Independent Auditor: The regulations would require that cybersecurity audits be performed by “a qualified, objective, independent professional … using procedures and standards generally accepted in the profession of auditing.” Notably, these audits could be performed by an individual internal or external to the business, so long as they retain independence. 
  • Required Elements: The regulations require that cybersecurity audits document and assess a business’s implementation of a wide range of safeguards, including, for example: multi-factor authentication, encryption (at rest and in transit), zero trust architecture, account management and access controls, data inventories, secure hardware and software configurations, vulnerability scans and penetration testing, network monitoring and defenses, oversight of service providers and contractors, data retention policies, and incident response. If the auditor determines that a particular safeguard is not necessary for the business, they must “explain why the component is not necessary to the business’s protection of personal information and how the safeguards that the business does have in place provide at least equivalent security.”
  • Timing: Businesses would be required to conduct a cybersecurity audit within 24 months of the effective date of the regulations, with annual audits to follow thereafter. 
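
To make the proposed triggers concrete, the following is a minimal, illustrative Python sketch of how a business might evaluate the data-broker trigger and one of the alternative threshold options described above. The field names and the way the options are combined are assumptions for illustration only; the Board has not settled on final thresholds, and the draft regulation text controls.

```python
# Illustrative sketch only: evaluates the draft's proposed applicability
# thresholds for the cybersecurity audit requirement. Field names and the
# combination of alternative options are hypothetical assumptions; the Board
# is still debating the final thresholds.
from dataclasses import dataclass


@dataclass
class Business:
    annual_revenue: float                   # total annual revenue (USD)
    revenue_from_selling_sharing_pi: float  # revenue from selling/sharing consumers' personal information (USD)
    sensitive_pi_consumers: int             # consumers whose sensitive personal information is processed


def presents_significant_security_risk(b: Business) -> bool:
    """Return True if the business would plausibly trigger the draft audit requirement."""
    # Data-broker-style trigger: at least 50% of annual revenue from selling/sharing personal information.
    if b.annual_revenue > 0 and b.revenue_from_selling_sharing_pi / b.annual_revenue >= 0.50:
        return True
    # One proposed alternative: a revenue threshold paired with a processing threshold
    # (e.g., annual revenue over $25 million and sensitive personal information of 100,000+ consumers).
    if b.annual_revenue > 25_000_000 and b.sensitive_pi_consumers >= 100_000:
        return True
    return False


if __name__ == "__main__":
    example = Business(annual_revenue=30_000_000,
                       revenue_from_selling_sharing_pi=2_000_000,
                       sensitive_pi_consumers=150_000)
    print(presents_significant_security_risk(example))  # True under the paired-threshold option
```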

Risk Assessment Regulations 

Notable elements of the Board’s draft risk assessment regulations include:

  • Applicability Thresholds: The regulations would require a risk assessment from any business “whose processing of consumers’ personal information presents significant risk to consumers’ privacy.” The regulations then identify a series of activities that present such “significant risk[s],” including, for example, selling or sharing personal information, processing sensitive personal information, and using ADMT in furtherance of specified decisions. Notably, this list includes the processing of consumers’ personal information to train artificial intelligence or ADMT.
  • Minimum Risk Assessment Requirements: The regulations would require that risk assessments include, at a minimum: a summary of the relevant processing that poses a significant risk to consumer privacy; the categories of personal information involved in that processing; the context of the processing; the positive and negative impacts of the processing; the safeguards to be implemented to address those negative impacts; and the “operational elements of the processing” (e.g., the business’s “planned method for collecting, using, disclosing, retaining, or otherwise processing” the relevant information). (A sketch of how these components might be captured appears after this list.)
  • Additional Requirements for Businesses Using or Training ADMT/AI: The regulations would impose additional requirements on (1) businesses using ADMT; and (2) businesses processing personal information for the purposes of training ADMT or AI models. As to the former, the regulations would require “plain language explanation[s]” of various elements of the business’s use of ADMT, including the outputs of the ADMT process, the logic of the ADMT, the nature of human involvement in the business’s use of ADMT, and steps that the business takes to quality check the ADMT and its use of that technology. As to the latter, the regulations would require businesses that process personal information to train AI or ADMT and make those technologies available to other persons for their own use to “provide to those other persons a plain language explanation of the appropriate purposes for which the persons may use the [technologies],” and to document the provision of such information in its risk assessment. 
  • Broad Definitions of Artificial Intelligence and ADMT: The regulations’ provisions pertaining to AI and ADMT are notable in large part due to the regulations’ broad definitions of those terms. “Artificial Intelligence,” for example, is defined as “an engineered or machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions that influence physical or virtual environments.” “Automated Decisionmaking Technology,” meanwhile, is defined as “any system, software, or process … that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.”
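
For readers mapping these requirements onto internal documentation, the following is a minimal, illustrative Python sketch of a record capturing the minimum assessment components listed above. The field names and structure are hypothetical; the draft regulation text, not this sketch, defines what a compliant risk assessment must contain.

```python
# Illustrative sketch only: a simple record mirroring the minimum components the
# draft risk assessment regulations would require. Field names are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class RiskAssessment:
    processing_summary: str              # summary of the processing posing significant risk to consumer privacy
    personal_info_categories: List[str]  # categories of personal information involved in that processing
    processing_context: str              # context of the processing
    positive_impacts: List[str]          # anticipated benefits of the processing
    negative_impacts: List[str]          # anticipated harms of the processing
    safeguards: List[str]                # safeguards implemented to address the negative impacts
    operational_elements: str            # planned method for collecting, using, disclosing, retaining, or processing
    admt_explanations: List[str] = field(default_factory=list)  # plain-language ADMT/AI explanations, where applicable
```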

Discussion at the September 8 Board Meeting 

  • Applicability Thresholds for Cybersecurity Audit Regulations: The applicability thresholds for the cybersecurity audit regulations were a key topic of discussion. Board members seemed to generally agree on subjecting data brokers (i.e., entities deriving at least 50% of their annual revenue from selling or sharing personal information) to the requirements, but emphasized the need to define appropriate thresholds for the other proposed applicability triggers, given the substantial obligations that the audit requirements would impose. 
  • Breadth of AI/ADMT Definitions: A key area of discussion in the context of the risk assessment regulations centered on the regulations’ definitions of AI and ADMT. In response to critiques that these definitions were overly broad, Board Member Vinhcent Le observed that the definitions were drawn from various legal sources (e.g., NIST) and appropriately bounded by other provisions of the regulations that constrain the breadth of the definitions. 
  • Draft ADMT Regulations Forthcoming: Notably, the September 8 meeting featured only two of the three sets of regulations that the CPPA is currently developing. The third set will focus specifically on ADMT, and Board members stated that a draft version will likely be published in advance of the next Board meeting (in the November/December timeframe).
  • Next Steps for Cyber Audit and Risk Assessment Regulations: Board members stated that their goal is to have revised versions of the cybersecurity audit and risk assessment regulations ready for the next Board meeting.
