California Unveils New Draft Requirements for Privacy Risk Assessments, Cybersecurity Audits and AI | BakerHostetler

On August 29, 2023, the California Privacy Protection Agency (CPPA) Board unveiled Draft Regulations on Risk Assessment and Cybersecurity Audit. The CPPA Board announced that the formal rulemaking process will begin soon; we anticipate it may begin shortly after the public Board meeting on September 8, 2023.

While the rulemaking process continues, the Draft Regulations provide a glimpse into future requirements on businesses. We provide a high-level summary of what is new and what is likely to have the most impact on business operations.

Draft Risk Assessment Regulations

Like the EU GDPR and Colorado Privacy Act’s requirements on data protection impact assessments, the California Draft Regulations create significant compliance obligations for businesses. In addition, the Draft Regulations provide new definitions of “artificial intelligence”[1] and “automated decision-making technology”[2]—two types of processing activities which will trigger the need for such a risk assessment. In particular, the Draft Regulations:

  1. Provide examples of processing activities that pose a significant risk to consumers’ privacy and therefore necessitate a risk assessment, including:
     1. Selling or sharing personal information.
     2. Processing sensitive personal information.
     3. Using Automated Decisionmaking Technology in furtherance of a decision that results in the provision or denial of certain financial services, employment or access to essential goods, services, or opportunities.
     4. Processing the personal information of consumers that the business has actual knowledge are less than 16 years of age.
     5. Processing personal information to monitor employees, independent contractors, job applicants, or students.
     6. Processing the personal information of consumers in publicly accessible places using technology to monitor behavior, location, movements, or actions.
     7. Processing the personal information of consumers to train artificial intelligence or Automated Decisionmaking Technology.
  2. Give illustrative examples of when businesses are obligated to perform risk assessments. For example, a technology provider that processes consumers’ photographs and extracts faceprints from them to train facial-recognition technology must conduct a risk assessment because it seeks to process consumers’ personal information to train AI. Likewise, a personal-budgeting application that processes consumers’ income information to target those consumers with payday-loan ads on other websites must conduct a risk assessment because the business is “sharing” personal information.
  3. Enumerate ten categories of information required to be included in risk assessments. Businesses that have conducted risk assessments under the GDPR or the Colorado Privacy Act will be familiar with many of these requirements, including the categories of data to be processed and the purpose for the processing, as well as an identification of the risks and benefits of the processing and plans to mitigate such risks. Businesses must also describe “consumers’ reasonable expectations concerning the purpose for processing their personal information, or the purpose’s compatibility with the context in which their personal information was collected.”
  4. Enumerate additional requirements for businesses that use Automated Decisionmaking Technology or that process personal information to train AI or Automated Decisionmaking Technology. Specifically, the California Regulations would require plain-language explanations of the logic of the AI technology, including any assumptions of the logic; the degree and details of any human involvement in the business’s use of AI; and any safeguards the business plans to implement to address the negative impacts to consumers’ privacy that are specific to its use of Automated Decisionmaking Technology or to data sets produced by or derived from the Automated Decisionmaking Technology.
  5. Emphasize stakeholder involvement in risk assessments: businesses are required to involve all individuals responsible for preparing, contributing to, or reviewing the risk assessment, including product, fraud prevention, or compliance teams. Notably, the Board is considering requiring risk assessments to specify, among other things, the names and titles of internal actors and external parties contributing to the assessment; the names, positions, and signatures of the individuals responsible for the review and approval of the assessment; and the name and title of the highest-ranking executive with authority to bind the business who is responsible for oversight of the business’s risk-assessment compliance, along with a signed certification that the executive has reviewed, understands the contents of, and approved the risk assessment. These requirements are similar to the New York Department of Financial Services cybersecurity filing requirements, which include a certification of compliance by the covered entity’s board or senior officers.
  6. Mandate that businesses engaged in processing activities posing a significant risk to consumers’ privacy submit risk assessments to the CPPA or the California Attorney General on request, and submit to the CPPA annually both the business’s risk assessments in abridged form and a certification by a designated executive that the business has complied with the requirements set forth in the regulations. The timing of these reporting and certification requirements is expected to be discussed and finalized during the next Board meeting.

The Draft Regulations permit businesses to conduct a single risk assessment for a “comparable set of processing activities”—i.e., those processing activities that present similar risks to consumers’ privacy. For example, a toy store is considering using in-store forms to collect children’s names, addresses, and birthdays to use that data to mail promotional items to those children during their birth month and every November. The store uses the same service providers and technology for each category of mailings across all stores. The toy store will need to conduct a risk assessment under these new California Regulations because it is processing personal information of consumers under 16 years of age. However, the store may use a single risk assessment for processing the personal information for the birthday mailing and November mailing across all stores because in each case it is collecting the same personal information in the same way for the purpose of sending promotions to children and this processing presents similar risks to consumers’ privacy.

The Draft Regulations also alleviate the need to conduct a duplicative risk assessment if the business has already conducted and documented a risk assessment under another law or regulation that meets all the requirements of the CCPA Regulations. No examples were given of how a business may take advantage of this exception, and this topic may see further discussion as part of the rulemaking process.

Draft Cybersecurity Audit Regulations

The Draft Regulations relating to cybersecurity audits require businesses that process consumers’ personal information in a manner posing a significant risk to consumers’ privacy or security to conduct a yearly cybersecurity audit. Specifically, the Draft Regulations:

  1. Require every business whose processing of consumers’ personal information presents “significant risk to consumers’ security” to complete a cybersecurity audit. Processing is deemed to be a “significant risk to consumers’ security” if the business meets the threshold set forth in Civil Code section 1798.140, subdivision (d)(1)(C)—i.e., it derives 50 percent or more of its annual revenues from selling or sharing consumers’ personal information in the preceding calendar year or the business meets one of three thresholds currently under CPPA consideration.[3]
  2. Contain comprehensive requirements for conducting cybersecurity audits, including the name(s) and title(s) of the qualified employee(s) responsible for the business’s cybersecurity program, and the date that the cybersecurity program and any evaluations thereof were presented to the business’s board of directors or governing body or, if none exists, to the highest-ranking executive responsible for the business’s cybersecurity program. The audit must describe, among other things, the safeguards the business uses to protect personal information from internal and external risks. The Draft Regulations also specify auditor qualifications to ensure such audits are conducted in an independent and objective manner. Finally, the Draft Regulations present two options, which the Board will consider during the next Board meeting, for what a business must document to show that it protects against negative impacts to consumers’ security (e.g., unauthorized access and economic, physical, psychological, and reputational harm).
  3. State that a business shall have 24 months from the effective date of the regulations to complete its first cybersecurity audit. Thereafter, cybersecurity audits shall be completed annually, and there shall be no gap in the months covered by successive cybersecurity audits.
  4. Mandate that businesses required to complete a cybersecurity audit notify the CPPA of their compliance status, either through a written certification of adherence to regulatory requirements over a 12-month audit period, or by acknowledging non-compliance, identifying areas of non-compliance, and providing a timeline for remediation or confirmation of completed remediation.
  5. Require that service providers and contractors, with respect to personal information collected pursuant to their written contracts with the business, cooperate in the completion of the business’s cybersecurity audit by, among other things, making available to the business’s auditor all relevant information necessary for the auditor to complete the cybersecurity audit.

What’s Next?

During the September 8 Board meeting, the Board will discuss the following agenda items:

  1. The extent to which internal actors responsible for the risk assessment should be identified in those assessments;
  2. The frequency with which businesses would be required to review and update risk assessments;
  3. Thresholds for what types of businesses and activities present significant risk to consumers’ security; and
  4. Documentation requirements for cybersecurity audits.

[1] For purposes of the Draft Regulations, Artificial Intelligence “means an engineered or machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations, or decisions that influence physical or virtual environments. Artificial intelligence includes generative models, such as large language models, that can learn from inputs and create new outputs, such as text, images, audio, or video; and facial or speech recognition or detection technology.”

[2] For the Draft Regulations, Automated Decision-Making Technology means “any system, software, or process—including one derived from machine-learning, statistics, other data-processing techniques, or artificial intelligence— that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking. Automated Decisionmaking Technology includes profiling. “Profiling” means any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”

[3] The three options are as follows:

  • Option 1: The business meets the threshold set forth in Civil Code section 1798.140, subdivision (d)(1)(A) [“As of January 1 of the calendar year, had annual gross revenues in excess of twenty-five million dollars ($25,000,000) in the preceding calendar year. . . ”]; and (A) processed the personal information of [TBD / one million or more consumers or households] in the preceding calendar year; or (B) processed the sensitive personal information of [TBD / 100,000 or more] consumers in the preceding calendar year; or (C) processed the personal information of [TBD / 100,000 or more] consumers that the business had actual knowledge were less than 16 years of age in the preceding calendar year.
  • Option 2: The business has annual gross revenues in excess of [TBD].
  • Option 3: The business had more than [TBD] employees.
