On 23 September 2025, the California Privacy Protection Agency quietly did something that will reshape privacy governance across the United States for years to come. The state’s Office of Administrative Law approved an extensive package of regulations under the California Consumer Privacy Act (CCPA), updating existing rules and introducing entirely new regimes for cybersecurity audits, risk assessments and automated decision-making technology.
Most of these rules begin to apply from 1 January 2026. Others phase in over 2027 and 2028, with the final filing deadlines stretching into 2030. Together, they move the CCPA away from being “just” a consumer privacy law and turn it into a fully-fledged system of accountability, with executive attestations, granular documentation and personal responsibility for individuals inside the business.
Boards and executive teams need to understand that the CCPA now requires named individuals to sign filings under penalty of perjury. Those individuals will need to be chosen carefully, supported with training and backed by internal sub-certification processes.
A new phase of CCPA: from consumer rights to corporate accountability
At a high level, the 2026 rules do three important things.
First, they strengthen and clarify the existing CCPA rights and obligations around transparency, opt-outs, sensitive personal information and dark patterns. This is the most visible piece for consumers: clearer notices, stronger protections for minors, cleaner interfaces and more credible rights journeys.
Second, they introduce a deep governance layer. Businesses will now have to perform structured risk assessments for high-risk processing and undergo cybersecurity audits if they meet certain thresholds. These activities are no longer internal hygiene exercises; they are reportable, reviewable and subject to regulator scrutiny.
Third, they bring individual accountability into the heart of privacy compliance. Risk assessments and cybersecurity audit certifications filed with the California Privacy Protection Agency must be signed by named members of the executive management team, and those individuals sign under penalty of perjury. The days of anonymous accountability for CCPA compliance are over.
What really changes on 1 January 2026?
From the start of 2026, a set of updated CCPA rules comes into force that most organisations will feel immediately in their day-to-day privacy operations.
The first visible change is mandatory opt-out confirmation. When a consumer opts out of the sale or sharing of their personal information, the business must confirm that the request has been processed. This applies whether the consumer uses on-site controls or an opt-out preference signal such as Global Privacy Control. In practice, this means websites, apps and portals will need to show some form of “opt-out honoured” message or a clear status indicator in privacy settings, rather than silently adjusting a flag in the background.
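To make this concrete, here is a minimal sketch, in TypeScript, of how a site might detect an opt-out preference signal and surface a visible confirmation. The Global Privacy Control signal is exposed to scripts as navigator.globalPrivacyControl (servers see it as a Sec-GPC: 1 request header); the recordOptOut and showBanner helpers are hypothetical application functions, not part of any standard.

```typescript
// Sketch: honouring an opt-out preference signal and surfacing confirmation.
// `recordOptOut` and `showBanner` are hypothetical application functions.

interface OptOutResult {
  honoured: boolean;
  source: "gpc-signal" | "user-control";
  timestamp: string;
}

// Hypothetical persistence call: flags the consumer as opted out of sale/sharing.
declare function recordOptOut(consumerId: string, source: OptOutResult["source"]): Promise<void>;

// Hypothetical UI call: renders a visible status message in privacy settings.
declare function showBanner(message: string): void;

async function handleOptOutSignal(consumerId: string): Promise<OptOutResult | null> {
  // The GPC specification exposes the signal to scripts as
  // navigator.globalPrivacyControl; servers receive it as a `Sec-GPC: 1` header.
  const gpcEnabled = (navigator as Navigator & { globalPrivacyControl?: boolean })
    .globalPrivacyControl === true;

  if (!gpcEnabled) return null;

  await recordOptOut(consumerId, "gpc-signal");

  // The 2026 rules expect the consumer to see that the request was processed,
  // not just a silent back-end flag change.
  showBanner("Your opt-out of sale/sharing has been honoured.");

  return {
    honoured: true,
    source: "gpc-signal",
    timestamp: new Date().toISOString(),
  };
}
```

The same confirmation step should sit behind on-site opt-out controls as well, so that both routes end in an explicit, user-visible status.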
The second major shift is an enhanced right to know. Under the current regime, businesses must let consumers request access to their personal information, typically covering the 12 months prior to the request. Under the new rules, if a business retains personal data for longer than 12 months, consumers must be allowed to request information collected beyond that one-year period, going back as far as 1 January 2022. Organisations will need to support requests that either specify a date range or simply ask for all personal information held. That requires much better control over data inventories, retention and retrieval than many organisations currently have.
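As an illustration of what “supporting a date range or an all-data request” implies operationally, the following sketch resolves an extended right-to-know request against a data inventory. The queryRecordsByConsumer lookup and the PersonalInfoRecord shape are assumptions about an internal inventory, not anything defined by the regulations.

```typescript
// Sketch: fulfilling an extended right-to-know request against a data inventory.
// `queryRecordsByConsumer` is a hypothetical inventory lookup.

interface PersonalInfoRecord {
  category: string;
  collectedAt: Date;
  source: string;
}

declare function queryRecordsByConsumer(consumerId: string): Promise<PersonalInfoRecord[]>;

// The earliest date the extended look-back can reach under the updated rules.
const LOOKBACK_FLOOR = new Date("2022-01-01T00:00:00Z");

async function fulfilRightToKnow(
  consumerId: string,
  range?: { from: Date; to: Date },
): Promise<PersonalInfoRecord[]> {
  const records = await queryRecordsByConsumer(consumerId);

  // No range given: treat the request as "all personal information held",
  // bounded below by the 1 January 2022 floor.
  const from = range?.from && range.from > LOOKBACK_FLOOR ? range.from : LOOKBACK_FLOOR;
  const to = range?.to ?? new Date();

  return records.filter((r) => r.collectedAt >= from && r.collectedAt <= to);
}
```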
Privacy policies also become more granular. Previously, businesses had to disclose which categories of personal information were shared with “third parties” over the prior 12 months. Going forward, the policy must also spell out which categories are disclosed to “service providers” and “contractors”. This change may sound technical, but it reflects the regulator’s increased focus on vendor relationships, outsourcing and ecosystem risk.
At the same time, the definition of “sensitive personal information” is widened. Any personal information relating to consumers under 16 will be treated as sensitive if the business has actual knowledge of the person’s age. In practice, that captures any context where an organisation is using age gates, relying on operating system or app store age information, or otherwise ingesting age as part of the user journey. The update means more processing will fall into the higher-risk, more tightly restricted category of sensitive data.
The right to limit the use of sensitive personal information is also strengthened. Businesses will be required to present notice of that right through the same channel in which they collect the data. For a mobile app, this means in-app notices and settings; for a connected TV, an on-screen notice that fits the device’s interaction model; for AR and VR environments, context-sensitive notices within the experience. It will no longer be enough to bury the right to limit on a website footer.
Finally, the updated rules expand and clarify what counts as a prohibited dark pattern. Examples include requiring more steps to opt out than to opt in, making a “yes” button more visually prominent than “no”, treating the act of closing a pop-up as consent, nudging users into financial incentive programmes, or creating false urgency around consent decisions. The agency has clearly signalled that it will look not only at what a business asks consumers, but how it asks, and whether the design is genuinely neutral.
New governance obligations: risk assessments, cyber audits and ADMT
Beyond these visible user-facing changes, the most far-reaching part of the new regime is the trio of governance obligations: risk assessments, cybersecurity audits and automated decision-making technology rules.
From 1 January 2026, businesses must conduct formal risk assessments before undertaking processing that presents significant risk to consumers’ privacy. The law lists a number of examples, including the sale or sharing of personal information, processing sensitive data, using ADMT to make significant decisions about individuals, certain profiling activities, training ADMT using personal data, and using technologies such as facial recognition or emotion analysis. There is also an explicit catch-all for any processing that may materially impact consumers’ privacy rights, which means in practice that businesses will need to err on the side of assessing.
These assessments are not tick-box forms. They must set out the purpose of the processing, the categories of personal information involved, how the data is collected, stored and disclosed, the benefits of the processing, the risks to consumers and the safeguards used to mitigate those risks. They must also identify who reviewed and approved the assessment, who is responsible for the processing in question, and who provided information for the assessment, excluding legal advisers. The intention is to force organisations to document not just what they are doing, but who is accountable for the decision to do it.
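One way to keep those elements together is to treat each assessment as a structured record rather than free-form prose. The sketch below shows one possible shape; the field names are illustrative, not terms taken from the regulation text.

```typescript
// Sketch: a structured record capturing the elements a CCPA risk assessment
// must document. Field names are illustrative, not regulatory terminology.

interface Person {
  name: string;
  title: string;
}

interface RiskAssessment {
  processingActivity: string;       // e.g. "Targeted advertising using browsing history"
  purpose: string;
  categoriesOfPersonalInfo: string[];
  collectionMethods: string[];
  storageAndDisclosure: string;
  benefits: string[];               // benefits of the processing
  risksToConsumers: string[];
  safeguards: string[];             // mitigations applied to those risks
  approvedBy: Person[];             // who reviewed and approved the assessment
  responsibleForProcessing: Person[];
  informationProvidedBy: Person[];  // contributors, excluding legal advisers
  conductedOn: Date;
  retainUntil: Date;                // at least five years, or the life of the processing
}
```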
Alongside risk assessments, the regulations create a cybersecurity audit regime for certain businesses. Organisations that cross specific revenue and processing volume thresholds will have to undergo annual cybersecurity audits. These audits must be independent, even where they are conducted internally. The auditor must report to an executive who is not responsible for the cybersecurity programme itself, and management must not attempt to influence the audit findings. The audit must evaluate a wide range of controls, from data inventories and access management to vendor oversight and incident response.
The third governance pillar is the regulation of automated decision-making technology. Where a business uses ADMT to make, or substantially support, significant decisions about consumers in areas such as employment, education, housing, financial services, healthcare or independent contracting, a new set of obligations applies from 1 January 2027. Before using ADMT, the business must give consumers a clear notice explaining that their data will be used in this way, what the consequences may be, and what rights they have. Consumers must be allowed to opt out of the use of ADMT in many of these contexts and must have the ability to access information about how decisions are made and to appeal outcomes. ADMT use and training will almost always trigger the risk assessment requirement as well.
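In practice, this means tracking the consumer-facing lifecycle around each significant decision: was a pre-use notice delivered, has the consumer opted out, and is an appeal route open. The sketch below models that state; the type and function names are assumptions for illustration only.

```typescript
// Sketch: the consumer-facing state around a significant ADMT decision.
// Names are illustrative assumptions, not regulatory terminology.

type SignificantDecisionArea =
  | "employment" | "education" | "housing"
  | "financial-services" | "healthcare" | "independent-contracting";

interface AdmtPreUseNotice {
  consumerId: string;
  decisionArea: SignificantDecisionArea;
  explanation: string;        // how the consumer's data will be used and to what effect
  rightsSummary: string;      // opt-out, access and appeal rights
  deliveredAt: Date;
}

interface AdmtConsumerState {
  consumerId: string;
  decisionArea: SignificantDecisionArea;
  noticeDeliveredAt?: Date;   // pre-use notice must precede the decision
  optedOut: boolean;
  appealOpen: boolean;        // consumers must be able to appeal outcomes
}

// A decision should not proceed where the consumer never received a notice
// or has opted out of ADMT use in this context.
function mayUseAdmt(state: AdmtConsumerState): boolean {
  return state.noticeDeliveredAt !== undefined && !state.optedOut;
}
```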
For very large processors, the governance burden goes further. Businesses processing the personal information of at least ten million consumers will be required to publish statistics on how many ADMT opt-out and access requests they receive, comply with and deny. This moves ADMT oversight into the realm of public accountability.
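Producing those statistics is largely a matter of tallying an internal request log, as in the minimal sketch below. The RequestLogEntry shape is an assumed internal format; the regulation specifies what must be published, not how requests are logged.

```typescript
// Sketch: aggregating ADMT request statistics for public reporting.
// `RequestLogEntry` is an assumed internal log format.

interface RequestLogEntry {
  type: "admt-opt-out" | "admt-access";
  status: "complied" | "denied";
}

interface AdmtMetrics {
  received: number;
  complied: number;
  denied: number;
}

function tallyAdmtMetrics(log: RequestLogEntry[], type: RequestLogEntry["type"]): AdmtMetrics {
  const relevant = log.filter((e) => e.type === type);
  return {
    received: relevant.length,
    complied: relevant.filter((e) => e.status === "complied").length,
    denied: relevant.filter((e) => e.status === "denied").length,
  };
}
```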
Personal responsibility: who signs, who certifies and who is on the hook?
Perhaps the most striking aspect of the new rules is the way they attach individual responsibility to CCPA compliance. The regulations specify that risk assessment summaries filed with the California Privacy Protection Agency must be submitted by a member of the business’s executive management team. That person must be directly responsible for risk-assessment compliance, must have sufficient knowledge of the assessments to attest to their accuracy, and must have the authority to submit the filing. When submitting, they must provide their name, title and contact details and sign an attestation that the information is true and correct, under penalty of perjury.
Cybersecurity audit filings are even more demanding. The executive who submits the audit certification must again be part of the executive management team and directly responsible for audit compliance, but cannot be the CISO where the audit is conducted internally and the CISO is responsible for cybersecurity operations. The declaration is more detailed: the signatory must certify not only that the information is true and correct, but also that the business has not attempted to influence the auditor’s judgement. Given the broad scope of audit evidence, including interviews with staff and review of documentation, organisations will almost certainly need supporting sub-certifications from senior managers to provide comfort to the signing executive.
Risk assessments themselves must also name individuals in several roles: those who reviewed and approved the assessment, those whose duties include participating in the processing and those who provided factual information. Cybersecurity audit reports must identify up to three individuals responsible for the cybersecurity programme, as well as naming the auditors and setting out their qualifications.
The net effect is to pull many more named people into the orbit of CCPA accountability: executives, programme owners, auditors, functional leaders and contributors. For privacy, legal and compliance teams, this will inevitably trigger conversations about governance structures, decision-making processes and directors’ and officers’ liability insurance.
The CCPA timeline: a staged rollout from 2026 to 2030
Although it is tempting to treat 1 January 2026 as the main deadline, the CCPA changes are designed as a phased programme. In practice, 2026 and 2027 are preparation and build years, and 2028 is the first year in which both cyber audits and risk assessments crystallise into formal filings.
1 January 2026 – Core obligations begin
- Mandatory opt-out confirmation signals go live.
- Updated Right to Know obligations take effect (requests can extend back to 2022).
- Privacy policies must identify disclosures to service providers and contractors.
- Definition of sensitive PI expands to include data relating to consumers under 16.
- Channel-specific Right to Limit notices required across apps, devices and AR/VR.
- Dark pattern prohibitions become enforceable.
- Risk assessments required for all new high-risk processing activities from this date forward.
Throughout 2026
- Begin performing risk assessments for all significant-risk processing undertaken after 1 January 2026.
- Build governance structures for ADMT, cybersecurity audits and executive attestations.
1 January 2027 – ADMT obligations take effect
- Pre-use notices, opt-out rights and access/appeals mechanisms for ADMT become mandatory.
- First cybersecurity audits begin for businesses in scope.
31 December 2027 – Deadline for legacy activities
- All high-risk processing activities that began before 1 January 2026 must be assessed by this date.
1 April 2028 – First filing deadline
- Businesses with revenue >$100 million must submit cybersecurity audit certifications.
- First risk assessment summaries must be filed, covering assessments conducted during 2026 and 2027 (both new and legacy processing activities).
1 April 2029
- Cybersecurity audit certifications due for businesses with revenue between $50 million and $100 million.
1 April 2030
- Cybersecurity audit certifications due for businesses with revenue under $50 million.
CCPA 2026 Compliance Checklist
Data Rights & Transparency
- Implement visible confirmation signals for all opt-out requests, including Global Privacy Control (GPC).
- Update privacy policies to identify disclosures to service providers and contractors.
- Enable extended Right to Know requests covering data collected back to 1 January 2022.
- Add channel-specific Right to Limit notices across apps, connected devices and AR/VR environments.
- Remove any interface designs that could constitute prohibited dark patterns.
Sensitive Personal Information
- Update data maps to flag personal information of consumers under 16 as sensitive PI.
- Expand procedures for limiting the use of sensitive PI, including age-gated and app-store–derived age data.
- Implement secure processes for confirming (but not disclosing) sensitive PI such as SSNs or licence numbers.
Risk Assessments
- Identify all processing activities triggering the “significant risk” standard.
- Develop a formal risk assessment template covering purpose, PI categories, benefits, impacts and safeguards.
- Establish a governance workflow for review and approval, including documenting who contributed.
- Begin conducting assessments for all covered activities post–1 January 2026.
- Build a retention process for all assessments (retain for as long as the processing continues or at least five years, whichever is longer).
Cybersecurity Audits
- Determine whether the business meets audit thresholds based on revenue and processing volumes.
- Decide whether to use an external auditor or set up an independent internal audit structure.
- Map your current controls against the CCPA audit criteria and identify gaps.
- Prepare reporting lines ensuring the internal auditor does not report to the CISO.
- Establish evidence repositories for audit review and CPPA certification.
Automated Decision-Making Technology
- Map all ADMT systems, including HR screening tools, financial decision engines and claim evaluators.
- Identify “significant decisions” subject to ADMT regulation.
- Draft and test pre-use notices and explanations of ADMT logic and outcomes.
- Build opt-out and appeals mechanisms for ADMT use.
- Prepare to publish ADMT request metrics if processing ≥10 million consumers’ data.
Governance & Accountability
- Identify executives responsible for submitting risk assessment and audit certifications.
- Implement internal sub-certification processes to support executive attestations.
- Update D&O liability insurance to cover new personal accountability risks.
- Document governance structures linking privacy, cybersecurity, IT, HR and AI governance teams.