CCPA 2026 Compliance: Navigate California’s New Privacy Rules
California just raised the stakes on data privacy. Again.
Key Takeaways
- Neural Data Is Now Sensitive Personal Information. The definition of sensitive personal information now explicitly includes neural data—information generated by measuring activity in a consumer's nervous system. This reflects growing concerns about brain-computer interfaces and neurotechnology devices that can reveal intimate details about a person's thoughts, emotions, and mental states.
- All Data From Minors Triggers Extra Protections. If your business collects personal information from anyone under 16, that data is now automatically classified as sensitive personal information. This change alone could require significant updates to privacy notices and data handling practices for businesses that collect age or birthdate information.
- The 12-Month Lookback Period Is Gone. Consumers can now request access to all personal information a business has collected about them, not just the past 12 months. Businesses must implement systems to handle these expanded requests for any data collected since January 1, 2022.
- Dark Patterns Face Direct Prohibition. The new regulations provide specific examples of what constitutes a "dark pattern" in consent interfaces—from asymmetrical button designs to false urgency tactics. Closing a consent popup without clicking accept no longer counts as consent.
- Automated Decision-Making Gets New Rules. By January 2027, businesses using AI and automated systems for significant decisions in areas like employment, lending, housing, and healthcare must provide pre-use notices and offer consumers the right to opt out.
If you thought your business was CCPA-compliant, the California Privacy Protection Agency would like a word. New regulations that took effect January 1, 2026, have fundamentally changed how businesses must handle consumer data, obtain consent, and deploy automated decision-making technology. And the deadlines for additional requirements—including mandatory cybersecurity audits—are already counting down.
Here’s the uncomfortable truth: Many businesses that believed they were playing by the rules are now technically out of compliance. The expanded definition of sensitive personal information, stricter consent requirements, and new obligations around automated decision-making mean that yesterday’s privacy program might not cut it today.
This guide breaks down exactly what changed, why it matters, and what your business needs to do about it. No legal jargon, no hand-waving—just practical information you can actually use.
The Big Picture: Why These Changes Matter Now
California has always been ahead of the curve on privacy regulation. These 2026 updates represent the California Privacy Protection Agency’s response to how technology has evolved—and how some businesses have gamed the system.
The new regulations tackle three main problems: emerging technologies like neural interfaces creating new privacy risks, businesses getting creative with “dark patterns” that manipulate consumers, and the recognition that 12 months of records doesn’t capture what companies actually know about us.
For businesses, compliance isn’t just about checking boxes anymore. The Agency is looking at how privacy functions in real-time user experiences—not just what’s written in a privacy policy nobody reads.
Understanding the Expanded Definition of Sensitive Personal Information
The California Privacy Protection Agency didn’t just tweak the definition of sensitive personal information—it rewrote it for the age of neurotechnology and heightened awareness of child privacy.
Neural Data: The New Privacy Frontier
Neural data joins the list of sensitive personal information categories, defined as information generated by measuring the activity of a consumer’s central or peripheral nervous system. This covers EEG headsets, brain-computer interfaces, and some advanced fitness wearables that track neurological signals.
Neural data can potentially reveal things about a person that even they don’t consciously know—stress levels, emotional states, cognitive function. If your business collects any form of neural data, you’re now dealing with sensitive personal information, which triggers additional disclosure requirements and limits how you can use that data.
The Minor Data Bombshell
If you collect personal information and you know (or reasonably should know) that it belongs to someone under 16, that data is now sensitive personal information. Full stop.
Think about what this means. If your website asks for a birthdate during account creation, you might be collecting sensitive personal information without realizing it. If your e-commerce platform sells products to teenagers, same issue. The regulations don’t require intent—they require awareness.
Businesses may need to implement age verification systems, update privacy notices to address minor data specifically, and create opt-out mechanisms for sensitive personal information associated with users under 16.
The Death of the 12-Month Limitation
For years, the CCPA's "right to know" came with a built-in limitation: Consumers could only request information about personal data collected in the previous 12 months. That limitation is now effectively dead.
If a business retains a consumer’s personal information for longer than 12 months, it must provide a way for consumers to request access to all of it. The only exception is for data collected before January 1, 2022.
For businesses, this creates new challenges. You need systems capable of retrieving historical data in response to consumer requests and processes for handling date range specifications. This is also a good time to audit what you’re actually retaining—every record you keep is a potential liability under these expanded access rights.
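A retrieval system for these expanded access requests might look like the following sketch, assuming records are stored as (collection date, payload) pairs; the function name and storage shape are hypothetical. It honors an optional consumer-specified date range while flooring every request at the January 1, 2022 cutoff.

```python
from datetime import date

CUTOFF = date(2022, 1, 1)  # data collected before this date is exempt


def records_for_access_request(records, start=None, end=None):
    """Return every record in the requested range, floored at the
    January 1, 2022 cutoff. With no range given, return everything
    collected since the cutoff (the 12-month limit no longer applies).

    `records` is an iterable of (collected_on, payload) pairs.
    """
    lo = max(start or CUTOFF, CUTOFF)
    return [
        (collected_on, payload)
        for collected_on, payload in records
        if lo <= collected_on and (end is None or collected_on <= end)
    ]
```

The design point is that the cutoff is applied on the business side regardless of what range the consumer specifies, so a request for "everything" still excludes only pre-2022 data.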
Consent, Dark Patterns, and the End of Manipulation
The new regulations take direct aim at the tricks some businesses have used to manufacture consent that isn’t really consent at all.
What Counts as Consent (and What Doesn’t)
A critical change: Closing or clicking away from a consent popup—without affirmatively clicking an “accept” button—does not constitute consent. This eliminates a common practice where businesses treated any popup interaction as implied consent.
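In practice, this means consent logging must distinguish an affirmative "accept" click from every other banner interaction. A minimal sketch, assuming a simple event-name scheme (the names and record fields here are illustrative):

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    granted: bool
    action: str
    recorded_at: datetime


# Only an affirmative click on "accept" grants consent; closing or
# clicking away from the popup must be logged as no consent.
AFFIRMATIVE = {"accept"}


def record_banner_event(action: str) -> ConsentRecord:
    """Log a banner interaction, granting consent only on explicit accept."""
    return ConsentRecord(
        granted=action in AFFIRMATIVE,
        action=action,
        recorded_at=datetime.now(timezone.utc),
    )
```

The allowlist approach matters: anything not explicitly affirmative defaults to no consent, rather than the other way around.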
The Dark Pattern Hit List
The Agency provided specific examples of prohibited dark patterns. Asymmetrical choices are out. If your “Yes” button is bigger, brighter, or more prominently displayed than your “No” button, that’s a dark pattern. If your only options are “Yes” and “Ask Me Later” with no clear way to say no, that’s a dark pattern. If opting into a program is selected by default, that’s a dark pattern.
The regulations also prohibit creating a false sense of urgency. No more countdown timers or pressure tactics. And crucially, the number of steps required to opt out must be equal to or fewer than the number of steps required to opt in.
The underlying principle is symmetry—making it just as easy to say no as it is to say yes. Look at your current consent flows. Count the clicks, measure the button sizes, examine the color choices. If there’s any imbalance favoring your preferred option, you’ve got work to do.
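That audit of clicks, button sizes, and defaults can be automated as a checklist. Here is an illustrative sketch; the field names and checks mirror the examples above, but the structure is an assumption, not regulatory text.

```python
from dataclasses import dataclass


@dataclass
class ConsentFlow:
    opt_in_steps: int
    opt_out_steps: int
    yes_button_area: float   # rendered size, e.g. in square pixels
    no_button_area: float
    has_explicit_decline: bool
    preselected_opt_in: bool
    uses_countdown: bool


def dark_pattern_findings(flow: ConsentFlow) -> list[str]:
    """Check a consent flow against the symmetry rules described above."""
    findings = []
    if flow.opt_out_steps > flow.opt_in_steps:
        findings.append("opting out takes more steps than opting in")
    if flow.yes_button_area > flow.no_button_area:
        findings.append("'Yes' button is more prominent than 'No'")
    if not flow.has_explicit_decline:
        findings.append("no clear way to decline")
    if flow.preselected_opt_in:
        findings.append("opt-in is preselected by default")
    if flow.uses_countdown:
        findings.append("false urgency (countdown timer)")
    return findings
```

An empty findings list is the target state: symmetric steps, equally prominent buttons, a real decline option, no preselection, no urgency tactics.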
New Opt-Out Confirmation Requirements
The regulations now require businesses to confirm that opt-out requests have been processed—whether the request came through a cookie banner, a website link, or a universal opt-out signal like Global Privacy Control.
One way to satisfy this requirement is to display an “Opt-Out Request Honored” message immediately following the request. Your systems need to process opt-out requests in real time—batch processes that run overnight won’t cut it.
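The key architectural point is synchronous processing: the preference is applied before the confirmation is returned. A hypothetical sketch with an in-memory store (a real system would persist the preference and propagate it to downstream processors):

```python
from datetime import datetime, timezone


class OptOutHandler:
    """Process opt-out requests synchronously and return a confirmation,
    whatever the channel (banner, link, or a GPC signal)."""

    def __init__(self):
        self.opted_out: dict[str, datetime] = {}

    def handle(self, consumer_id: str, channel: str) -> str:
        # Apply the preference immediately -- not in a nightly batch job --
        # so the confirmation message reflects a change that already happened.
        self.opted_out[consumer_id] = datetime.now(timezone.utc)
        return "Opt-Out Request Honored"

    def is_opted_out(self, consumer_id: str) -> bool:
        return consumer_id in self.opted_out
```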
Notice Timing: Meeting Consumers Where They Are
Privacy notices must be delivered “before or at the time of collection.” For smart TVs, smartwatches, connected home devices, and AR/VR applications, this means building privacy notices into the device experience itself—not burying them in terms of service accepted months ago.
For VR and AR, notice is required before or at the time a consumer “enters or encounters the business within the virtual reality environment.” This forces businesses to think about privacy as part of user experience design, not just a legal afterthought.
Automated Decision-Making Technology: The 2027 Deadline
The regulations establish significant new obligations for businesses using automated decision-making technology (ADMT)—with compliance required by January 1, 2027.
What Qualifies as ADMT
ADMT is technology that processes personal information and uses computation to replace or substantially replace human decision-making. This covers AI-powered hiring tools, automated credit decisions, and algorithmic systems that affect significant outcomes.
Where the Rules Apply
Requirements specifically target ADMT used in financial services, lending, housing, education, employment, and healthcare—areas where automated decisions most significantly impact people’s lives.
Key Requirements
- Before using ADMT for significant decisions, businesses must provide a pre-use notice explaining the purposes for using ADMT, the consumer’s right to opt out, and how consumers can request information about how the technology works.
- Consumers will have the right to opt out of ADMT being used for decisions that significantly affect them, meaning businesses may need to maintain parallel processes that don’t rely on automated systems.
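The pre-use notice requirements above lend themselves to a completeness check before deployment. This sketch models the three required elements; the field names are illustrative shorthand, not the regulation's terminology.

```python
from dataclasses import dataclass


@dataclass
class PreUseNotice:
    """The three elements a pre-use notice must cover, per the list above."""
    purposes: list[str]          # why the business uses ADMT
    opt_out_instructions: str    # how the consumer exercises the opt-out
    how_it_works_contact: str    # how to request info about the technology


def notice_is_complete(notice: PreUseNotice) -> bool:
    """Reject a notice that leaves any required element empty."""
    return (
        bool(notice.purposes)
        and bool(notice.opt_out_instructions.strip())
        and bool(notice.how_it_works_contact.strip())
    )
```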
Cybersecurity Audits: The Compliance Timeline
Starting in 2028, businesses whose data processing poses significant security risks must conduct annual cybersecurity audits. Deadlines are staggered by size: April 1, 2028, for businesses with over $100 million in annual revenue; April 1, 2029, for businesses with $50 million to $100 million; and April 1, 2030, for smaller businesses.
Audits must cover cybersecurity program components, authentication mechanisms, encryption of data at rest and in transit, and account management controls. This isn’t a paper exercise—start evaluating your security posture now.
Risk Assessments: New Reporting Requirements
Businesses engaging in certain data practices—selling or sharing personal information, processing sensitive information, or using ADMT for significant decisions—must conduct formal risk assessments and submit reports to the Agency.
Most businesses will need to submit their first reports by December 31, 2027, or April 1, 2028, with annual reports required thereafter. These assessments must evaluate potential privacy harms and document safeguards—proactive compliance demonstrating you’ve thought through the implications of your data practices.
Practical Next Steps for Compliance
The scope of these changes requires action on multiple fronts. Start by auditing your data practices: What information are you collecting? From whom? Could any subjects be minors? Are you collecting neural data? How long are you retaining data?
Examine your consent interfaces against the dark pattern prohibitions. Are your choices symmetrical? Are buttons equally prominent? Is opting out as easy as opting in?
Review your opt-out processes—can you confirm requests immediately? If you use ADMT in regulated contexts, start planning for 2027 requirements now. And if you meet revenue thresholds, begin preparing for cybersecurity audits.
What This Means for Your Business
The CCPA 2026 regulations represent a significant expansion of consumer privacy rights and business compliance obligations. They address emerging technologies, close loopholes in consent practices, and establish new transparency requirements for automated decision-making.
The California Privacy Protection Agency has made clear that it expects privacy to function in real time—in the actual experiences consumers have with products, services, and websites. Paper compliance isn’t enough. Businesses need to examine how their systems actually work and whether those systems truly respect consumer choice and autonomy.
The good news is that most of the heaviest requirements have staggered deadlines, giving businesses time to prepare. The bad news is that some requirements—including the expanded definition of sensitive personal information and the new consent standards—are already in effect. If you haven’t started your compliance assessment, today is the day to begin.
California continues to lead on privacy regulation, and other states are actively watching and often following suit. Getting this right isn’t just about avoiding enforcement actions in California—it’s about building the kind of privacy program that will serve your business well as privacy expectations continue to evolve nationwide and potentially globally.
The businesses that treat these regulations as an opportunity rather than a burden will find themselves ahead of the curve. Consumer expectations around data privacy are only going in one direction, and the companies that get ahead of those expectations now will have a significant competitive advantage over those scrambling to catch up later.
Frequently Asked Questions
Who must comply with the 2026 CCPA regulations?
Any business that conducts operations in California and meets certain thresholds must comply with CCPA regulations, including the 2026 updates. This generally includes businesses with gross annual revenues over $25 million, businesses that buy, sell, or share personal information of 100,000 or more consumers or households annually, or businesses that derive 50% or more of annual revenue from selling or sharing consumer personal information. The expanded requirements around cybersecurity audits and risk assessments have additional size-based thresholds.
What happens if my business collects data from users under 16?
If your business collects information that indicates a user is under 16—such as birthdate, age, or grade level—that user’s personal information is now classified as sensitive personal information. This classification triggers additional requirements including enhanced disclosure obligations in privacy notices and the requirement to provide mechanisms for consumers to limit the use or collection of their sensitive personal information. Businesses that previously collected age data for routine purposes like age verification may need to reassess their practices and update their privacy programs accordingly.
What counts as a dark pattern under the new regulations?
A dark pattern is a user interface design choice that subverts or impairs consumer autonomy, decision-making, or choice. Under the 2026 regulations, specific examples include consent interfaces where the accept button is larger or more prominently colored than the decline button, choice architectures that present only “Yes” and “Ask Me Later” without a clear option to decline, pre-selected defaults that opt consumers into data collection or programs automatically, and creating false urgency to pressure immediate decisions. The fundamental principle is that opting out must be as easy as opting in, with equal visual prominence and an equal number of steps.
Which businesses must conduct cybersecurity audits, and by when?
Businesses whose processing of consumer personal information poses a significant risk to consumer security must conduct annual cybersecurity audits starting in 2028. The deadline depends on business size: April 1, 2028, for businesses with over $100 million in annual gross revenue; April 1, 2029, for businesses between $50 million and $100 million; and April 1, 2030, for smaller businesses. Audits must be conducted by qualified auditors (internal or external) and must assess cybersecurity program components, authentication practices, encryption protocols for data at rest and in transit, and account management and access controls.
What are the new requirements for automated decision-making technology?
Starting January 1, 2027, businesses using ADMT in financial services, lending, housing, education, employment, or healthcare must provide consumers with a pre-use notice before the technology is used to make or influence significant decisions about them. This notice must explain the purposes for using ADMT, describe the consumer’s right to opt out of such processing, and explain how consumers can request additional information about how the technology works. Consumers will have the right to opt out of ADMT being used for decisions that significantly affect them, which means businesses may need to maintain alternative decision-making processes.
How does the expanded right to know work?
Businesses must now provide consumers with access to all personal information collected about them, not just data from the previous 12 months. The only exception is for personal information collected before January 1, 2022. To comply, businesses must implement methods for consumers to specify the date range for their access request or offer them the option to request all collected information. This requires businesses to maintain accessible records of historical data collection and have systems capable of retrieving that data in response to consumer requests.
Who must submit risk assessment reports, and when?
Businesses engaging in data practices that pose significant privacy risks—such as selling or sharing personal information, processing sensitive information, or using ADMT for significant consumer decisions—must conduct risk assessments and submit reports to the California Privacy Protection Agency. Depending on when the triggering activity first began, initial reports are due either by December 31, 2027, or April 1, 2028. After the initial submission, businesses must submit updated risk assessment reports annually. These assessments must evaluate potential privacy harms and document implemented safeguards.