TCAI Bill Guide: SB 1119 and AB 2023, California’s child safety chatbot bills — Transparency Coalition. Legislation for Transparency in AI Now.


April 22, 2026 — Three of California’s leading tech-safety legislators have introduced a pair of companion bills, SB 1119 and AB 2023, aimed at protecting kids online. The bills build on existing digital design and chatbot regulations enacted in 2025.

The two bills contained the same language at introduction. As they move through their respective chambers they may be amended and require reconciliation if the measures cross over.

The bills in brief

SB 1119, sponsored by Sen. Steve Padilla (D-San Diego, Inland Empire). Staff analysis available here.

AB 2023, sponsored by Asm. Buffy Wicks (D-Oakland) and Asm. Rebecca Bauer-Kahan (D-Alameda/Contra Costa counties). Staff analysis available here.

These bills would establish a framework for protecting children from chatbot-related harms and empowering parents with proper tools to protect their own children. The bills impose a number of requirements on chatbot operators and prohibit specified conduct. Additionally, operators must submit to annual independent audits, and reporting by the Attorney General (AG) is required. Public prosecutors and children harmed by violations are authorized to bring civil actions.

Existing law

California’s existing Digital Age Assurance Act (AB 1043, enacted in Sept. 2025) requires age-verification systems within mobile devices and computers sold on or after Jan. 1, 2027. When an account is set up on a new mobile device or computer, the user must indicate their birth date or age. Using this system, app developers will be able to request a signal from the device regarding the user’s age.

Existing SB 243, enacted in Sept. 2025, requires the operator of an AI chatbot to take precautions when a user is a minor, including:

  • Disclose that the user is interacting with AI

  • Notify the user at least once every 3 hours to take a break

  • Prevent the chatbot from producing sexually explicit content

What the bills would do

SB 1119 / AB 2023 would build on the existing requirements outlined above. These requirements would apply to operators of chatbots interacting with minors (under age 18) in California.

Age verification: Chatbot operators would be required to verify the age of a user, using the systems required by the Digital Age Assurance Act.

Annual risk assessment: Chatbot operators would be required to perform annual risk assessments to identify child safety risks posed by the chatbot; mitigate those risks; document the measures taken; and publish a child safety policy.

Parental notification re risk of self-harm: Operators would be required to implement a crisis response protocol for self-harm, including notifying a parent if the child’s account is connected to the parent’s account.

Parental default settings: Operators would be required to implement default settings on a minor’s account that can only be changed by a parent, including push notification prohibitions and time limits.

Push notification limitations: The bill requires the default setting for a companion chatbot to prohibit the chatbot from sending push notifications to a child between 12 a.m. and 6 a.m., and 8 a.m. and 3 p.m. during weekdays. These settings can be changed by a parent.

Parental controls: Operators would be required to offer parental controls, including the ability to set preferences and time limits, and disable access for children under 16.

Advertising limitations: Operators would be prohibited from targeting advertising at a minor chatbot user, including through product placement within conversational chats.

Personal information privacy: Operators would be prohibited from selling, sharing, or using the personal information of a minor chatbot user for any purpose not expressly authorized.

Subject matter limitations on minor accounts: Operators would be required to limit the subject matter available on minor user accounts. Chatbots would be prohibited from encouraging children to harm themselves or others, from offering health advice, and from producing sexually explicit material. Chatbots also would not be allowed to encourage unhealthy behaviors such as consuming narcotics or alcohol, or engaging in disordered eating.

Prohibition on overly sycophantic responses: Chatbots would not be allowed to produce overly sycophantic responses.

Establish public incident reporting mechanism: Operators would need to create a public reporting mechanism that enables a third party to report (to the operator) an incident regarding a child safety risk.

Attorney General’s office: The bills would require the state AG to establish a public incident reporting mechanism for consumers to submit complaints regarding chatbots. The AG would also be required to adopt regulations regarding third-party audits of chatbot operators.

Enforcement: A public prosecutor may bring a civil action for violations, including penalties up to $5,000 per negligent violation and $15,000 for each intentional violation. A child who suffers actual harm may also bring a civil action for actual damages, punitive damages, attorney’s fees, and injunctive relief.

Bill Sponsor’s statement: Asm. Buffy Wicks
