As artificial intelligence (AI) becomes more integrated into modern employment practices, it also introduces new legal and ethical challenges, particularly around privacy and the use of digital likenesses. In response, California enacted Assembly Bill 2602 (AB 2602), effective January 1, 2025, to address these concerns head-on.
The law marks a critical step in protecting workers from the unauthorized creation and use of AI-generated replicas of their voice or visual likeness—especially in industries like
entertainment, broadcasting, marketing, and tech, where such digital tools are increasingly used to replace or augment human presence.
Understanding AB 2602
AB 2602 aims to prevent employers from using AI-generated “digital replicas” in place of employees without clear consent. The law applies to contract provisions that authorize the creation and use of a digital replica in a manner that would replace work the individual would otherwise have performed in person.
Under this law, a contract provision is unenforceable if all of the following apply:
- It allows the creation and use of a digital replica of an individual’s voice or likeness to perform work they would typically perform in person;
- It does not clearly describe the specific uses of the digital replica;
- The individual was not represented by legal counsel or a labor union during the negotiation of that provision.
This three-pronged safeguard is designed to ensure informed, voluntary participation in any contract that permits the use of a digital likeness, closing a loophole that previously allowed broad, and sometimes exploitative, use of AI-generated content.
Real-World Example: When “Efficiency” Goes Too Far
Consider the case of Maya, a professional voice-over actor based in Los Angeles. In 2024, Maya signed a contract with a tech company to provide voice-over narration for a virtual fitness app. Months later, she discovered her voice had been synthesized and used to produce hundreds of new audio files she had never recorded, including promotional ads in languages she does not even speak.
The company claimed her original contract allowed them to “use and reproduce her voice in all digital formats,” but the agreement did not specify the creation of a fully AI-generated voice model or how it would be used commercially.
Because Maya was not represented by a union or attorney during contract negotiations, and because the clause lacked a “reasonably specific description” of how her voice would be used, a provision of this type would be unenforceable under AB 2602. Maya could potentially bring a legal claim to prevent further use of her AI-generated voice and seek damages.
Why This Law Matters
The misuse of AI-generated replicas has the potential to strip workers of control over their identities and livelihoods. It blurs the line between innovation and exploitation, allowing employers to replicate human labor digitally, sometimes indefinitely, without additional compensation or consent.
AB 2602 reinforces California’s commitment to worker autonomy and privacy, particularly as digital tools evolve faster than most employment laws can track.
Employer Obligations Under AB 2602
Employers, particularly those in entertainment, advertising, and tech, must take the following steps to comply with AB 2602:
- Review and update contracts involving image, voice, or likeness rights to include specific, clearly stated uses of digital replicas.
- Avoid boilerplate language such as “in any and all media now known or hereafter devised” without context.
- Ensure legal or union representation is made available during contract negotiations involving digital likeness use.
- Establish internal AI policies for HR, marketing, and product teams to avoid unauthorized creation or deployment of employee likenesses.
Failure to do so could not only invalidate contracts but also expose employers to lawsuits, public relations backlash, and regulatory action.
What Employees Should Watch For
If you’re asked to sign a contract that mentions voice recordings, likeness usage, or digital rendering, here’s what you should do:
- Ask for specifics. What will your likeness or voice be used for? Where will it appear? For how long?
- Seek representation. AB 2602 protects individuals who are represented by a union or attorney during negotiations.
- Negotiate usage rights. Just because technology allows your likeness to be duplicated doesn’t mean you have to surrender control of it.
- Monitor for misuse. If your voice or image is being used in ways you didn’t authorize, you may have a legal remedy under California law.
Looking Ahead: A Model for AI Protections Nationwide?
AB 2602 may serve as a blueprint for other states seeking to regulate AI’s role in the workplace. As digital tools continue to evolve, the need for comprehensive legal frameworks around AI, employee privacy, and consent is becoming increasingly urgent.
California’s law stands as an early and important recognition that an employee’s likeness is personal property, not an unlimited asset for employer use. The message is clear: consent and specificity are no longer optional; they are the legal standard.
Conclusion
As artificial intelligence blurs the line between human and machine, California’s AB 2602 brings much-needed clarity to a rapidly evolving issue. By safeguarding employee identities and requiring informed consent for digital likeness use, the law sets a new standard for how employers must approach AI in employment contracts. In the age of automation, protecting the person behind the persona has never been more important.
This blog is published by Fairchild Employment Law for informational purposes only and does not constitute legal advice on any subject matter. By viewing blog posts, the reader understands that no attorney-client relationship exists between the reader and the blog publisher. The blog should not be used as a substitute for legal advice from a licensed attorney, and readers are urged to consult their own legal counsel on any specific legal questions concerning a specific situation.