
When Someone Feeds Your Voice to AI: A Singer’s Legal Rights in the Age of Voice Cloning
Your voice is your livelihood. Decades of training, countless hours of practice, and years of developing your unique sound—all of it can now be uploaded to an AI platform in seconds. What happens when someone does exactly that without your permission?
The Nightmare Scenario
Imagine this: You share raw vocal tracks with a producer you trust. Maybe it’s for a collaboration, a demo, or just feedback. Then you discover the producer thought it would be “fun” to upload your tracks to Suno, an AI music generation platform, to create a “personality” based on your voice.
Your tracks. Your voice. Your decades of training that you paid for. Now feeding an AI system that will use your voice to train its models and generate outputs for anyone who wants to sound like you.
This isn’t a hypothetical. It’s happening to singers, voice actors, and creatives every day. And if it’s happened to you, you need to know that you have legal options.
Your Voice Is Your Property: The Right of Publicity
The most powerful weapon in a singer’s legal arsenal is the right of publicity, which is the legal right to control the commercial use of your name, image, likeness, and voice.
This isn’t new law. In the landmark 1988 case Midler v. Ford Motor Co., Bette Midler sued Ford after it hired one of her backup singers to imitate her voice for a commercial once Midler herself had explicitly refused to participate. The Ninth Circuit Court of Appeals ruled in her favor, holding that “when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs.”
A few years later, in Waits v. Frito-Lay (1992), singer Tom Waits won $2.375 million (including $2 million in punitive damages) when Frito-Lay used a voice impersonator in a Doritos commercial. The court applied what’s now called the “Midler tort,” a legal test requiring proof of a voice that is (1) distinctive and (2) widely known.
Here’s what matters for today’s AI landscape: these cases didn’t require copyright in a voice, because voices can’t be copyrighted. They protected voices as part of a person’s identity. And that principle extends directly to AI voice cloning.
The New Wave of AI-Specific Protections
State legislatures have been racing to address AI voice cloning, and several new laws provide even stronger protections:
Tennessee’s ELVIS Act (Effective July 2024)
Tennessee became the first state to explicitly protect against AI-generated voice clones. Named after the King himself, the Ensuring Likeness, Voice, and Image Security Act:
- Establishes that every individual has a property right in their voice
- Prohibits publishing or distributing AI-generated replicas of someone’s voice without authorization
- Makes it illegal to distribute software whose primary purpose is generating unauthorized voice replicas
- Provides both civil remedies and criminal penalties
- Protects not just famous artists but also voice actors, podcasters, and anyone whose voice has commercial value
New York’s Digital Replicas Law (Effective January 2025)
New York now requires that any contract provision allowing the creation of a “digital replica” of a performer’s voice must meet strict conditions, including that the performer have legal representation and give informed consent. Provisions that don’t meet these requirements are void and unenforceable.
California’s AB 2602 (Effective 2025)
California requires that artists have union or legal representation before signing away rights to their digital selves. A companion law, AB 1836, prohibits commercial use of digital replicas of deceased performers without the estate’s consent.
Suing the Producer vs. Suing the Platform: Two Different Battles
When someone uploads your voice to an AI platform without permission, you potentially have claims against two different parties: the person who uploaded your material and the AI platform itself.
Against the Person Who Uploaded Your Voice
Your claims against the producer (or whoever uploaded your material) are typically stronger and more straightforward:
Breach of Contract: If you shared those vocal tracks under any kind of agreement about how they could be used, even an informal one formed through messages, emails, or verbal understanding, uploading them to an AI platform likely violates that agreement. In the recent Lehrman v. Lovo case (2024), voice actors successfully argued that communications through Fiverr and the platform’s terms of service established enforceable contracts limiting how their recordings could be used.
Right of Publicity Violation: Using your voice to create an AI “personality” that can generate commercial outputs is exactly the kind of unauthorized commercial exploitation that right of publicity laws prohibit.
Misappropriation: Even without a formal contract, taking the commercial value of your voice without permission is actionable in most states.
Copyright Infringement: While your voice itself isn’t copyrightable, your recordings are. If the producer uploaded actual recordings that you own the copyright to, that’s potential copyright infringement at the input stage (using your copyrighted recordings to train the AI).
Against the AI Platform
Claims against platforms like Suno are more complex but not impossible:
Right of Publicity: If the platform knew or should have known that voices were being uploaded without authorization, and they’re commercially benefiting from those voices, they may share liability.
Continuing Violation Theory: In Lehrman v. Lovo, the court rejected the AI company’s argument that the statute of limitations had expired, reasoning that since the AI model continues to replicate the plaintiffs’ voices every time it generates new content, the violation is ongoing.
The Section 230 Question: Platforms often claim immunity under Section 230 of the Communications Decency Act, which protects them from liability for user-generated content. However, legal experts increasingly argue that AI-generated content doesn’t fit this model: Section 230 was designed to shield platforms acting as passive hosts of user content, not platforms whose own AI actively generates content using misappropriated voices. Courts haven’t definitively ruled on this question, but the weight of legal opinion suggests platforms may not be able to hide behind Section 230 when their AI systems are doing the creating.
Federal Legislation on the Horizon
Congress is also working on comprehensive federal protections. The most significant is the NO FAKES Act, reintroduced in April 2025 with broad bipartisan support and endorsements from SAG-AFTRA, major record labels, and even tech companies like OpenAI and Google.
The NO FAKES Act would:
- Create a federal property right in your voice and visual likeness
- Make it unlawful to create or distribute AI-generated replicas without consent
- Allow individuals to sue bad actors who create, post, or profit from unauthorized digital copies
- Establish a notice-and-takedown system similar to the DMCA
- Extend protections for up to 70 years after death
While the bill is still moving through Congress, it signals where federal law might be heading, and courts may look to it as evidence of evolving legal standards even before it becomes law.
What You Can Do Right Now
If someone has uploaded your voice to an AI platform without permission, here are concrete steps you can take:
1. Document Everything
- Screenshot any AI “personalities” or voice models created from your voice
- Save all communications with the person who uploaded your material
- Document what restrictions or understandings existed about how your recordings could be used
- Preserve evidence of your original recordings and their creation dates
2. Send a Cease and Desist Letter
Send letters to both the person who uploaded your material and to the platform. Demand:
- Immediate removal of your voice from the platform
- Deletion of any AI models trained on your voice
- Confirmation that your voice data will not be used in future training
3. File a DMCA Takedown (if applicable)
If your copyrighted recordings were used, you can file a DMCA takedown notice with the platform demanding removal of infringing content.
4. Report to the Platform
Most AI platforms have terms of service prohibiting uploads of content you don’t have rights to. Report the violation directly.
5. Consult an Entertainment or IP Attorney
An attorney can assess your specific situation and help you understand which claims are strongest under your state’s laws. Many offer free consultations for these cases.
The Bigger Picture: Protecting Your Voice Going Forward
This situation also highlights the importance of proactive protection:
- Be explicit in all agreements about AI and digital replica rights—even informal collaborations should have clear terms
- Include AI-specific restrictions when licensing your voice or sharing recordings
- Register your copyrights in your recordings (not your voice, but the fixed recordings)
- Document your distinctive vocal characteristics and the training/development that created them
- Stay informed about new state and federal protections
Conclusion: Your Voice, Your Rights
A voice represents years of investment—lessons, practice, development, and refinement. It’s what makes you you as a performer. When someone uploads your voice to an AI platform without permission, they’re not just being careless with a file. They’re taking something unique and special that belongs to you and potentially allowing it to be replicated infinitely for others’ profit.
The law is catching up to technology, and the legal landscape is increasingly favorable to artists. Between established right of publicity protections, new state laws specifically targeting AI voice cloning, and pending federal legislation, you have more legal tools than ever to protect your voice.
Don’t let someone else profit from your decades of work. Your voice is your livelihood, and the law increasingly recognizes it as your property.
Disclaimer: This blog post is for informational purposes only and does not constitute legal advice. Laws vary by state, and the legal landscape around AI is evolving rapidly. If you believe your rights have been violated, consult with a qualified attorney in your jurisdiction.
You Don’t Have to Navigate This Alone
Let’s be honest about what’s keeping you up at night.
You’re watching AI get better every month. You’re seeing voice clones, deepfakes, AI-generated songs. And you’re wondering: Will there still be work for me in five years? Am I signing away my future every time I take a job?
You didn’t spend years developing your craft to be fed into a machine.
The Creative Rights Club is a new membership community we’re building for creators who want to stop feeling powerless, and start feeling informed.
Imagine walking into your next negotiation with confidence. You know what the clauses mean. You know what’s negotiable. You know what to demand—and you have the language to demand it.
You’re not paranoid. You’re not angry. You’re professional, and that earns you respect in rooms where decisions get made.
Inside the Club, you’ll learn to:
- Read contracts and spot AI traps before you sign
- Understand your legal rights without needing a law degree
- Negotiate better terms with real scripts and strategies
- Protect your career for the long term—not just the next gig
AI is here to stay. The question is whether you’ll still have a place in the industry five, ten, twenty years from now.
The creators who thrive will be the ones who understand how the business is changing—and know how to protect what makes them irreplaceable.
For less than a couple of lattes a month, you get the knowledge that lawyers charge $500/hour to explain.
Your talent is irreplaceable. You are irreplaceable.
Learn how to make sure you’re treated that way.
Join the waitlist to be first in when we launch (and get founding member pricing).
Click here: Join the Waitlist