On February 17, the Oklahoma Capitol was quiet, and the hearing produced none of the noise that large floor fights typically do. Nevertheless, something significant happened. The House Criminal Judiciary Committee advanced House Bill 3299, which would, for the first time in the state, treat an AI-generated lie about a person’s face or voice the same way older laws treat forgery or fraud. There were no opponents and no theatrics on the floor. With just a few nods and a brief vote, the bill moved a step closer to becoming actual law.
The measure’s author, Rep. Neil Hays, a Republican from Checotah, presented it less like a lawmaker bragging than like someone who had been watching the news at night and worrying about it. “This technology is moving faster than most people realize,” he told the committee, a statement he has since repeated, nearly verbatim, in six different publications. It’s the type of phrase that seems unremarkable until you sit with it for a minute. The frightening part isn’t what AI is currently capable of. It’s the gap between what it can do and what the average Oklahoman believes it can do.
| Key Information | Details |
|---|---|
| Bill Name | House Bill 3299 |
| Author | Rep. Neil Hays, R-Checotah |
| Chamber of Origin | Oklahoma House of Representatives |
| Committee Vote | Unanimous (Criminal Judiciary Committee) |
| Date of Committee Passage | February 17, 2026 |
| Focus | AI-generated and deepfake media depicting a person’s name, image, voice, or likeness without consent |
| Misdemeanor Penalty | Up to 1 year in county jail; fine up to $1,000 |
| Felony Trigger | Financial harm exceeding $25,000, or extortion/coercion/blackmail |
| Felony Penalty | Up to 5 years in prison; fine up to $10,000 |
| Political Ad Rule | Mandatory disclosure of synthetic media within 45 days of election |
| Comparable Law | Tennessee’s ELVIS Act, enacted 2024 |
| Next Step | House Judiciary and Public Safety Oversight Committee |
HB 3299 would make it illegal to create and distribute digital or synthetic media of a person’s name, image, voice, or likeness without that person’s written consent, when the intent is to cause emotional, financial, reputational, or physical harm. A misdemeanor conviction carries a fine of up to $1,000 and a maximum sentence of one year in county jail. If the financial damage exceeds $25,000, or the content crosses the line into extortion, coercion, or blackmail, the offense becomes a felony carrying up to five years in prison and a fine of up to $10,000. The numbers seem thoughtfully selected: high enough to be significant, low enough to pass.
The bill also includes a political-advertising provision, and once the measure moves forward, it may prove the most contentious section. Any digital or synthetic media used in political advertising within 45 days of an election would have to carry a disclosure, and content creators would have to provide a signed attestation to media advertising agencies.

The bill does not prohibit AI in campaigns. Hays himself made that distinction clear: he is not arguing that candidates cannot use the technology, but that voters should be told when they are viewing content a machine created. Whether that subtlety survives contact with campaign consultants is another matter entirely.
Oklahoma would not be alone on this path. In 2024, Tennessee took the lead by enacting the Ensuring Likeness, Voice, and Image Security Act, known as the ELVIS Act, a nod both to the state’s musical history and to the particular vulnerability singers and songwriters face when a voice is cloned. Other states have drawn narrower boundaries, safeguarding only public figures or minors. Hays appears to be arguing that protection shouldn’t be contingent on celebrity, and the committee did not disagree. A country music artist in Nashville has the same claim to her own face as a waitress in Tulsa.
The deepfake discourse has shifted noticeably in just a few years. What used to seem like a specialized issue, something for technology reporters and national security analysts, now surfaces in divorce cases, small-town scams, and phony voicemails from “CEOs” to accounting departments. According to the U.S. Government Accountability Office, the underlying neural networks need only a sufficient number of images to be trained. A birthday video, an Instagram feed, a high school yearbook. That is the uncomfortable part.
It is unclear whether HB 3299 will make it past the House Judiciary and Public Safety Oversight Committee. First Amendment concerns often surface as legislation moves forward, and free-speech advocates will have opinions about how “intent to harm” is established in court. For the time being, however, Oklahoma has taken a modestly noteworthy step: it has advanced the first bill that would treat a synthetic lie about someone’s face as a crime deserving of punishment. The pace of technology is not slowing down. For once, the legislators appear to be moving faster than anticipated.
Disclaimer
Nothing published on Creative Learning Guild — including news articles, legal news, lawsuit summaries, settlement guides, legal analysis, financial commentary, expert opinion, educational content, or any other material — constitutes legal advice, financial advice, investment advice, or professional counsel of any kind. All content on this website is provided strictly for informational, educational, and news reporting purposes only. Consult your legal or financial advisor before taking any step.
