SPRING HILL, Fla. — Imagine the pain and horror of learning that someone has taken a photo of a child and manipulated it to create a sexually explicit image.
Tampa Bay-area law enforcement agencies are working with federal lawmakers to close legal loopholes surrounding child pornography created with artificial intelligence, or AI.
Investigators say it’s frustrating and difficult to keep up with the rapidly changing technology.
“There is one common denominator,” Pasco County Sheriff Chris Nocco said. “And that is protecting our kids.”
Nocco says he recently met with Congressman Gus Bilirakis, R-Spring Hill, to discuss the need to modernize and re-authorize the Kids Online Safety Act, also known as KOSA. According to Nocco, those changes need to address child erotica.
“Children having sex, AI-generated, is a crime,” Nocco said. “But if it is AI-generated erotica, an 11-year-old girl in lingerie used for the intent of sexual purposes, that’s not a crime.”
Nocco points to a case from earlier this year, in which 67-year-old Steven Houser was charged with possession of child pornography.
But Nocco says some of the images, which showed children wearing lingerie in suggestive poses and had been created with artificial intelligence, could not be prosecuted.
“If it’s artificially created erotica, that is not a crime,” Nocco said.
Shortly after Houser’s arrest, Hillsborough County Sheriff Chad Chronister announced that an 18-year-old had been charged with using AI to manipulate women's photos, creating and posting pornographic images.
In that case, the alleged victims were adults, and detectives were able to apply existing revenge porn statutes.
“It’s all the same and carries the same penalties,” Chronister said. “Technology has become available in this state of Florida to use this to the detriment. And we’re gonna do everything we can to protect the victims, not only in Hillsborough County but throughout the entire state.”
Nocco said that under current law, it's also difficult to prove whether a real victim exists or whether the images are entirely AI-generated. So law enforcement wants to shift the burden of proof to defendants, who would then have to prove the images aren't real.
Law enforcement agencies anticipate that any changes in the law will be challenged by free speech advocates and may ultimately reach the U.S. Supreme Court.
That means it could take months or even years for the law to catch up with technology that's moving much faster.
“There’s probably parents that are furious and listening to this and saying this is B.S. I agree 100%,” Nocco said. “These people are evil. They are predators. We need to get rid of them.”