Man facing multiple felonies for AI-generated illicit images of underage girls: police

AUSTIN, Texas - Austin Police have arrested a man who they say posted AI-generated illicit photos of female juveniles on X.
Police said they’ve seen an increase in this type of crime in recent years.
Jack Bullington case

Jack Bullington (Austin Police Department)
What we know:
Austin Police said 19-year-old Jack Bullington paid someone overseas to alter photos of teen girls, younger than 18, in a pornographic manner. Bullington then posted them on X.
Austin Police have identified 11 victims in this case, many of whom Bullington apparently knew.
Court records revealed Bullington had nearly 100 altered images.
Police said he’s been arrested before for harmful display to a minor and harassment.
Bullington faces 10 charges of possession and promotion of lewd visual material depicting a child, one charge of possession of child porn, and one charge of promotion of child porn.
Bullington has bonded out of jail on the conditions that he have no social media, continue therapy, take his medication, and have no contact with minors.
Artificial intelligence and child exploitation
Dig deeper:
With artificial intelligence, anyone can manipulate a photo into something it’s not.
"I would say that the bounds of a criminal mind are endless, so any picture that somebody can get off of social media, they can alter in AI," APD Child Exploitation Unit Sergeant Russell Weirich said.
"We do have tools that we use to identify them," Sgt. Weirich said, "It is very difficult and it’s getting increasingly difficult to tell, but the victims are what really tell the tale, because they can tell you exactly where they were and when."
"Many widely available generative AI tools can be weaponized to harm children," National Center for Missing and Exploited Children Jennifer Newman said.
Newman said a service through NCMEC called "Take It Down" helps remove nude or sexually exploitative imagery online.
"Perhaps your picture that you sent to someone is being threatened to be posted, maybe a generative AI photo of you is posted, even if you're unsure whether that image has been shared, we want to help and try to remove it, and this service can help," Newman said.
Newman said this is an increasingly common, disturbing trend.
"In 2024 alone, NCMEC saw a 1300% increase in cyber tip line reports that involve generative AI technology, going from 4,700 reports in 2023 to over 67,000 reports last year," Newman said.
In 2023, lawmakers amended Texas' child pornography possession law to include AI-modified images. Sergeant Weirich said it has been helpful.
"Because there was a lot of this stuff going on that wasn’t technically illegal. It was really in poor taste and it was terrible for the victims, but we weren’t able to do a whole lot," Sgt. Weirich said. "We dealt with it the best we could until now we’re really starting to get traction on some of this legislation that we’ve been able to use and it really helps our victims and gives them some respite it what they’re able to do and start healing and getting over that trauma."
The Source: Information in this report comes from reporting/interviews by FOX 7 Austin's CrimeWatch reporter Meredith Aldis.