Disney Child Star ‘Broke Down in Tears’ After AI Was Used to Create Abuse Images

A Disney Channel child star told Sky News that she “broke down in tears” after learning that a criminal used artificial intelligence (AI) to generate sexual abuse photos of her face.

Kaylin Hayman, 16, arrived home from school one day to a phone call from the FBI. An investigator informed her that a man living thousands of miles away had sexually abused her without her knowledge.

The investigator explained that Kaylin’s face had been superimposed onto photographs of adults committing sexual acts.

“I burst into tears when I heard,” Kaylin says. “It feels like an invasion of my privacy. It doesn’t feel real that someone I don’t know can see me in this way.” Kaylin appeared in the Disney Channel TV series Just Roll With It for several seasons and was targeted alongside other child performers.

“My innocence was just stripped away from me in that moment,” she tells me. “As a 12-year-old girl, I was heartbroken by those images. I felt very lonely because I had no idea this was a global crime.”


Kaylin’s experience is far from rare. According to the National Center for Missing and Exploited Children (NCMEC) in the United States, there were 4,700 reports last year of photos or videos of child sexual exploitation created using generative AI.

AI-made child sex abuse photographs are becoming so lifelike that police experts must spend countless, upsetting hours determining which images are computer-created and which involve actual, live victims.

That is the role of detectives such as Terry Dobrosky, a cybercrime specialist in Ventura County, California.


“The material that’s being produced by AI now is so lifelike it’s disturbing,” he says. “Someone may be able to argue in court, ‘Oh, I thought that was genuinely AI-generated. I didn’t think it was a real child, so I’m not guilty.’ It’s weakening our existing laws, which is profoundly concerning.”

Sky News was allowed rare access to the Ventura County cybercrime investigative team’s nerve center. Mr. Dobrosky, a District Attorney investigator, shows me some of the discussion boards he monitors on the dark web.


“This individual right here,” he continues, pointing to the computer screen, “goes by the name of ‘love tiny girls’…” The user is commenting on how the quality of AI imagery is improving. Another person expressed satisfaction with how AI has helped his addiction, and not in a way that will help him overcome it, but rather feed it.

Artificial intelligence is being used to create and consume abusive imagery, and this is not limited to the dark web. In schools, children have taken photos of their classmates from social media and used AI to superimpose them onto naked bodies.

Five 13- and 14-year-olds were expelled from a Beverly Hills, Los Angeles school while a police investigation was under way. However, in some jurisdictions, including California, using artificial intelligence to make child sex abuse images is not currently a felony.

Rikole Kelly, Ventura County’s deputy district attorney, is attempting to change that with a plan to submit new legislation. “This is technology that is so accessible that a middle schooler [10 to 14 years of age] is capable of exploiting it in a way that they can traumatize their friends,” she explains. “And that’s concerning, because this is so accessible and, in the wrong hands, it can cause irreparable damage.”


“We don’t want to desensitize the public to the sexual abuse of children,” she adds. “And that’s what this technology, used in this way, is capable of doing.”
