For one Lancaster Country Day School teen, the fake nude images police say two classmates made of her felt like a personal violation that cut deeper than the unauthorized use of her photos of happy memories.
It was a betrayal by a friend.
One of the two boys now charged with creating 347 artificial intelligence-generated nude images of her and 58 other girls had been her friend for years. Forty-eight of the girls are students at Country Day.
"It just feels like he didn't value me at all," she said.
LNP – LancasterOnline spoke to the student, her mother and the mother of a second Lancaster Country Day School student depicted in the images about their experience at the epicenter of a deepfake scandal that rocked the Manheim Township private school.
LNP – LancasterOnline does not, without their permission, identify by name people who report being sexually assaulted. For this story, the student who spoke to LNP – LancasterOnline is referred to as student one, her mother as mother of student one, and the mother of the second student as mother of student two.
Police say the two boys took images of the girls primarily from the girls' social media profiles and used AI to alter their faces and make them appear atop nude bodies. The two boys – who haven't been named by authorities because they're under 18 – are facing multiple charges, including sexual abuse of children. Student one hasn't spoken to the boy she was friends with since it came to light he was accused of creating some of the nude images.
If she were to speak to him today, she'd have one question: Why?
According to police, the two boys exchanged hundreds of the AI-generated images with one another in a private chat room on the social media platform Discord.
Student one said she's not sure why the two boys chose the 59 girls and one adult as targets, but the mother of student two said the sheer number of affected girls makes it seem as though they used a "broad brush."
'You've allowed this to happen'
Student one and both mothers believe the number of students affected and images created could've been smaller if school employees had notified authorities of an initial allegation against one of the boys sooner.
Though a Country Day student reported through Safe2Say Something in November 2023 that a classmate had created nude deepfakes of several female peers, a criminal investigation didn't begin until late May, when a parent of one of the girls depicted in the AI-generated images notified police. Safe2Say Something is an online platform run by the state Attorney General's Office where students, staff and parents can anonymously submit tips and concerns about potential youth violence.
Once a tip is submitted to Safe2Say, it's vetted and sent to school administrators and/or law enforcement. According to Country Day's 2024-25 school handbook, administrators on the school's Safe2Say response team include school counselors, the assistant head of school, head of school and division heads, such as the head of the upper school.
From November 2023 to the start of the criminal investigation in May, images continued to be created. The school didn't notify law enforcement and didn't file a ChildLine report until June 4 – after the criminal investigation had already begun.
"There were some major red flags here," said the mother of student two. "All of this could have been avoided... as a mom you're like 'you've allowed this to happen.'"
School employees are mandated reporters who must alert ChildLine – a 24/7 hotline run by the state Department of Human Services – of suspected child abuse. Some parents say Country Day could have reduced the number of images created and girls affected if it had immediately alerted ChildLine or law enforcement in November.
Lancaster County District Attorney Heather Adams determined the deepfake incident wasn't a reportable offense under the Child Protective Services Law. More than half of the girls depicted in the AI-generated images and their parents aren't placated by that determination, though, and plan to continue pursuing legal action against the school.
Navigating the trauma of sexual abuse
As a mental health counselor and director of forensic mental health at the Keck Human Rights Clinic at the University of Southern California, Kristen Zaleski works directly with victims of artificial intelligence-generated deepfake nude images. Although she is not involved in the Lancaster Country Day School case, she recently spoke with LNP – LancasterOnline in general terms about the impact these images have on those she treats.
What is the impact of being depicted in AI-generated nude images?
Deepfake imagery of nude bodies created without the individual's consent is a form of sexual violence, Zaleski said, and can have a life-altering traumatic impact.
"Digital sexual violence, even though their body may not have been touched in real life... that ruins careers, that ruins relationships, and it creates a stress response that results in a traumatic reaction," Zaleski said.
A victim of digital sexual violence under the age of 18 may experience even further stress to their nervous system and body as they're at an age where self-consciousness can be high and when many are working to understand their place in the world, Zaleski said.
What are common challenges for victims of AI-generated nude images?
Because artificial intelligence is a new concept for many – including some legislators and members of law enforcement – Zaleski said victims often feel dismissed and minimized.
"For the victim themselves, child or adult, this is a lifelong trauma," Zaleski said. "If it gets uploaded to the Internet, you lose control over it. And I think this is a fundamental distinction that a lot of policymakers and police officers don't really understand."
Even therapists are just starting to understand the impacts that AI-generated nude images can have on their patients as an increasing number of victims come into their offices, Zaleski said. And this digital violence, she said, is different from other types of trauma therapists encounter in their clients.
"When you're a victim of a rape, there's a start and an end time," Zaleski said. "When you're a victim of digital sexual violence, there's a start time, but it never ends."
What does treatment look like for victims?
Of all the forms of sexual trauma Zaleski has helped victims with in her two-decade career, she said digital sexual violence is the hardest form of trauma to recover from because "there's no end to the suffering and trauma."
Zaleski said the most successful treatment for victims of digital sexual violence is validation and helping the victim to see they should take no blame for the violence.
Society blames victims "for giving an image, for overreacting, for making this a big deal, all of those things," Zaleski said. "So in therapy, I really just want to validate all of the truths that this is a big deal, that it was not their fault."
Finding a group of sexual violence survivors can also be beneficial for victims, she said, as it helps to have others who can understand their trauma.
And, similar to coping with other incidences of sexual violence where one might feel frozen or victimized, Zaleski said it's important to help the client achieve movement or action. For example, with digital sexual violence, where advocacy and understanding are often lacking, she said survivors turn to advocacy in their schools and in county, city or state government.
"We don't want this to happen to any other child," said the mother of student one. "Policies and procedures need to change, resources need to change so that this doesn't happen to somebody else's child because I wouldn't wish this on my worst enemy."
Families affected by the AI images asked the school for attorney-run mandated reporter training for all board members, leaders, faculty and staff; retention of a third-party full-time safety officer; and hiring of an information technology forensics firm to locate any existing images.
'A scary feeling'
The girls who were depicted in the AI-generated images aren't entirely sure who has seen the images or where they've been shared online. And, once online, even if deleted, the images may never truly be erased. The mother of student two said Country Day hasn't been up front with the girls about whether and how the AI-generated images have been deleted online.
"Obviously I hope it doesn't resurface but if it did, that's a scary feeling that I think all the victims have to live with forever," said student one. "They're either found or they're gone."
Her mother is particularly worried about the images resurfacing at important milestones in her daughter's future, like when she goes to college or starts a new job.
"All of a sudden these things pop up again and no one knows – are they real or are they not real," she said. "The most devastating part is what is the impact long-term on her life as a result of the failure of the school to report this."
Student one learned from a friend in May – months after the AI-generated images had been created – that they existed. She said she was upset and angry. Her mother recalls picking up her crying daughter from school that day. Student one didn't see the images herself until November.
That intense emotion subsided over the summer months, only to return when the students had to confirm with Susquehanna Regional Police Detective Laurel Bair that they were pictured in the AI-generated images.
"When all the girls had to go into the detective's office, it rekindled all those emotions and made me upset again," student one said. "It was both upsetting and relieving because at least I knew what they looked like."
Student one recalls stepping into the room where, on the table, sat a foot-tall stack of papers. Each girl had her own folder of images contributing to the stack. When Bair pulled out student one's file, she said it was "book-thick."
"I was probably one that had close to the most" images made, she said.
The mother of student two said reality set in for her daughter in the detective's office.
"Sitting in that room and looking through the pictures, it felt like such a violation of memories that your kid has and the pictures that your child put on her Instagram and then all of a sudden to see them – a happy memory portrayed in a horrible way," said the mother of student two while tears welled up in her eyes. "That doesn't leave you. It really doesn't."
And AI doesn't make it any less real for the students.
"It's you but with additions" that look like it could be the girl's body, said the mother of student two.
"It just feels like the deepest violation that took place," she said.
'Stronger together'
Over the months, as the girls have processed the violation of their privacy and, in some cases, friendship, student one said her classmates have had panic attacks so intense they had to leave class. Some have become depressed or stopped eating, she said.
The AI-generated images haven't affected student one quite as deeply. So far, she said, she hasn't sought counseling.
"I'm not a very self-conscious person," she said. "I don't really care. ... I know they're fake. It makes me upset to know other people might not know they're fake but as long as I'm content with knowing that they're fake. I mean, what's out there is out there. I can't really do anything about it."
But she and her classmates have leaned on each other for support.
"We share this bond that's obviously over a horrible event, but it made us stronger together," she said.
In May, when only a handful of girls had been identified in the AI-generated images, she said they were brought into meetings together and counseled by the administration together, and soon, even if they hadn't known each other before, they were close.
"It was just normal to go up to someone that was another victim if they looked upset or if you were upset and you would just be able to talk even if you've never talked to them," she said.
Teachers and students who hadn't been impacted stood by them, too.
"Teachers are very clearly behind the students," said student one.
On Nov. 8, some teachers joined most of the 225 upper school students in walking out of class, chanting "Hear us. See us. Acknowledge us." as they circled the Country Day school building.
"It was comforting to me and a lot of other victims to know that other students not necessarily understood but were willing to go against our school and our admin... to know that they're people behind you that really cared," said student one.
A day of community building that followed the walkout was poorly attended by students, she said, as most saw it as a "last resort" by the administrators who organized the event. Until the walkout, she said, it didn't seem like the administration cared about the girls pictured in the AI-generated images.
"The girls have found their voices in supporting each other and knowing what's right and what's not right," said the mother of student two. "For kids this age to be able to definitely say 'that was unacceptable,' the way it was handled was unacceptable' and we're going to band together... It shows how strong these kids are."
Student one's friend group has also banded together against her former friend who is accused of creating some of the images.
"We don't talk to him," she said. "He's not our friend anymore and obviously the girls in my friend group – especially the ones who are victims – we're just done, like never speak to us again."
In June, when only one of the two boys was suspected of creating the images, the boy who is now student one's former friend wasn't under suspicion. During that time he continued to text her and even visited her house until early November. By mid-November, she said, she learned he was involved, but police didn't publicly confirm that two students had created the images until December.
"That was pretty insane to me because I feel like if someone does something like that – how can you just go about your life," she said. "I guess he probably thought he wasn't gonna get caught."
She said she hopes both boys, who are the same age as her, face consequences for their actions, and receive some kind of therapy because "it takes a lot of emotional instability and mental instability to do this to so many girls."
"This is going to affect the victims for the rest of their lives so it should at least affect them for the rest of their lives," she said.
Where does the Lancaster Country Day School case stand now?
Here's the latest since two juvenile boys attending Lancaster Country Day School were charged with using artificial intelligence to create fake nude photos of 59 girls and one adult. Most of the victims are Country Day students.
The latest
Matthew Faranda-Diedrich, an attorney with Philadelphia-based law firm Royer Cooper Cohen Braunfeld, represents more than half of the 48 Lancaster Country Day School students and their families impacted by the AI-generated nude images case.
Though Faranda-Diedrich had been working toward a settlement in a suit he'd begun against the Country Day School in early November, communication between the two parties has recently deteriorated. As a result, some parents are hoping to work with Nadeem Bezar, an attorney from the Philadelphia-based law firm Kline & Specter – the same law firm that reached a multi-million dollar settlement for a victim abused by former Penn State University assistant football coach Jerry Sandusky.
The parents had first discussed working with Bezar in mid-November but put plans for him to take over the case on hold after Country Day announced both its head of school and board president would be leaving their positions.
What we know about the accused
Earlier this month, the Lancaster County District Attorney's Office filed charges against two boys in connection with the creation of AI-generated nude images of a total of 59 girls under 18 and one adult.
The DA's office has not released information about the two boys since filing the charges. The boys haven't been named because they're both under the age of 18.
Susquehanna Regional Police Detective Laurel Bair, who handled the investigation into the AI-generated nude images, hasn't responded to requests for comment since charges were filed.
What charges do the accused face
Both boys are facing the following charges filed through the Lancaster County District Attorney's Office:
– One count of criminal conspiracy
– 59 counts of sexual abuse of children
– 59 counts of dissemination of photographs
– 59 counts of possession of child pornography
– One count of dissemination of obscene materials to minors
– One count of criminal use of a communication facility
– 59 counts of possession of obscene materials depicting a minor
– One count of possession of obscene materials
What is the next step
The boys will face the charges in juvenile court. Cases handled in juvenile court, which focuses on rehabilitation and includes supervision until age 21, generally are not a matter of public record. – LNP, Lancaster, Pa./Tribune News Service