A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her
The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.
Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.
“That’s when I got angry,” the eighth grader recalled at her discipline hearing.
Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.
I mean, it’s dissemination of CSAM. He’s getting worse than a beating.
Since the offender is a child themself, it isn’t going to be treated as severely.
No, they usually are. Even kids privately sharing their own nudes with their SO have the book thrown at them; these kids are making CSAM of a third party and distributing it across the school.