A 13-year-old girl at a Louisiana middle school got into a fight with classmates who were sharing AI-generated nude images of her

The girls begged for help, first from a school guidance counselor and then from a sheriff’s deputy assigned to their school. But the images were shared on Snapchat, an app that deletes messages seconds after they’re viewed, and the adults couldn’t find them. The principal had doubts they even existed.

Among the kids, the pictures were still spreading. When the 13-year-old girl stepped onto the Lafourche Parish school bus at the end of the day, a classmate was showing one of them to a friend.

“That’s when I got angry,” the eighth grader recalled at her discipline hearing.

Fed up, she attacked a boy on the bus, inviting others to join her. She was kicked out of Sixth Ward Middle School for more than 10 weeks and sent to an alternative school. She said the boy whom she and her friends suspected of creating the images wasn’t sent to that alternative school with her. The 13-year-old girl’s attorneys allege he avoided school discipline altogether.

    • Riskable@programming.dev · 2 months ago

      The article states that the police investigated but found nothing. The kids knew how to hide/erase the evidence.

      Are we really surprised, though? Police are about as effective at digital sleuthing as they are at de-escalation.

    • klugerama@lemmy.world · 2 months ago

      What? RTFA. Two boys were charged by the Sheriff’s department. They didn’t face any punishment from the school, but law enforcement definitely investigated.

    • pelespirit@sh.itjust.works · 2 months ago

      When the sheriff’s department looked into the case, they took the opposite actions. They charged two of the boys who’d been accused of sharing explicit images — and not the girl.

    • juko_kun@sh.itjust.works · 2 months ago

      I mean, law enforcement doesn’t have enough resources to go after people making real CP.

      What makes you think they can go after everyone making fake CP with AI?

      • Phoenixz@lemmy.ca · 2 months ago

        They do have resources, especially in the US. They do go after real CP, and people go to jail for it on a near-daily basis.

        This, too, could have been investigated better, which is kind of the point of the article.

        Why are you so okay with child pornography? Checking your message history shows you’re completely fine with CP, yet you really have it out for the victim.

    • troglodytis@lemmy.world · 2 months ago

      Correct. They will not investigate it further than threatening the victims with prosecution. The goal is that the victim doesn’t pursue it further.

      They don’t know how to properly investigate it, and they aren’t interested in learning. They see it as both ‘kids being kids’ and ‘if this gets out it will give our town a bad name’.

      I’m glad the kid and her family aren’t letting this go!