• papertowels@lemmy.one
    1 year ago

    So this does bring up an interesting point that I haven’t thought about - is it the depiction that matters, or is it the actual potential for victims that matters?

    Consider the Catholic schoolgirl trope - if someone of legal age is depicted as being much younger, should that be treated in the same way as this case? This case is arguing that the depiction is what matters, instead of who is actually harmed.

    • Honytawk@lemmy.zip
      1 year ago

      How I see it: creating fake child porn makes it harder for authorities to find the real ones.

      • papertowels@lemmy.one
        1 year ago

        That’s a good point. On the flip side, I remember there was a big push a few years ago to flood the rhino horn market with fakes. I can’t find anything on how that turned out, but I wonder if it could have a similar effect here.

    • BreakDecks@lemmy.ml
      1 year ago

      In America at least, people often confuse child pornography laws with obscenity laws, and they do end up missing the point. Obscenity laws are a violation of free speech, but that’s not what a CSAM ban is about. It’s about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can’t consent, so even the distribution or basic retention of this content violates a child’s rights.

      Which is why the courts have thrown out lolicon bans on First Amendment grounds every time they’ve been attempted. Simulated CSAM lacks a child whose rights could be violated, and it generally meets all the definitions of art, which is protected expression no matter how offensive.

      It’s a sensitive subject that most people don’t see nuance in. It’s hard to admit that pedophilia isn’t a criminal act by itself, but only becomes one when an actual child is made a victim, or a conspiracy to victimize children is uncovered.

      With that said, we don’t have much of a description of the South Korean man’s offenses, and iirc South Korea has laws similar to the US on this matter. It’s very possible he was modifying real pictures of children with infill, or training models on pictures of a real child to generate fake porn of that child. That would introduce a real child as a victim, so it’s my theory about what this guy was doing. Probably on a public image generator service that flagged his uploads.