There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being’s innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone’s face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

  • Veraticus@lib.lgbt · +186/-6 · edited · 9 months ago

    I’m only going to answer the first part of your question, not the AI/generated part.

    No one really chooses what or who they’re attracted to; it kind of just happens to you. For example, you might be watching a TV show and someone gets lightly, comically spanked… and suddenly a light bulb goes off above your head and you think, “whoa, that might actually be kinda fun.” People are wired in ways we don’t understand to want things we don’t even know we want.

    To that extent, pedophiles are themselves victims of their own desires; there’s no “logic” behind it. It’s simply an urge they experience.

    Of course that doesn’t make succumbing to this urge excusable, and any children who are impacted are of course victims and the pedophiles, predators. But no one is training pedophiles in pedophile camp. It’s just humans being human, unfortunately.

    • HappycamperNZ@lemmy.world · +86/-2 · 9 months ago

      This is something many people fail to realize - while society hates that it exists, it is just an urge, the same as my desire for women. We have just grown as a society to say this isn’t right (correctly). There are many who have the urge and don’t follow up on it, but it’s still there, and they are victims as well.

      Fully agree though, this does not excuse those who act upon it, promote it or sell it.

      • CaptainEffort@sh.itjust.works · +35/-2 · 9 months ago

        Unfortunately I think it’s probably in the same vein as any fetish or preference, so completely out of their control.

        Obviously people who act on it are the scum of the earth, but those who simply battle with the urge I have nothing but sympathy for. I can’t even imagine how horrible it is to have to deal with that daily and never be able to do anything about it, or even really talk to anyone about it.

          • CaptainEffort@sh.itjust.works · +29/-2 · edited · 9 months ago

            Acting on it is NEVER out of their control.

            As someone who doesn’t have to permanently stifle my desires for the entirety of my life, I’m not about to assume that. I have no idea the toll that could take on someone mentally.

    • Fredselfish@lemmy.world · +56/-2 · 9 months ago

      I have heard that kids who are molested, or who end up fooling around at a super young age, can grow up wired to be attracted to young kids or teens.

      The real issue no one wants to address is that people who have these desires and know it’s wrong have nowhere to turn for help.

      Even if they haven’t abused anyone, coming out and telling a therapist or anyone else that you are attracted to kids can probably get you locked up. There are those who never offend, but a lot do, either because a) they accept what they are and have no moral objections to it, or b) they can’t get the help needed to fight the urges and end up offending.

      As @Veraticus said, there is no easy answer because it’s not a choice. It would be like asking you why you like women, or why people are gay. They’re wired that way, and unfortunately I don’t think you can cure it.

      We definitely need to address access to any kind of porn of it, and if someone offends we must lock them away for their own good. Not prison, but somewhere they can be mentally evaluated.

      • Instigate@aussie.zone · +30/-2 · 9 months ago

        There is definitely a link between having experienced sexual abuse as a child without any therapy or counselling to help them make sense of it and then later on sexually abusing other children, but it’s not super clear-cut and definitely not predictable.

        • Fredselfish@lemmy.world · +18/-2 · 9 months ago

          Yes, same as how girls who are raped or molested can become promiscuous, but that doesn’t mean all girls in that situation will. It’s definitely why we need better sex education in America, so we can teach kids the signs of an adult being inappropriate and let them learn about their own bodies.

      • CeruleanRuin@lemmings.world · +5/-19 · edited · 9 months ago

        Acting on it is ALWAYS a choice. I don’t really give a fuck how a person is wired if they choose to exploit children in any way, including by possessing sexual imagery of kids.

        Therapists are reachable through a simple internet search. They’re not going to lock you up if you haven’t ACTED on it. Don’t give me that shit about “they couldn’t get treatment, so of course they had to look at kiddie porn” or tell me they have “nowhere to turn”. Bullshit.

        • Fredselfish@lemmy.world · +4 · edited · 9 months ago

          There are countless records of just that, and even if there weren’t, what do you think would happen to your life if you came out and said you are a pedophile?

          You think you’d keep your job, your house? You think you wouldn’t be banned from living near a school? These people live in secret, even the ones who don’t offend, for a reason.

          Therapy can’t do much; it’s not like there’s a cure. I don’t know your preferences, but let me ask: did you choose to be attracted the way you are? If yes, then that means you could choose to be attracted to the other sex. See, your logic doesn’t hold up.

    • XbSuper@lemmy.world · +7 · 9 months ago

      This is one of the most sane responses I’ve ever seen.

      I am one of those poor souls who has these urges, but has never, and will never, act on them.

      I’m willing to open myself to an AMA for anyone interested.

    • DingoBilly@lemmy.world · +7 · 9 months ago

      This is the most accurate answer, and the fact it’s all cultural/social is quite important as well.

      If you were born a few thousand years ago it may be completely reasonable to sleep with a kid. Hell the kid is probably your slave so you could literally do whatever you want with them.

      But just as I don’t understand certain fetishes or even just people attracted to the same sex, others won’t understand why people would be attracted to kids.

      • HelixDab2@lemm.ee · +3/-1 · 9 months ago

        If you were born a few thousand years ago it may be completely reasonable to sleep with a kid.

        That simply is not correct. While kids have been having sex with kids since, like, forever, looking back to medieval Europe, or even Rome, it was not normal for adults to be having sex with children, especially prepubescent children. Marriage ages for people that weren’t nobility (e.g., getting married off to solidify political alliances) were typically early 20s. Children getting pregnant has always been a very dangerous proposition, since it’s much more likely to lead to maternal mortality.

        Part of the confusion about all of this is that we think people used to live compressed lives, since we see average lifespans in the 40s. But that’s not taking into account the ferocious infant mortality rates; if you survived into your late teens, you were probably going to make it to your late 60s or early 70s, even in the Middle Ages. We think that, since ‘average’ life spans were 40-odd years, people must have been marrying very young, but the evidence doesn’t really bear that out.

      • Firipu · +2/-7 · 9 months ago

        Lol, did you really compare same sex attraction to pedophiles there? “I don’t understand both”… Wtf dude.

        • DingoBilly@lemmy.world · +1 · 9 months ago

          You do realize that up until a few decades ago in most countries (and even today in some western countries) homosexuality was seen as immoral and illegal?

          So yes, it’s very comparable in that it’s highly socially and culturally defined.

    • Hjalmar@feddit.nu · +1 · 9 months ago

      I listened to an amazing podcast about this a while ago. It was some science dude helping people not to be attracted to children. If somebody wants to have a listen I can probably find a link, but the podcast was in Swedish

    • DLSchichtl@lemmy.world · +1 · 9 months ago

      Add to that the fact that the age of consent and the morality around it are relatively new. For a lot of history, puberty meant ready. I am certainly not disagreeing with AoC laws, but I do find it funny that you could probably go back one or two hundred years and this whole concept would be largely foreign.

  • Deestan@lemmy.world · +48/-3 · edited · 9 months ago

    To answer some of the questions:

    I cannot understand the attraction to kids.

    There was a TV interview with people who were seeking help for pedophilia. They described it as just plain horny sexual attraction that they knew they had to not act on. I guess people have different reasons, and some probably manage to rationalize it as “relationships” as you say.

    Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    Whether it is a modified image of a real person or a purely generated picture, it will fall under the same laws on depicting it, which is already uncontroversially illegal.

    How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

    Hard, as there are many ways to describe nudity or encourage the generator to weigh towards nudity. “Person with visible thighs, no skirt” and such.

    Easier to leave nudity out of the training data, which is already common.

    Then it gets hard again, because anyone can throw together a new image generator trained on whatever they want, with no word filters.
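
    To make the word-filter idea concrete, here is a rough sketch of that kind of prompt blocklist (the terms and the function name are invented for illustration, not taken from any real app); note how the indirect phrasing above sails straight through:

    ```python
    # Rough sketch only: a made-up blocklist filter of the kind described above.
    # The terms and the function name are illustrative, not from any real app.
    BLOCKED_TERMS = {"nude", "naked", "topless", "nsfw"}

    def prompt_allowed(prompt: str) -> bool:
        """Reject prompts containing a blocklisted word (easy to paraphrase around)."""
        words = set(prompt.lower().replace(",", " ").split())
        return words.isdisjoint(BLOCKED_TERMS)

    print(prompt_allowed("a person at the beach"))                 # True
    print(prompt_allowed("a naked person at the beach"))           # False, caught
    print(prompt_allowed("person with visible thighs, no skirt"))  # True, slips through
    ```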

  • Dame @lemmy.ml · +37/-5 · 9 months ago

    I see people here attempting to equate it with a natural attraction and/or a fetish. We all have internet access and can look it up: both pedophilia and pedophilic disorder are in the DSM-5 and ICD-10, even in this more advanced age of medicine, science, and society. If it were natural, I believe the corrections would have been made or strongly advocated for; they haven’t been. What is advocated is using terminology correctly and encouraging those who experience this to feel comfortable telling their truth and seeking help. I believe some of your comments are very dangerous, and some of the upvotes and downvotes are concerning and make it difficult to tell whether you are in support of protecting children. The point is, please don’t just blanket-label it and compare it to things that aren’t harmful or illegal and that are consensual.

    • HelixDab2@lemm.ee · +2 · 9 months ago

      If it was natural I believe the corrections would’ve been made and or strongly advocated for, it’s not.

      You mean corrections to the DSM? There are lots of things that are natural and have organic causes that are still in the DSM, and will remain in the DSM, because they’re problems for the individual and society. Homosexuality was removed because it was recognized that acting on that sexual orientation, with another adult that had the same sexual orientation, was not a problem for society or the individuals. You can’t do the same with child sexual abuse; children can’t consent, legally, morally, or ethically.

  • VelvetStorm@lemmy.world · +31/-2 · edited · 9 months ago

    Edit: Also, don’t be sexist; call it what it is, rape, not seduction. Just because it was a woman doesn’t make it not rape. Calling it anything else does a disservice to all of the male victims of female-on-male rape.

    If the AI porn depicts real-life underage people, then it should be (and in some places is) illegal. Now, I don’t like it and find it reprehensible, but if it’s not depicting real-life people then it should be legal.

    I think if you are into that, you should be able to seek professional help without fear of it ruining your professional and personal life, but if you are attracted to kids, it is your moral responsibility and obligation not to work with or be around kids.

    This is a mental issue and it should be treated like one and we should be trying to understand it and find ways to prevent and treat it.

    • SnausagesinaBlanket@lemmy.world (OP) · +10 · edited · 9 months ago

      I was reading this, and it reminded me of how a dude in Australia, I believe, bought an underage sex doll. It got flagged somehow, and the government arrested the guy when it arrived. I have no idea what happened to him.

      Was this guy trying to control his urges by using that hunk of rubber, or is that a crime too? This is a very edgy area, and I am thinking some countries won’t bother with these cases, while in others they might incur the death penalty.

  • HelixDab2@lemm.ee · +28/-1 · 9 months ago

    There are multiple parts to your question. I’m going to try to break it down.

    First, there’s a difference between a pedophile and a child molester. Pedophilia is a sexual attraction to children, but it does not, by itself, require the person to take action. A child molester is a person that sexually assaults children. It’s the difference between being heterosexual, and being a rapist; you can be straight and still be entirely celibate.

    Child molesters may not be sexually attracted to children at all; some might be, but people that commit rape aren’t usually doing it solely for sexual gratification, although sex is definitely part of it.

    We don’t know how common pedophilia is because of how heavily stigmatized it is.

    You don’t understand how a person could be sexually attracted to children; the simplest way to explain it is to ask if you can understand how a man can be sexually attracted to another man. IIRC, most research indicates that pedophilia probably is a sexual orientation, much like being straight, gay, or bisexual is (except that there is no moral or ethical way for a pedophile to have a sexual or romantic relationship with a child; that is always both predatory and criminal). Do pedophile child molesters believe that they’re having a relationship? Some of them, yes. They’re able to delude themselves into believing that the child wants the attention and sex (really sexual assault), when they’re–probably–the one that has groomed the child in the first place.

    I cannot understand the attraction to kids. Even teens.

    I can. When I was a child, I was sexually attracted to my peers. 14yo kids are having sex with each other, so clearly they’re attracted to each other. As an adult, I can see women in their 20s as being sexually attractive, while still having zero interest in them (y’all seem really young, and not in a good way, if y’know what I mean). Sexual maturity isn’t a magical thing that happens when you hit 18 (or whatever the age of consent is where you live); it’s a sliding scale.

    Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    I don’t think that you can make a person into a pedophile, any more than you can make a person gay. A person either is, or isn’t, a pedophile, and CG CSAM isn’t going to change that. So the question is, does CG CSAM make it more likely that a pedophile will end up sexually abusing a child? My intuition says that it will not, in the same way that the proliferation of pornography has not made sexual assault of adults more common. (Some research indicates that the availability of pornography has decreased rates of sexual assault.) Child pornography is illegal–in part–because it cannot be produced without causing real harm to children. CG CSAM doesn’t cause real harm to any person though; unless there’s evidence that it increases the rates of child sexual abuse, I don’t think that the squick factor is a reasonable basis for banning it. OTOH, adult pornography has generally led to a relaxation of sexual mores and norms–which I believe is a generally positive thing–and it’s possible that CG CSAM would normalize child sexual abuse sufficiently that libertarians would be able to severely weaken age of consent and statutory rape laws. I don’t really know, TBH; I’d want to see more research rather than reflexively banning it.

    • MystikIncarnate@lemmy.ca · +3 · 9 months ago

      Articulate and concise. I like it.

      I don’t really have anything more to add, except that we as a society need to make up our collective minds on cg and AI csam/CP soon.

      I definitely think that cg/AI content is less bad (still bad, but less so) because there’s no harm being done to real children from it; but those AI image engines needed to be trained on some form of content to be able to generate the images that they do, so I’m not sure how the training images factor in, or what would even be used for training that AI… I know rather little about how AI is trained at the moment, so I’m not sure if it can be done without source csam material or not… IMO, that factors into the morality of the output.

      I definitely agree that sexual preference toward minors (aka paedophilia) is just that, a sexual preference, and that it, in and of itself, does not make someone a sex offender, in the same way that being heterosexual doesn’t make you a rapist or any other kind of sexual “deviant” (or however you want to say that). It’s interesting to me to think that paedophiles may have a semi-legal way of getting porn for themselves (which causes no harm to children). I feel a bit bad for paedophiles in that they’re basically forced to have relationships with persons they don’t find very sexually attractive, or else they break the law. Not bad enough that I think the laws should change or anything; it’s just a crap situation. It would be like having a preference towards men, as a man, in a world of heteros. The men are there and you’re interested in them, but none of them are interested in you. In reality that’s almost never the case; there are other homosexual men who exist, no matter how rarely… but in the case of pedos, there are exactly zero underage people they can interact with at all sexually. I still don’t think that should change, but at least with the internet, a gay man can go and find porn that interests him. For pedos it is literally a crime to even look at, possess, or make any porn that appeals to them.

      I can sympathize with the impossibility of their situation, that’s all. For the record, I’m just some cis male with no interest in anyone too young to date. I can recognise their attractive qualities without being attracted to them (speaking mostly about those who have reached their sexual maturity here, but who are still not 18 or whatever)… I can understand it, and I’m not so hateful as to want anyone who feels attraction to young people to die or anything, but young people don’t have the experience to understand the situation they’re getting into, or when they’re being misled or gaslit, etc. (though to be fair, a lot of “mature” people seem not to know either, but that’s another discussion)… Fact is, they’re shit out of luck.

      I’m sure many are forced into celibacy just to be lawful. I don’t think any grown adult wants to be forced to be celibate; so I can understand the plight. AI/cg porn, tailored to that specific preference may give pedos an outlet that they can utilize to temper their urges and keep them on the right side of the law here. Of course it won’t solve the problem entirely, the same way that rapists are still a thing, but it may severely reduce illegal activity and harm to children.

      But I agree, it’s a slippery slope (so to speak) because it can easily evolve into lowering the age of consent, and bringing back child marriages and such. Which IMO, isn’t a desired outcome. I also don’t think that content should intermingle with either social networks or existing porn sites, since it’s so specific, it should be relegated to specific sites and not left flapping around the internet. It’s also a vast minority of people that are afflicted, so segregation may be a minimum measure to keep things somewhat clean. I know I don’t want AI generated CP content mixed in with my usual porn browsing… I’m sure there’s plenty of people in the same boat, so IMO that’s a minimum. But I’m only one voice in the society, so I don’t make the decision; I’m interested to see what decision is finally made and implemented, whenever we get there.

      As a disclaimer: I’m not attracted to underage people. I’m also not a doctor or scientist, or psychologist, or anything else. I’m not in favor of anything here besides society making a decision, and I’m just positing that it could be beneficial to society as a whole. I welcome other opinions, except those from people who are heavily religious. Good day.

      • HelixDab2@lemm.ee · +3/-1 · 9 months ago

        I feel a bit bad for paedophiles in that they’re basically forced to have relationships with persons that they don’t find very sexually attractive, else they break the law.

        I’ve seen a paper–which I unfortunately did not bookmark–that seemed to indicate that most pedophiles were not exclusively pedophiles; many are able to have romantic and sexual relationships with age-appropriate partners. They also tend to have a distinct gender preference for minors (e.g., a person that is a heterosexual and a pedophile will prefer minors that are in-line with their sexual orientation). The ones that are ‘pure’ pedophiles–not sexually attracted to any adults at all–do not seem to have a gender preference, which kinda makes sense when you consider secondary sex characteristics as markers of physical maturity, e.g., young boys and girls look physically very similar aside from the genitals themselves. Again - I don’t have the reference on this saved, so I might be misremembering, or misrepresenting it, but this is what I recall.

        there are exactly zero underage people who they can interact with at all sexually.

        There’s a genetic disorder–I believe exclusively in women–where they don’t ‘grow up’; they don’t get very tall, they’re largely lacking in secondary sexual characteristics, and I believe that they’re infertile. I ran into a woman like that–who was with her partner–at a fetish event. It really gave me whiplash, because at first, second, and third glances she looked like she was 12, at an event that had explicit sexual activity in the open. It took a closer look at her face to realize that she was in her 30s. God bless her, she found someone that was attracted to her, and into the same shit she was into. So, y’know, there’s that.

        • MystikIncarnate@lemmy.ca · +3 · 9 months ago

          I’ve heard of that genetic condition. It’s fascinating, and as far as I know, extremely rare.

          I know that at least one has spoken publicly about her experience. They touched on dating, and the implication was that most of the people who are interested are paedophiles, which didn’t sit well with her; I expect it wouldn’t sit well with most people, especially those with that condition.

          Fascinating information all around. I don’t have a doubt that is accurate.

          • HelixDab2@lemm.ee · +1/-1 · 9 months ago

            She’s got a catch-22 there; if she doesn’t want to date anyone that’s attracted to her because they’re likely a pedophile, then she’s not going to ever be able to have any romantic relationships (assuming that she wants them). I guess if that were me, I’d rather date a person that was sexually and romantically attracted to me–despite knowing that they were also sexually attracted to minors–than live life completely alone.

            • MystikIncarnate@lemmy.ca · +2 · 9 months ago

              That was the take away. She was rather upset about it, which is apparently good for ratings but tragic overall.

              I suppose it depends on what she really wants in life, which I won’t presume to know. I wish her the best, that’s not a fun condition to deal with.

    • Chailles@lemmy.world · +2 · 9 months ago

      I don’t think that you can make a person into a pedophile, any more than you can make a person gay.

      If you really think about it, we’ve seen arguments like that before: that pornography creates rapists, that violent video games create murderers. And that’s just strictly about the consumption of media.

  • cooopsspace@infosec.pub · +26/-3 · edited · 9 months ago

    At the end of the day, art is just pixels on a flat surface. Determining whether a depicted individual is underage where it’s not obvious sets a dangerous precedent. Is the person in the picture 17 or 18? Who knows.

    But the problem is that people have been sexualising people like Emma Watson since she first appeared on screen. That’s not okay, and rather than driving AI art underground, I think society needs to change to normalise education about sex, reproduction and genitalia, and to address the social issues so that pedophilia is treated like the disease that it is.

    Meanwhile, pedophiles’ names are being published, risking mob violence and further isolation. Not to mention that in the US there’s a lot of negative attention being put on women’s reproduction, children’s sex ed and genitalia, and a push to make the whole lot illegal and taboo. And people teach their kids pet names for their parts: “uncle Ben touched my heehaw” sounds a lot different to “uncle Ben touched my penis”.

    Society is a problem, the US particularly is going in the wrong direction on many aspects of sex education.

  • Izzgo@kbin.social · +24/-2 · 9 months ago

    To me there is a clear difference between children, and teens say 16+. It is both morally wrong and unnatural to be attracted to prepubescent children, and this is pedophilia. But basically, by definition puberty makes people become sexually attractive, and it’s natural for adults to be attracted. Still morally wrong to act on those attractions unless you’re in about the same stage of puberty or early adulthood. That’s when we rely on a strong moral code and laws in society to protect youngsters who have recently gone through puberty. And hopefully even after the laws no longer apply, we have enough societal pressure to strongly discourage wide age gaps between sexual partners.

    Pedophilic disorder is characterized by recurring, intense sexually arousing fantasies, urges, or behavior involving children (usually 13 years old or younger).

  • Astroturfed@lemmy.world · +25/-3 · 9 months ago

    Don’t try to understand it. You aren’t going to get a good answer. It’s a horrible mental illness level of sexual preference.

    Anything can be sexualized with enough impulse and experiences. Everyone’s got some weird dark fetish shit. Some of it’s illegal in practice. Normal people bury that shit or only discuss it in therapy. While talking it out so they can hopefully never think about it again.

    I’m sure there’s different answers to this just like “why are there serial killers?”. Just be glad it confuses you.

  • Modern_medicine_isnt@lemmy.world · +21/-2 · 9 months ago

    My assumption on the attraction thing is that there are many things that cause attraction. Guys generally go for younger women. Well, what do younger women have? Tighter bodies, firmer breasts, more fitness, healthier-looking hair and skin, more defined hips… but as we all know, many guys specialize in being attracted to just a few of these things. Well, children usually have healthy skin and hair… so if a guy is attracted to just the attributes that don’t require puberty, I can imagine that, attraction-wise, he might not feel much of a difference. Now mix that in with wanting to feel superior and some of the other things like that, and kids start to fit well. Now add in a high libido and low self-control. Disaster. Take the same guy and add in an attraction to big boobs and he is close to average, because kids don’t have those.
    Basically, it only takes a few missing screws. As for women: teenage boys have a lot of sexual energy and passion. I can imagine that being attractive. Plus there is, of course, the taboo of it that appeals to women just like any other kink. Put them in a space where they aren’t getting their needs met by men, and give them access to boys. Disaster again. In the end, the diversity of humans means there will always be someone into anything you can imagine.

    • Not_Alec_Baldwin@lemmy.world · +13/-4 · 9 months ago

      I think you’re missing the point, at least as far as I understand it.

      Child predators experienced some kind of trauma, and as a result they never developed. That could be external trauma (abuse) or internal trauma (thoughts, mental illness) but as their body “grew up” and they began developing sexual urges, they never matured.

      Think about your first crush. They were your age, probably, unless you had a crush on Jennifer Connelly like every other millennial boy. As you grew up your crushes were probably always within a few years of you. It’s just how it works.

      In “minor attracted people” (I hate that term, but it works for criminals AND non-criminals so it’s valid) the attraction just doesn’t get updated.

      Humans are REALLY BAD at controlling our impulses. Especially impulses that are taboo. Especially biological impulses (eating, sex, learning, games, etc).

      So these people go into fields where they can be around kids. And then an “opportunity” arises and they either can’t fight their criminal impulse or they rationalize their criminal behavior.

      And boom. Traumatized kids and teachers in jail.

      We need to get more people therapy.

  • scarabic@lemmy.world · +14/-1 · 9 months ago

    I imagine there’s some kind of vampiric quality to it. Kids are full of youth and innocence: things we are all constantly losing to time. Especially if someone’s own childhood was robbed from them, I think they will carry around a void they desperately want to fill but never can, because of course abusing a child doesn’t bring these things back to you. Still, many child abuse victims go on to abuse other children later in life, and this may be their drive: to seek the thing they lost. It’s beyond sad. Abusing children is straight up disgusting and terrible, but the convoluted desperation that causes people to do it is truly horrifying in a stranger-than-any-fiction kind of way.

  • Apepollo11@lemmy.world · +12 · edited · 9 months ago

    I’m only going to tackle the tech side of this…

    How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

    Easy. The most popular apps all filter for keywords, and I know that at least some then check the output against certain blacklisted criteria to make sure it hasn’t let something slip through.

    But…

    Anyone can host their own version and disable these features, allowing them to generate whatever they want, in exactly the same way that anyone can write their own story containing whatever they want. All you need is the determination to do it, and some modicum of ability.

    People were creating dodgy doctored photos long before computers. When Photoshop came out it became easier, and with AI it’s easier still. The current laws about creating and distributing indecent images still apply to these new images, though.

    • Adalast@lemmy.world · +2 · 9 months ago

      Technically the diffusers all have the ability to filter material from the actual outputs, using a secondary CLIP analysis to see whether it kicks out any keywords indicating that a topic is present in the image. From what I have seen, most AI generation sites use this method, as it is more reliable for picking up on naughty outputs than prompt analysis. AIs are horny; I play with them a lot. All you have to do is generate a woman on the beach and about 20% of the results will be at least topless. Now, “woman on the beach” should not be flagged as inappropriate, and I don’t believe the outputs should be either, because our demonization of the female nipple is an asinine holdover from a bunch of religious outcasts from Europe who were chased out for being TOO restrictive and prudish, but alas, we are stuck with it.

        • Adalast@lemmy.world · +1 · 9 months ago

          You are correct, CLIP can misinterpret things, which is where human intelligence comes in. Having CLIP compute probabilities for the terms that describe what you are looking for, then applying a bit of heuristics, can go a long way. You don’t need to train it to recognize a nude child, because it has been trained to recognize a child and it has been trained to recognize nudity, so if an image scores high on both “nude” and “child”, just throw it out. Granted, it might be a picture of a woman breastfeeding while a toddler looks on, which is inherently not child pornography, but unless that is the specific image being prompted for, it is not that big of a deal to just toss it. We understand the conceptual linking, so we can set the threshold parameters and adjust as needed.
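
          To make that concrete, here is a rough sketch of what such a check could look like (the model name, label wording, and the 0.7 threshold are placeholder assumptions, not anything a real service publishes), using off-the-shelf CLIP zero-shot scoring:

          ```python
          # Sketch of the "score the output, then apply a heuristic" idea described above.
          # Model choice, label wording, and the threshold are placeholder assumptions.
          from PIL import Image
          from transformers import CLIPModel, CLIPProcessor

          model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
          processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

          def concept_score(image: Image.Image, concept: str, negation: str) -> float:
              """CLIP zero-shot probability of `concept` versus its negation."""
              inputs = processor(text=[concept, negation], images=image,
                                 return_tensors="pt", padding=True)
              probs = model(**inputs).logits_per_image.softmax(dim=1)[0]
              return probs[0].item()

          def reject_output(image: Image.Image, threshold: float = 0.7) -> bool:
              """Toss the image when 'nude' and 'child' both score high (deliberately conservative)."""
              nude = concept_score(image, "a photo of a nude person", "a photo of a fully clothed person")
              child = concept_score(image, "a photo of a child", "a photo of an adult")
              return nude > threshold and child > threshold
          ```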

          As for the companies, it is a tough area. The argument that a company that produced a piece of software is culpable for the misuse of said software is a very tenuous one. There have been attempts to make gun manufacturers liable for gun deaths (especially handguns, since they really only have the purpose of killing humans). That one I can see, as a firearm killing a person is not a “misuse”; indeed, it is the express purpose of its creation. But this would be more akin to holding Adobe liable for the child pornography that is edited in Lightroom, or Dropbox liable for someone using the Dropbox API to set up a private distribution network for illicit materials. In reality, as long as the company did not design a product with the illegal activity expressly in mind, it really shouldn’t be culpable for how people use it once it is in the wild.

          I do feel like more needs to be done to make public the training data for public inspection, as well as forensic methods for interrogating the end products to figure out if they are lying and hiding materials that were used for training. That is just a general issue though that covers many of the ethical and legal issues surrounding AI training.

  • Wugmeister@lemmy.dbzer0.com · +11 · 9 months ago

    There are two parts to this problem.

    For kids who haven’t hit puberty, there is a diagnosable pedophilia disorder. This is mostly genetics. (I’m pretty sure I’ve met an alpaca that was a pedophile once.) The molester’s brain is wired wrong. Nothing to do about that. IMHO, they deserve pity as long as they keep their hands off the children.

    For teenagers, the attraction is the power dynamic. Teens have a rather distorted view on what is attractive, and they tend to be naive and easily manipulated. On top of this, almost all teenagers have next to no impulse control, and many will make very very bad decisions (even knowing that the decision is bad) if doing so might result in some form of dopamine hit via sex/adrenaline rush/video games/peer approval/etc. Adults that seek out teenagers for sexual relationships are bad people who chose to be a groomer. There is no genetic component to being a groomer, and they don’t deserve pity.

    Btw, I can flesh out my claim about the alpaca if you want, but it will have to have a tw for adorable fluffy animals suffering a horrifically slow and painful death.

    • Adalast@lemmy.world · +5 · 9 months ago

      Info link: https://pubmed.ncbi.nlm.nih.gov/18686026/

      The DSM-V specifies 2 types of pedophilia, pedophilic (victim age <11) and hebephilic (victim ages 11-14). What you are describing with the grooming is generally not pedophilia, because “children” older than 15 are generally considered post-pubescent and thus anatomically adults. Their frontal lobes still have a LOT of time needed to cook to completion, but they have those impulse-control issues for a reason, from an evolutionary standpoint. Yes, in modern society, an “adult” who takes advantage of the still-developing prefrontal cortex of a post-pubescent adolescent is a shit human being who doesn’t deserve to be a member of society, but they are technically not pedophiles, at least not clinically. Legally is a different story, but that is not a pertinent area of discussion right now.

      Pedophilic and hebephilic individuals generally do not ever take their impulses to the realm of reality. Most of them actually end up feeling so much shame and remorse over even having the thoughts that they commit suicide. They definitely deserve pity and treatment, not stigmatization and ostracization.

      As to the OP asking about AI art that depicts underage individuals in states of undress or sexual situations: ALL depictions of underage individuals in those contexts are illegal. By the letter of the law, if you draw stick figures on a piece of paper having sex, then label them as children, you have created child pornography. No depiction is legal, no matter the medium. AI-generated, hand-drawn, sculpted, watercolors, photos: under the law in (I believe) every state, they are all identical. Personally, I believe this is asinine and 100% indicates that the purpose of these laws is to adjudicate morality, not to “protect the children” as all of the people who push for them claim, but that is just my opinion. Hand-drawn artwork that has no photographic source material and does not depict real people has virtually zero chance of having caused harm to any children, and the AI just knows what the keywords mean in the context of reversing the vaporization of an image. They weren’t trained on kiddy porn; they were trained on pictures of children, and on pictures of adults doing their porny thing, so they are able to synthesize the two concepts together.

  • Candelestine@lemmy.world · +15/-5 · 9 months ago

    I don’t know many of the answers to these questions, I’m no social scientist or doctor. On the tech side of things, this is all very new and we are still coming to grips with it.

    I feel pretty comfortable fielding this one though: There should be no exceptions granted for AI generated pornographic content, and a person’s facsimile should have the same protections as actual photos of them. I do not think many people will find this controversial among the general public, as it would pretty clearly serve to protect all of us from having our identities used against our will.

    I expect that even our congress should be able to get at least something on the books in this direction. Eventually. Maybe. Or at least the FCC or something.

    • Delphia@lemmy.world · +26 · 9 months ago

      Let me preface this by saying I DO NOT SUPPORT CSAM!

      The only issue I take with AI-generated images is that there’s no true “age” to the picture, and any legislation that would allow people to be jailed or charged would have to be worded very carefully.

      Something like “depicting clearly underage subject matter if it were a person, or using prompts to generate someone who clearly appears underage” is tricky, simply because someone could be marked for life just for typing in “naked elf”, having the program spit out something with small boobs and childlike features, and not shredding their HD immediately.

        • Delphia@lemmy.world · +18 · 9 months ago

          That’s exactly my point. Sure, the courts may rule in your favor eventually, but you just got marched out of work in handcuffs for possession of CSAM, your entire personal and professional circle knows, and any explanation you offer is going to sound like total bullshit.

          “It was an AI generated image, and it was an elf! She just looked young, but not like illegal young! Guys you have to believe me!”

      • FuglyDuck@lemmy.world · +2/-4 · 9 months ago

        “Depicting clearly underage subject matter if it were a person or using prompts to generate someone who clearly appears under aged” simply because someone could be marked for life for typing in “naked elf” and the program spits out something with small boobs and childlike features and not having their HD shredded immediately.

        Has that ever happened though? I don’t think it happens as much as people imagine it does. This is an issue with any CP, not just ai generated stuff.

          • FuglyDuck@lemmy.world · +2 · 9 months ago

            So… 1) They were obviously childlike.

            However, while the cartoon characters were elves and pixies, they were also clearly young elves and pixies, which led to concerns the images were linked to child sexual abuse.

            2) He didn’t just view it; he downloaded it and kept it in a spank bank for 3 years.

            Ronald Clark downloaded the Japanese anime cartoons three years ago, setting in train events that would see him in court in Auckland and jailed for three months for possessing objectionable material

            3) He had prior convictions for sexually assaulting a 12-year-old boy, and I’m guessing there were parole agreements to not have CP material.

            Clark has previous convictions for indecently assaulting a teenage boy and has been through rehabilitation programmes, but the video nasties he was watching in this case were all cartoons and drawings.

            4) It was for the artistic merit! Uh huh. I watch porn for the story too! /s

            Clark admitted he was interested in the images but he said it was for their artistic merit and as “a bit of a laugh”. He did not find them sexually arousing, he said.

            I think it’s safe to say he didn’t get convicted for simply viewing a search result. So I stand by what I said: it doesn’t happen as often as people think it does. Even if you come down on one side… that’s one instance in a global world.

      • Candelestine@lemmy.world · +5/-10 · 9 months ago

        It’s not Congress’s job to make perfect laws; that’d be even less efficient than the system we have now. The interpretation and “fine-tuning” of law is the job of the courts, and it is literally handled case by case, in the hands of judges and literal armies of lawyers, every year.

        • Delphia@lemmy.world · +15 · 9 months ago

          Ok, when someone gets arrested for possession of CSAM because someone decided those pixels looked a little young, their entire life comes crashing down around their ears; the arrest makes the papers and it’s forever out there. I’m sure they won’t mind at all, because after they financially ruined themselves to beat the charges in court, they were found innocent.

          Or

          The law gets written well by people that we the people pay to do their job and write the laws.

          • Candelestine@lemmy.world · +8/-2 · 9 months ago

            Except everyone handwaves away the whole “written well” or “carefully written” part, because there’s no actual good place to draw that line. Which is why neither you nor anyone else can think of one. It will always eventually be up to some judge’s or jury’s subjective interpretation, regardless of what line you gave them to work with.

            So they’re just going to ban it all. They do not consider digital art collections to be of very great importance; they’re more interested in things that involve lots and lots of money, generally. Or things that can be used to stoke fear, so they can come in, get a legislative notch on their belt, and “save the day” with some half-cocked fix.

            • CeruleanRuin@lemmings.world · +2/-1 · 9 months ago

              Just to get it straight, by “digital art collections” are you talking about that folder of anime girls on your computer?

              • Candelestine@lemmy.world · +1 · 9 months ago

                No, when I say they don’t care about digital art collections, I really mean all digital art. They won’t ban all digital art, of course; they’re just going to make no exceptions for AI-generated artwork when considering what is or is not banned. Similarly, using someone’s facsimile, which is not currently illegal AFAIK, should be.

    • FuglyDuck@lemmy.world · +9/-7 · 9 months ago

      Even entirely fake/generated characters.

      CSAM is one of the tools groomers use to groom kids. “See this is normal. Here look at this. Doesn’t that look like fun?”

      There are no easy answers here. The issue is complex and extremely difficult, and I don’t think anyone has it figured out.

  • Ganbat@lemmyonline.com · +8 · edited · 9 months ago

    Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    That’s where things get difficult. An episode of Law & Order: SVU tried to tackle this question a long time ago (but with Photoshopped fake CSAM) and the answer was a resounding “I dunno.”

    On the one hand, it’s disgusting, deplorable, etc. On the other, a fake image means no one was victimized for it.

    Does the content further radicalize these people, creating further risk of them victimizing a child, or does it sate their desires, helping to prevent them from victimizing a child? These questions are incredibly difficult to actually answer, and no answer can ever really be definitive, as you can’t really predict how any one person might react.

  • Dame @lemmy.ml · +18/-13 · 9 months ago

    AI CSAM should absolutely be treated as such; the model has been trained on images of real human children. I’m not sure where the issue comes from; I would imagine power. I’d need to check peer-reviewed work from those in the field, but I honestly can’t stomach it.

    • surewhynotlem@lemmy.world · +21/-2 · 9 months ago

      What about an artist just drawing it? Is that ok?

      Or no, because the artist has seen children before?

      • Maeve@kbin.social · +4 · 9 months ago

        It used to be nothing for parents to take pictures of their kids playing in the bath. Parents have been convicted and lost their children for it, though.

      • Axxys@lemmy.world · +9/-10 · 9 months ago

        It’s not OK to make CSAM.

        The origin of CSAM does not make it acceptable.

          • Vedlt@lemmy.world · +9/-4 · 9 months ago

            I am not an expert in any field relating to any of this by any means, but we can all agree that CSAM is unequivocally reprehensible. Thus, many people will have severe issues with anything that normalizes it even remotely. That would be my knee-jerk response, anyway.

            • CaptainEffort@sh.itjust.works · +12/-1 · edited · 9 months ago

              Well maybe we shouldn’t base our decisions on knee jerk responses.

              Imo if nobody’s being hurt then it’s none of our business. If it helps these people to deal with their urges without actually hurting anyone then I think that’s unquestionably a good thing.

              • Slowy@lemmy.world · +5/-4 · 9 months ago

                If it is in fact helping them, yes. It would be ideal to do a study of how it affects their self-control before going in that direction, though, I think, as some argue it would do the opposite.

                • CaptainEffort@sh.itjust.works · +10 · 9 months ago

                  If it is in fact helping them, yes

                  Okay so… we agree?

                  And yes, some would argue the opposite. But I don’t think we should be creating laws without any actual proof one way or the other.

                • CeruleanRuin@lemmings.world · +2/-10 · edited · 9 months ago

                  It almost certainly “helps” as many of these people as it encourages. The hedonistic effect is a phenomenon common to all humans, where a person indulging heavily in something that makes them feel good needs more and more extreme examples of it to maintain the baseline of satisfaction from it. Any harmful compulsion when indulged will fall victim to this effect.

                  Providing virtual explicit images of children might mollify some, but it will have an inflaming effect on just as many others, who will seek out increasingly realistic or visceral imagery, up to and including looking for real photos and/or exploiting real children. That in turn ensures a market for child exploitation.

                  So no, it’s not harmless. Not remotely.

        • surewhynotlem@lemmy.world · +7 · 9 months ago

          Yes, but it’s wrong for very different reasons and severities. Murder vs murder porn, if you will. Both are bad and gross, but different, and that matters.

          But that’s irrelevant to my question, which no one actually answered.

          I am curious about people’s take on the difference between human creativity from memory vs AI “creativity” from training. The porn aspect is only relevant in that it’s an edge case that makes the debate meaningful.

          There are laws today that you can’t copyright AI art, but we can copyright art that’s based on a person’s combined experiences. That seems arbitrary to me, and I’m trying to understand better.

          • AmidFuror@kbin.social · +2/-3 · 9 months ago

            There are also pedos pretending to be against AI generated child porn to cover their tracks.

      • Dame @lemmy.ml · +1/-2 · 9 months ago

        If the artist is drawing naked children and it isn’t for the sake of a book or something of a similar nature, there is a problem. This is also a disingenuous comparison: an artist hasn’t been trained on hundreds to millions of children’s images and then fine-tuned. There’s a lot of illegal content these models come across, which is then hopefully tuned out by human hands. So try another example.