‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • TORFdot0@lemmy.world · 11 months ago

    It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual whom they share that with. And “nudify” AI models go beyond casual, consensual nudity into sexual objectification and harassment if used without consent.

    • KairuByte@lemmy.dbzer0.com · 11 months ago

      I want to point out one slight flaw in your argument: nudity isn’t needed for people to sexually objectify you. And even if it were, most people can strip you down in their heads no problem.

      There’s huge potential for harassment, though, and I think that should be the main concern.

      • TimewornTraveler@lemm.ee · 11 months ago

        First, the relevant xkcd: https://xkcd.com/1432/

        Second,

        “Nudity isn’t needed for people to sexually objectify you.”

        Do you really think that makes it less bad? That it’s opt-in?

        “And even if it were, most people can strip you down in their heads no problem.”

        Apparently this app helps them with that, too.