cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

  • @laughterlaughter@lemmy.world
    1 month ago

    I mean… regardless of your moral point of view, you should be able to answer that yourself. Here’s an analogy: suppose I draw a picture of a man murdering a dog. It’s an animal abuse image, even though no actual animal abuse took place.

    • @Daxtron2
      30 days ago

      It's not though, it's just a drawing.

      • @laughterlaughter@lemmy.world
        30 days ago

        Except that it is an animal abuse image, drawing, painting, fiddle, whatever you want to call it. It’s still the depiction of animal abuse.

        Same with child abuse, rape, torture, killing or beating.

        Now, I know what you mean by your question. You're trying to establish that the image/drawing/painting/scribble is harmless because no actual living being suffered. But that doesn't mean it doesn't depict suffering.

        Again, I’m seeing this from a very practical point of view. However you see these images through the lens of your own morals or points of view, that’s a totally different thing.

        • @Daxtron2
          30 days ago

          And when characters are killed on screen in movies, are those snuff films?

          • @laughterlaughter@lemmy.world
            30 days ago

            No, they’re violent films.

            Snuff is a different thing, because it's supposed to be real. Snuff films depict violence in a very real sense, so they're violent. Fiction films also depict violence, and so they're violent too. It's just that they're not about real violence.

            I guess what you’re really trying to say is that “Generated abuse images are not real abuse images.” I agree with that.

            But at face value, “Generated abuse images are not abuse images” is incorrect.