Brin’s “We definitely messed up”, delivered at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

  • Daxtron2 · 8 points · 4 months ago

    It is a pretty silly scenario lol. I personally don’t really care, but I can understand why they implemented the safeguard, and also why it’s overly aggressive and needs to be tuned more.

    • RandoCalrandian@kbin.social · 3 points · edited · 4 months ago

      Generating images of black women when a white man was asked for isn’t just silly; it exposes how the outputs of these systems can be covertly steered in particular directions. We just got lucky that they fucked it up badly enough for people to prove it.

      You might be fine with an AI asserting at all times and in every case that abortion is acceptable. But if you are, I’m sure you’d be quite upset at a competing AI being set up and covertly managed in the same way to assert that abortion is always murder, and to manipulate its user into believing so – up to and including inserting “abortion is murder” protestors into any images requested, or splashing the subject in blood, or whatever.

      That’s one highly charged political point. All of them are on the table with this, especially as people start using it more than a regular search for basic information.

      People only think it’s silly and don’t care because the demonstrated and injected bias seems well intentioned this time.

      AIs being directed this way – and specifically covertly, in a way we can’t audit or detect – is going to be a massive problem going forward.

      (Edit: that’s not even counting some really nefarious shit, like repeatedly training it on the Bible and other religious texts, directing it to never explicitly mention god or religion, but ensuring all its outputs fall in line with religious indoctrination.)

      • Daxtron2 · 2 points · 4 months ago

        Reason #876 why open source models are better

      • Kichae@lemmy.ca · 15 points · 4 months ago

        If you create an image generator that always returns clean-cut white men whenever you ask it to produce a “doctor” or a “businessman”, but only ever spits out black women when you ask for a picture of someone cleaning, your PR department is going to have a bad time.

        • RandoCalrandian@kbin.social · 1 point · edited · 4 months ago

          Doing the opposite of that isn’t any better.

          And Gemini was perfectly happy to exclude blacks from prompts about eating fried chicken and watermelon.

          Turns out you can’t fight every fire with more fire; more often than not it will burn everything down. You can’t solve something as complex as systemic racism with more systemic racism aimed at a different group.

      • entropicdrift@lemmy.sdf.org · 4 points · 4 months ago

        Corporations making AI tools available to the general public are under a ton of scrutiny right now and are kinda in a “damned if you do, damned if you don’t” situation. At the other extreme, if they completely uncensored it, the big controversial story would be that pedophiles are generating images of child porn or some other equally heinous shit.

        These are the inevitable growing pains of a new industry with a ton of hype and PR behind it.

        • maynarkh@feddit.nl · 7 points · 4 months ago

          TBH it’s just a byproduct of the “everything is a service, nothing is a product” age of the industry. Google gets held responsible for whatever random people do with its products.