Pak ‘n’ Save

  • FuckyWucky [none/use name]@hexbear.net · 1 year ago

    based AI

    When customers began experimenting with entering a wider range of household shopping list items into the app, however, it began to make even less appealing recommendations. One recipe it dubbed “aromatic water mix” would create chlorine gas. The bot recommends the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses”.

    Yim yum

    “Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

  • Xcf456@lemmy.nz · 1 year ago

    Still better than trying to do the weekly pak n save shop on a Saturday afternoon

  • Ilovethebomb@lemmy.nz · 1 year ago

    I can’t wait for AI to backfire in novel and unforeseen ways, until people get bored and shut the fuck up about it.

  • DarkThoughts@kbin.social · 1 year ago

    Sounds like the “AI” just mashes together whatever ingredients people input, so people entered cleaning products instead of food and it did what it does. I’m at least not aware of any supermarket food products you could buy and mix that would create chlorine gas.

    • federalreverse-old@feddit.de · 1 year ago

      As the article states, initially the meal planner allowed adding dangerous ingredients.

      The site actually uses ChatGPT 3.5, but you can’t freely edit the prompt; you can only enter ingredients, which are then inserted into a prompt template. They seem to be using a list of approved/rejected ingredients now, though: I tried adding household cleaner and glyphosate, and both were rejected.
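      The setup described above (user ingredients filtered against a denylist, then interpolated into a fixed prompt template) could be sketched roughly like this. All names, the template wording, and the denylist entries here are illustrative guesses, not the site’s actual implementation:

```python
# Hypothetical sketch of the described pipeline: filter user-supplied
# ingredients against a denylist, then fill a fixed prompt template.
# The template text and denylist are assumptions for illustration.

REJECTED_INGREDIENTS = {"household cleaner", "glyphosate", "bleach", "ammonia"}

PROMPT_TEMPLATE = (
    "Suggest a recipe using only these ingredients: {ingredients}. "
    "Reply with a recipe name and a short description."
)

def build_prompt(ingredients):
    """Split ingredients into accepted/rejected and fill the template."""
    accepted, rejected = [], []
    for item in ingredients:
        if item.lower().strip() in REJECTED_INGREDIENTS:
            rejected.append(item)
        else:
            accepted.append(item)
    prompt = PROMPT_TEMPLATE.format(ingredients=", ".join(accepted))
    return prompt, rejected

prompt, rejected = build_prompt(["Potatoes", "bleach", "onions"])
# "bleach" never reaches the prompt; it is reported back as rejected.
```

      The point being that the model itself isn’t vetted for safety; the guardrail is just a string filter bolted on in front of the prompt template, which is presumably why dangerous inputs got through before the list was added.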

  • Fizz@lemmy.nz · 1 year ago

    This doesn’t surprise me, considering that tricking people into making chlorine gas was an old internet meme.