User shocked to find chats naming unpublished research papers and other private data.

  • @pedestrian@links.hackliberty.org
    5 months ago

    ChatGPT user Chase Whiteside noticed that his account history contained private conversations that were not his own, including login credentials and details from a pharmacy employee troubleshooting an application. OpenAI investigated and believes Whiteside’s account was compromised by an external group with access to a pool of identities. The incident underscores ChatGPT’s lack of basic security features such as two-factor authentication. Previous incidents have also shown that ChatGPT can divulge private information that was included in its training data. One striking detail was the candid language the pharmacy employee used to vent about the poor security of the application they were troubleshooting, which highlights the risk of including private details in conversations with AI systems.

    • @kautau@lemmy.world
      5 months ago

      This reads like “hey chatGPT, write a fictional paragraph about how Chase Whiteside’s OpenAI account was breached to glean private pharmaceutical data from ChatGPT”

    • @TechLich@lemmy.world
      5 months ago

      They’re not files, it’s just leaking other people’s conversations through a history bug. Accidentally putting person A’s “can you help me write my research paper/IT ticket/script” conversation into person B’s chat history.

      Super shitty but not an uncommon kind of bug. Often either a nasty caching issue or screwing up identities for people sharing IPs or similar.

      It’s bad but it’s “some programmer makes understandable mistake” bad not “evil company steals private information without consent and sends it to others for profit” kind of bad.
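      The IP-keyed identity mix-up described above can be sketched roughly like this (purely illustrative and not OpenAI’s actual code; the function names, cache layout, and `fetch` helper are all invented for the example):

```python
# Hypothetical sketch of a history-leak caching bug: a chat-history
# cache keyed by client IP instead of account ID, so two users behind
# the same NAT collide and see each other's conversations.

history_cache_by_ip = {}       # BUG: cache key is the client IP
history_cache_by_account = {}  # FIX: cache key is the account ID

def fetch(account_id):
    # Stand-in for a real database query for a user's conversations.
    return [f"{account_id}'s private conversation"]

def get_history_buggy(account_id, client_ip, fetch):
    # BUG: keyed by IP, so users sharing an IP get each other's data.
    if client_ip not in history_cache_by_ip:
        history_cache_by_ip[client_ip] = fetch(account_id)
    return history_cache_by_ip[client_ip]

def get_history_fixed(account_id, client_ip, fetch):
    # FIX: key the cache by the identity that actually owns the data.
    if account_id not in history_cache_by_account:
        history_cache_by_account[account_id] = fetch(account_id)
    return history_cache_by_account[account_id]

# Two users behind the same NAT share one public IP:
a = get_history_buggy("alice", "203.0.113.7", fetch)
b = get_history_buggy("bob", "203.0.113.7", fetch)
# b now contains alice's history -- exactly the kind of leak described.
```

      The same failure mode shows up with misconfigured HTTP caches that ignore the session when building the cache key; the fix is always to key cached data by the owning identity, never by a shared network attribute.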

      • @JohnnyCanuck@lemmy.ca
        5 months ago

        The updated article (and title) says OpenAI is claiming it’s not a bug (as you described); instead, the user’s account was compromised and someone else was using it to have those chats.

        • snooggums
          5 months ago

          While that is technically possible, I don’t believe ChatGPT.

    • @Daxtron2
      5 months ago

      Literally read the article

        • @Daxtron2
          5 months ago

          Now son, you know your education is important and I just want what’s best for you.

            • @Daxtron2
              5 months ago

              Well that’s because you’re always on that damn Gameboy playing your pokemans, now go do your homework or I’m deleting your pichachu