ChatGPT is leaking private conversations that include login credentials and other personal details of unrelated users, screenshots submitted by an Ars reader on Monday indicated.

Two of the seven screenshots the reader submitted stood out in particular. Both contained multiple pairs of usernames and passwords that appeared to be connected to a support system used by employees of a pharmacy prescription drug portal. An employee using the AI chatbot seemed to be troubleshooting problems they encountered while using the portal.

    • ShortN0te@lemmy.ml · 3 points · 5 months ago

      Every website/IT/whatever says since the beginning to not give out your login credentials to anyone.

  • boomer478@lemmy.ml · 15 points · 5 months ago (edited)

    I’m sorry, but if you’re stupid enough to give ChatGPT your passwords, you deserve every bad thing that happens because of that.

    This is not a ChatGPT problem; it’s a PEBKAC one.

  • Bloody Harry@feddit.de · 9 points · 5 months ago

    This will become a problem in the future. People will start putting highly sensitive and confidential information into ChatGPT and the like, and of course that data will be used. Industrial espionage might get as easy as asking a common LLM for help with a specific problem.

    • Lmaydev@programming.dev · 2 points · 5 months ago

      That’d be amazing if it could take all the data that’s fed to it and readily produce solutions like that.

      What a time to be alive.

  • Daxtron2@startrek.website · 4 points · 5 months ago

    Because no one reads the article:

    “From what we discovered, we consider it an account take over in that it’s consistent with activity we see where someone is contributing to a ‘pool’ of identities that an external community or proxy server uses to distribute free access,” the representative wrote. “The investigation observed that conversations were created recently from Sri Lanka. These conversations are in the same time frame as successful logins from Sri Lanka.”

    In other words: a compromised account was being used as a free-access endpoint for GPT.

  • clever_banana@lemmy.today · 3 points · 5 months ago

    I’ve been using ChatGPT as a poor man’s psychological analyst.

    Does this mean my conversations about my deepest fears are not safe?