ugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square294fedilinkarrow-up11.02K
arrow-up11.02Kexternal-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agomessage-square294fedilink
minus-squareBakedCatboy@lemmy.mllinkfedilinkEnglisharrow-up57·edit-27 months agoApparently it’s not very hard to negate the system prompt…
Apparently it’s not very hard to negate the system prompt…
deleted by creator