shish_mish@lemmy.world to Technology@lemmy.world, English · 4 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries (www.tomshardware.com)
299 upvotes · 25 comments
Pope-King Joe@lemmy.world, English · 4 months ago · 9 upvotes
Drugs. Mostly. Probably.