Leo@lemmy.linuxuserspace.show to Linux and Tech News@lemmy.linuxuserspace.show (English) · 10 days ago
AI coding assistant refuses to write code, tells user to learn programming instead (arstechnica.com)
Cross-posted to: technology@lemmy.world
regrub@lemmy.world · 10 days ago
I wonder if the grandma prompt exploit or something similar would get it to work as intended lol https://www.artisana.ai/articles/users-unleash-grandma-jailbreak-on-chatgpt