Kid@sh.itjust.works to Cybersecurity@sh.itjust.works · English · 11 months ago
Time Bandit ChatGPT jailbreak bypasses safeguards on sensitive topics
www.bleepingcomputer.com
andrewth09@lemmy.world · 11 months ago
I don't understand why the researcher needed to contact the FBI to report this; just drop it in BugCrowd and call it a day. It's a ChatGPT jailbreak, not a Debian zero-day.