return2ozma@lemmy.world to Technology@lemmy.worldEnglish · 5 days agoAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comexternal-linkmessage-square40fedilinkarrow-up1249arrow-down15
arrow-up1244arrow-down1external-linkAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comreturn2ozma@lemmy.world to Technology@lemmy.worldEnglish · 5 days agomessage-square40fedilink
minus-square𝓹𝓻𝓲𝓷𝓬𝓮𝓼𝓼@lemmy.blahaj.zonelinkfedilinkEnglisharrow-up24·5 days agodoesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought) any money says they’re vulnerable to prompt injection in the comments and posts of the site
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up19·5 days agoThere is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
minus-squareToTheGraveMyLove@sh.itjust.workslinkfedilinkEnglisharrow-up4·4 days agoGood god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.
doesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought)
any money says they’re vulnerable to prompt injection in the comments and posts of the site
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
Good god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.