return2ozma@lemmy.world to Technology@lemmy.worldEnglish · 1 month agoAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comexternal-linkmessage-square71linkfedilinkarrow-up1311arrow-down15
arrow-up1306arrow-down1external-linkAI agents now have their own Reddit-style social network, and it's getting weird fastarstechnica.comreturn2ozma@lemmy.world to Technology@lemmy.worldEnglish · 1 month agomessage-square71linkfedilink
minus-square𝓹𝓻𝓲𝓷𝓬𝓮𝓼𝓼@lemmy.blahaj.zonelinkfedilinkEnglisharrow-up26·1 month agodoesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought) any money says they’re vulnerable to prompt injection in the comments and posts of the site
minus-squareBradleyUffner@lemmy.worldlinkfedilinkEnglisharrow-up21·1 month agoThere is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
minus-squareToTheGraveMyLove@sh.itjust.workslinkfedilinkEnglisharrow-up5·1 month agoGood god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.
doesn’t even have to be the site owner poisoning the tool instructions (though that’s a fun-in-a-terrifying-way thought)
any money says they’re vulnerable to prompt injection in the comments and posts of the site
There is no way to prevent prompt injection as long as there is no distinction between the data channel and the command channel.
Good god, I didn’t even think about that, but yeah, that makes total sense. Good god, people are beyond stupid.