Blursed Bot — @LainTrain@lemmy.dbzer0.com to memes@lemmy.world • 11 months ago (73 comments)
@nondescripthandle@lemmy.dbzer0.com • 11 months ago (edited)
Input sanitization has been a thing for as long as SQL injection attacks have. It just gets more involved for LLMs, depending on how much you're trying to stop them from outputting.
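For context on the comparison being drawn: the classic fix for SQL injection isn't really "sanitizing" the input so much as parameterizing the query so user input can never be parsed as SQL. A minimal sketch (using Python's stdlib `sqlite3`; the table and payload here are illustrative):

```python
import sqlite3

# In-memory database with one row for the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

malicious = "nobody' OR '1'='1"

# Naive string interpolation: the attacker's quote closes the string
# literal, and the injected OR clause matches every row.
unsafe = conn.execute(
    f"SELECT secret FROM users WHERE name = '{malicious}'"
).fetchall()

# Parameterized query: the driver binds the whole input as a single
# value, so the payload is just a weird username that matches nothing.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()

print(unsafe)  # leaks the secret
print(safe)    # empty result
```

The point of the thread is that there's no equivalent clean boundary for an LLM: the "query" and the "data" are both natural language in the same prompt, which is why prompt-injection filtering gets more intensive the more behavior you try to suppress.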
@InAbsentia@lemmy.world • 11 months ago
I won't reiterate the other reply, but I'll add that sanitizing the input removes the thing they're aiming for: a human-like response.