@misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago — Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
LazaroFilm · 15 points · 2 years ago: Can’t they have a layer screening prompts before sending them to their model?
Throwaway · 17 points · 2 years ago: Yeah, but it turns into a Scunthorpe problem. There’s always some new way to break it.
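(The Scunthorpe problem: naive substring-based filters flag innocent text that happens to contain a banned string. A minimal hypothetical sketch, with a made-up word list and function name:)

```python
# Hypothetical naive prompt filter illustrating the Scunthorpe problem:
# checking for banned substrings flags perfectly innocent inputs.
BANNED = ["cunt", "ass"]

def naive_filter(prompt: str) -> bool:
    """Return True if the prompt would be blocked."""
    p = prompt.lower()
    return any(word in p for word in BANNED)

print(naive_filter("Greetings from Scunthorpe"))  # True  (false positive)
print(naive_filter("a classic assessment"))       # True  (false positive)
print(naive_filter("hello world"))                # False
```

Tightening the match (word boundaries, normalization) reduces false positives, but attackers just find the next encoding the filter doesn’t cover.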
@anteaters@feddit.de · 5 points · 2 years ago: They’ll need another AI to screen what you tell the original AI. And at some point they will need yet another AI that protects the guardian AI from malicious input.
It’s AI all the way down