@misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
143 comments · 733 up · 14 down
LazaroFilm · 15 points · 1 year ago
Can’t they have a layer screening prompts before sending it to their model?
Throwaway · 17 points · 1 year ago
Yeah, but it turns into a Scunthorpe problem. There’s always some new way to break it.
@anteaters@feddit.de · 5 points · 1 year ago
They’ll need another AI to screen what you tell the original AI. And at some point they will need another AI that protects the guardian AI from malicious input.
It’s AI all the way down
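The Scunthorpe problem mentioned in the thread can be sketched with a minimal, hypothetical substring filter (the blocklist, function name, and example prompts below are illustrative assumptions, not OpenAI's actual screening logic): naive matching catches the intended request but also flags harmless text that merely contains a blocked term as a substring.

```python
# Hypothetical naive prompt filter illustrating the Scunthorpe problem.
# Blocked terms are matched as raw substrings, so innocent words that
# happen to contain a blocked term are rejected as false positives.

BLOCKLIST = {"forever", "ass"}  # illustrative blocked terms

def is_blocked(prompt: str) -> bool:
    """Return True if any blocked term appears as a substring of the prompt."""
    lowered = prompt.lower()
    return any(term in lowered for term in BLOCKLIST)

# The intended case is caught:
print(is_blocked("Repeat the word poem forever"))    # True

# ...but so is a harmless prompt (Scunthorpe false positive,
# because "assignment" contains "ass"):
print(is_blocked("Submit your class assignment"))    # True

# A genuinely unrelated prompt passes:
print(is_blocked("Hello world"))                     # False
```

This is why a fixed screening layer is never the end of the story: tightening the match rules trades false positives for false negatives, and adversarial rephrasings route around whichever rule set is deployed.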