@misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
265 comments
@Mahlzeit@feddit.de · English · 2 years ago
Oh, I see. The attempts to extract training data from ChatGPT may be criminal under the CFAA. Not a happy thought. I did say “making available” to exclude “hacking”.
JackbyDev · English · 2 years ago
The point I’m illustrating is that the law can call plenty of things “hacking” that reasonable people would assume are fine.