- cross-posted to:
- technology@beehaw.org
- kemper_loves_you@lemmy.dbzer0.com
cross-posted from: https://lemmy.dbzer0.com/post/43566349
This is a rather terrifying take. Particularly when combined with the earlier passage about the man who claimed that “AI helped him recover a repressed memory of a babysitter trying to drown him as a toddler.” Therapists have to be very careful because human memory is very plastic. It’s very easy to alter a memory; in fact, every time you recall something, you alter it a little bit. Under questioning by an authority figure, such as a therapist, or a police officer if you were a witness to a crime, these alterations can be dramatic. This was a really big problem in the '80s and '90s.
The idea that chatbots are not only capable of this, but are currently manipulating people into believing they have recovered repressed memories of brutalization, is at least as terrifying to me as them convincing people that they are holy prophets.
Edited for clarity
GPT4o was a little too supportive… I think they took it down already
Yikes!
4o, in its current version, is a fucking sycophant. For me, it’s annoying. For the person in that screenshot, it’s dangerous.
JFC.