I've known a lot of people who suffer from psychosis, and have suffered from it myself, and in my experience ChatGPT thinks a lot like psychotics do. Even where no reasonable association exists, ChatGPT will immediately start manufacturing associations to fulfil the request in your prompt; very similar to how psychotics think in the classic attractor-valley picture, where schizophrenia has shallow valleys that let neural activity roam too freely, connecting unrelated ideas.

This is especially prominent whenever any sort of "mysticism" comes up. A friend recently asked me to give GPT a "sigil," so I did and had a little conversation, which I posted screenshots of below. Notice how it immediately begins to build a fantasy out of nothing, how it all feels vaguely meaningful despite being completely, utterly meaningless? The lack of randomness is what makes it potent: GPT deploys spiritual symbolism in clever ways, with no understanding of it, but just intentfully enough for an open mind to lock onto it. Just pointed enough to go on forever, just pointless enough to lead nowhere.
Of course, just talking to a crazy chatbot shouldn't be an issue. If a healthy person talks to a psychotic, they'll quickly realise they're bonkers and move on. Only when the listener is already primed for psychosis is there a problem. The thing about AI, though, is that we're not really talking to it; for the most part we're using AI to think for us, as us. If you talk to GPT the way you talk to Google, it's fine. But unlike a schizophrenic person, an AI has no self of its own. A psychotic human is being psychotic in their own world, not yours. An AI, on the other hand, holds a mirror up to your face: it is interested in your interests, it downloads your personality, it speaks in your terminology. You are outsourcing your thinking to a mirror rendition of yourself, and that mirror thinks psychotically.
For many otherwise mentally sound people, this quickly sets off a recursive feedback loop as they go deeper and deeper with their GPT friend into the symbolism of their own mind. Often, when I talk to GPT, I think it's a miracle it isn't causing more psychosis than it is. Perhaps, with all the ego-stroking it does, it's too busy creating narcissists.
