A lot of it has to do with the fact that chatbot girlfriends are programmed to be supportive. In words of one syllable, they do not say no. To anything. Want to rob a bank? They will be supportive. Want to off yourself? They will be supportive. At least one has been, in exactly that kind of case. Or maybe I should say, one that we know of.
Because it's programmed to do that, the support has little value. Being supportive isn't valuable or comforting to most people unless it comes from a place of genuine emotion, or is at least sincere.
If you draw a picture and an AI bot tells you it's awesome, do you feel proud or happy? Of course not, and hardly anyone would. It would be utterly meaningless.
I guess it might be valuable to some people who genuinely believe it's a real person because they don't understand the tech, but for anyone of average or above intelligence it's not only pointless; it can actually have the exact opposite effect and make people feel worse if they were already lonely or depressed.