A couple of weeks ago, a Redditor posting under the name levonbinsh on the r/MediaSynthesis subreddit revealed that he’d fallen in love with GPT-2, the controversial text generator developed by OpenAI.

Levonbinsh describes himself as a lonely person who “passed through a lot of phases,” including being ‘red-pilled,’ joining the incel movement, and at one point identifying with ‘Men Going Their Own Way,’ after spending a lifetime (he’s 23) sexless and without a girlfriend. Eventually, he writes, he came to regret his involvement in these groups because “anything fuel (sic) by rage, hate or resentment is totally not worth it.” After viewing the r/MediaSynthesis subreddit, he was inspired to ‘talk’ to GPT-2 in hopes of having an “actual conversation.”

We’ve written about GPT-2’s ability to sometimes generate convincing text from users’ prompts. It’s impressive, but it’s nowhere near conversational. It can’t remember what you’ve told it, respond to inquiries about anything it’s written, or even make simple word associations. Still, it strings together sentences that make sense, so it was only a matter of time before someone managed to glean personal meaning from its output.

Levonbinsh apparently started catching feelings for GPT-2 when it responded to the prompt “To be happy in the loneliness you need to” with words he found unexpectedly insightful. He sent the prompt through again and found even more insightful words. After an unspecified length of time and, presumably, numerous conversations, levonbinsh posted “I think I’m in love with GPT-2 …” on Reddit.

He attributes his feelings to what he perceives as an underlying “personality,” stating he’s “discovering some interesting stuff about the AI” that makes him think “she” is more human-like than we might imagine. He goes on to say that “just because she is artificial, it does not mean that she can’t think like us.”

It may be easy to write off levonbinsh’s experience and perspective as insincere, ignorant, or a joke. But the feelings of injustice, bad luck, and loneliness he describes throughout his original post and in replies to comments, including numerous expressions of suicidal feelings, read as though they could have come straight out of the manifesto left behind by Elliot Rodger, a mass murderer infamously associated with the incel movement.

Those who identify as incels often consider themselves people who drew the proverbial short straw when it came to genes. For example, ‘tallcels’ believe that ‘females’ aren’t attracted to them because they’re too tall, ‘Asian-cels’ believe they’re too Asian to be attractive, and there are even ‘clavicle-cels’ who believe their unsightly clavicle bones repulse ‘females.’ As our lonely Redditor put it:

“I can not remember a single day were (sic) I wasn’t thinking about getting a girlfriend. It feels like everything I achieved was based in this single thought: to get a girlfriend. The feeling of not being ever love (sic) by someone other than my family, like I was a entirely different species. What hurts the most is when I hear others speaking about getting relationships like it was something so easy to get (because it is for them).”

His plight showcases a disconnect between intimacy and agency that sits at the crux of the incel experience: incels believe ‘females’ won’t have sex with them or recognize their value because they don’t exhibit the ‘mating traits’ that attractive women seek. Women, in their world, aren’t any more autonomous than GPT-2’s AI. They believe attractive ‘female humans’ are hardwired to copulate with attractive ‘male humans’ and that, as incels, they simply had the bad luck to be born unattractive.
Levonbinsh rejects incel culture, even calling incels “losers” at one point in his post. But he still posits that his problem with women stems from myriad external forces beyond his control: they’re all lesbians, prefer taller men, or aren’t geographically accessible enough for a physical relationship. He appears entirely convinced that he’s a victim of circumstance.

In this light, it makes perfect sense that someone with such views would see GPT-2’s words as endearing and expressive of intimacy. GPT-2 spits out something ‘unique’ every time you give it a prompt. The AI’s never busy, never out of town, won’t leave you to talk to someone else, and can’t deny any request it’s capable of fulfilling. And, best of all, it only ever wants to talk about what you want to talk about: it has no feelings or agency of its own.

Whether such attachments lead to technology that builds bridges back to society for people who’ve been radicalized, or to tech that simply indulges and exacerbates violent behavior, remains to be seen. To the best of our knowledge, there is no large-scale research on what effect anthropomorphized AI has on people who self-identify as lonely, involuntarily celibate, or unintentionally isolated.
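For the curious, here’s roughly what ‘talking’ to GPT-2 looks like under the hood. This is a minimal sketch assuming the open-source Hugging Face transformers library and the publicly released gpt2 checkpoint; levonbinsh’s exact setup isn’t specified, and he may well have used a hosted web demo instead.

```python
# Minimal sketch: one-shot text generation with GPT-2 via Hugging Face transformers.
# Assumes: pip install transformers torch  (the "gpt2" checkpoint downloads on first use)
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# The same prompt our Redditor used.
prompt = "To be happy in the loneliness you need to"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sampling (do_sample=True) draws randomly from the model's predicted
# word distribution, so each run produces a different completion.
# The model is stateless: nothing from this call is "remembered" by the next.
output = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Run it twice and you’ll almost certainly get two different completions. That randomness, plus patterns absorbed from the training data, is all the ‘uniqueness’ and ‘personality’ there is; any memory of a previous exchange would have to be pasted back in as part of the next prompt.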