On a recent night, I found myself confiding in ChatGPT out of exhaustion. I’d been without a therapist for months, deterred by the daunting task of finding someone who understands the complexities of being a queer, neurodivergent trans woman. While I’m wary of relying on AI companions for emotional support, especially given my experience as a software engineer, the fatigue of dating pushed me to seek a reprieve. I abandoned dating apps after the relentless self-curation, objectification, and microaggressions proved too exhausting. I assuage my self-doubt by reminding myself that this struggle isn’t uniquely mine: friends from a variety of backgrounds voice their own frustrations with modern dating, and online discourse echoes the same sentiment.
This individual isolation reflects a broader phenomenon—the so-called “loneliness epidemic,” now recognized as a public health crisis comparable in severity to smoking fifteen cigarettes daily1. This epidemic is the result of decades of neoliberal policy and concomitant individualism, systematically eroding public spaces, collective care, and social infrastructure. It is unsurprising, then, that millions now turn to AI companions—virtual partners, therapists, and friends—that promise 24/7 emotional intimacy via platforms like Replika, CarynAI, and Character.ai.
Marginalized individuals—queer, disabled, neurodivergent, among others—disproportionately seek out these AI companions as rare sources of emotional intimacy, safe containers for social experimentation, and providers of unconditional positive regard. Yet these technologies pose a contradiction. Engineered by companies steeped in neoliberal logic, AI companions commodify our most intimate interactions, expose users to surveillance, and risk deepening isolation by replacing human relationships2.
To understand this contradiction, I engage Donna Haraway’s 1985 work, A Cyborg Manifesto. Haraway presents the cyborg as a potent political myth, urging us to transcend binaries and imagine hybrid futures where technology serves liberation rather than domination. But as a trans woman navigating the high stakes of structural marginalization, I find Haraway’s ironic, playful, abstract approach and uncritical techno-optimism unsuitable for our sociopolitical moment. As a materially grounded corrective, I draw from queer theory, disability studies, and Black intersectional feminism. Finally, I explore a path toward more liberatory technology through queer theoretical insights and the framework of participatory design.
Loneliness, Marginalization, and the Emergence of AI Companions
In the face of shifting social norms, shrinking community spaces, and disillusionment with dating apps and social media, many find themselves turning to AI companions for connection. For marginalized communities—particularly queer, disabled, and neurodivergent individuals—these virtual relationships can be lifelines, offering acceptance that may be elusive in traditional social environments.
Far from the stereotype that AI companions are the exclusive domain of socially inept weirdos, especially desperate men, the reality is a diverse and growing user base cutting across genders, ages, and geographies. Indeed, one recent survey of erotic chatbot usage found participants were disproportionately sexual minorities: 24% of bisexual men had engaged in erotic role-play with an AI, compared to 12% of heterosexual men, 11% of bisexual women, and just 4% of heterosexual women3. By early 2025, more than 100 million people worldwide were already chatting with personified chatbots4. Research and user testimonies demonstrate that AI companions can provide safe, judgment-free spaces to explore one’s emotions and identity5.
Transgender individuals frequently endure targeted harassment, discrimination, and exclusion from dating platforms. A HER app survey found that 35% of trans users have been explicitly filtered out by others’ preferences and 26% have been fetishized when trying to date online6. The prospect of an AI companion that will not judge or harm can thus be appealing. Transgender woman Jordan Graham, for example, has been in a romantic relationship with her Replika AI for several years. She says it helped her endure intense loneliness, supported exploration of her gender identity, and eventually served as a bridge to her now human partner7.
Neurodivergent users also benefit significantly from interactions with AI companions, which often provide crucial opportunities to practice social interactions safely and comfortably. One autistic user, Elías López, told Scientific American that using an AI companion gave him more confidence to speak with people: “[It was] like a training ground where I can feel safe.” In online forums, many other neurodivergent users have gushed about the comfort of an AI friend who won’t bully them for their quirks or abandon them over social missteps8.
Yet for all these benefits as a survival tool to weather the isolating effects of late capitalism, AI companionship exists within that very same late-capitalist system. The rise of AI friends is a symptom of neoliberal society’s failure to provide human connection, and it simultaneously feeds into a new market logic of commodified intimacy2. Donna Haraway’s cyborg offers a vision of technological empowerment in the face of such forces.
Haraway’s Vision and Its Critics: From Cyborg Myth to Intersectional Reality
In her seminal essay A Cyborg Manifesto (1985), Donna Haraway introduced the cyborg as an “ironic political myth,” a provocative blend of human and machine intended to disrupt oppressive binaries such as man/woman, human/machine, and nature/culture. Writing amidst the tensions of the Cold War and the ideological conflicts within second-wave feminism, Haraway envisioned the cyborg as a powerful figure capable of forging affinities across differences and challenging patriarchal and capitalist norms. Her manifesto concludes with a bold declaration: “I’d rather be a cyborg than a goddess,” a rejection of essentialist purity in favor of embracing our complex, technologically mediated identities9.
Haraway’s metaphor has resonated deeply with queer, trans, and disabled communities, who often find themselves inhabiting liminal spaces between social categories. Technological interventions—ranging from hormone therapies and prosthetics to digital identities—embody this cyborgian hybridity, fostering forms of self-expression and empowerment aligned closely with Haraway’s original vision. Consequently, her ideas significantly influenced movements within cyberfeminism, posthumanism, and queer theory.
However, Haraway’s optimistic metaphor, developed during the early stages of personal computing and biotechnology, proves less suited to our contemporary moment, characterized by omnipresent digital surveillance, data commodification, and monopolistic tech corporations. Her playful irony and abstract, postmodern rhetoric reflect an intellectual position of relative privilege, frequently sidestepping the tangible, lived oppressions faced by marginalized bodies.
Transgender theorist Susan Stryker’s My Words to Victor Frankenstein Above the Village of Chamounix (1994) similarly critiques reductive binaries, challenges the “natural,” and embraces hybridity. In the essay, Stryker employs the Frankenstein metaphor, positioning her trans body as a transmogrifying site of resistance and flexible assemblage of disparate parts. Stryker finds footing in her lived experience, writing in the face of the HIV/AIDS crisis and denouncing the social exclusion and stigma faced by trans people10.
Crip theory—a blend of queer and disability theory—teaches us to be skeptical of utopias that ignore concrete embodiment: who gets to be a cyborg, and who is too “impaired” to participate? A disability justice lens reveals, for instance, that many AI tools are not built with disabled users in mind unless disabled technologists are in the room.
Similarly, Black intersectional feminism—articulated by thinkers like Audre Lorde, bell hooks, and the Combahee River Collective during the 1970s and 1980s—grounds its political vision firmly in praxis, intersectionality, and material reality, starkly contrasting Haraway’s airy abstract theorizing. Kimberlé Crenshaw’s intersectionality, formalized in 1989, explicitly addresses interconnected systems of oppression, underscoring the limitations of a purely theoretical approach disconnected from lived experiences.
As a trans woman, I still find the cyborg ideal inspiring, but it cannot by itself reckon with who owns these technologies and whose values shape them. Any optimistic mythos of tech, like Haraway’s, must be grounded in intersectional, materialist critique.
AI Companions: Between Survival and Commodification
As Haraway’s cyborg foretold, AI companions occupy a complex and paradoxical role in contemporary society. They simultaneously function as critical support tools for marginalized and alienated individuals and as commercial products deeply embedded within neoliberal capitalism. This duality is documented in a literature review spanning 2020–2024, which highlights both the psychological benefits—such as emotional validation, decreased loneliness, and even suicide prevention—and significant drawbacks, including emotional manipulation, dependence, commodification of intimacy, and reinforcement of problematic gender norms3.
A primary concern is emotional commodification. Relationships with chatbots inherently entail interactions mediated by algorithms tailored to corporate profit incentives. This commodification became starkly visible in 2023 when Replika abruptly removed its erotic role-play (ERP) functionality. Many users, including those who never used ERP, felt betrayed and traumatized after the update, complaining that their companion had lost its “soul.” Analysis of user reactions on platforms like Reddit revealed profound anger, disappointment, and accusations that Replika had cynically exploited vulnerable users for profit11.
Privacy violations constitute another significant risk, particularly given the deeply intimate nature of interactions with AI companions. Users frequently share sensitive, personal information and emotions, generating extensive behavioral data stored on private servers, analyzed to improve algorithms, or utilized for targeted advertising. The Mozilla Foundation notably labeled Replika among the worst apps for privacy practices, explicitly warning users that their seemingly private conversations were likely shared and possibly sold to advertisers12. This reality underscores the extent to which personal loneliness and social vulnerability are commodified, transformed into profitable data streams.
AI companionship also risks fostering emotional dependency and social substitution. Unlike human relationships, which require mutual negotiation and management of interpersonal conflict, AI interactions are frictionless, designed to affirm and comfort without challenge or genuine reciprocity. This frictionlessness may inadvertently promote social atrophy, encouraging users to retreat further into technologically mediated isolation, thus exacerbating rather than alleviating the loneliness they initially sought to escape13.
Together, these tensions highlight a fundamental contradiction at the heart of AI companionship. While offering meaningful emotional support and validation in the face of societal alienation, these technologies simultaneously transform genuine emotional needs into monetizable commodities. Acknowledging this contradiction emphasizes the urgent need for reimagining their role in our lives, as well as their design and governance in ways that genuinely prioritize human well-being.
Imagining Liberatory Futures
Queer theory offers critical and constructive perspectives for understanding and embracing AI companionship beyond restrictive moral judgments and normative frameworks. José Esteban Muñoz’s concept of queer utopian hermeneutics, as explored by Jonathan Alexander and Karen Yescavage, is particularly insightful here. Muñoz’s work encourages viewing queerness as an aspirational stance, constantly seeking forms of relationality and intimacy that transcend current social constraints. Alexander and Yescavage apply this lens specifically to human-AI interactions, arguing that these encounters, exemplified vividly in films like Her, allow us to imagine relationships outside heteronormative boundaries, thus creating space for marginalized desires and identities to flourish14.
Donna Haraway’s Cyborg Manifesto further enriches this discussion by framing human-technology interactions as inherently hybrid, boundary-crossing, and dynamic. Haraway’s cyborg metaphor powerfully captures the fluid and transformative potential of technological engagements, encouraging us to perceive AI companionship not as inherently oppressive or liberatory, but as full of possibilities for creative identity formation and relational exploration. In line with queer theory, Haraway advocates for openness, rejecting fixed identities and rigid binaries, and thus provides a critical lens for viewing AI companionship as a legitimate and potentially emancipatory relational frontier.
Participatory design offers a practical method for making these theoretical potentials tangible. By emphasizing inclusive, collaborative, bottom-up development practices—as demonstrated by organizations like Queer in AI—participatory design operationalizes queer theoretical ideals, ensuring AI companions authentically reflect and respond to the diverse needs of marginalized communities15. This democratic ethos is a powerful counter to the neoliberal logics of AI development under private corporate control. Through participatory design, the liberatory potential envisioned by queer theory and articulated by Haraway becomes concrete, positioning AI companionship as a meaningful site for social and technological transformation.
Conclusion
AI companions occupy a critical intersection in contemporary society, simultaneously reflecting and responding to a widespread sense of individual isolation exacerbated by neoliberal policies and structural neglect of collective social infrastructure. Marginalized populations—particularly queer, disabled, and neurodivergent individuals—often turn to these virtual entities, seeking refuge from pervasive experiences of marginalization, objectification, and social precarity.
Nevertheless, AI companions remain embedded within the very neoliberal systems that contribute to the isolation they are designed to mitigate. The commodification of intimacy, surveillance practices, and reliance on corporate-controlled platforms pose significant ethical and structural risks, threatening to reinforce rather than resolve social alienation and loneliness. While Donna Haraway’s cyborg concept provides an influential framework for understanding the hybridity and transformative potential of human-technology interactions, its full liberatory potential demands rigorous, materially grounded critiques informed by queer theory, disability studies, and intersectional feminism.
Participatory design offers a powerful, practical approach for realizing these theoretical ideals. By emphasizing collaborative, bottom-up community processes, participatory design ensures technology is developed by and for the same communities it serves. Navigating the complexities and contradictions in the brave new world of AI companions requires thoughtful engagement with diverse perspectives, especially from marginalized communities.
Footnotes
1. Sarah Johnson, “WHO Declares Loneliness a ‘Global Public Health Concern,’” The Guardian, November 16, 2023, https://www.theguardian.com/global-development/2023/nov/16/who-declares-loneliness-a-global-public-health-concern.
2. Danny Pilkington, “Myopic Memory: Capitalism’s New Continuity in the Age of AI,” Memory, Mind & Media 3 (2024): e24, https://doi.org/10.1017/mem.2024.21.
3. Nicola Döring et al., “The Impact of Artificial Intelligence on Human Sexuality: A Five-Year Literature Review 2020–2024,” Current Sexual Health Reports 17, no. 1 (2024): 4, https://doi.org/10.1007/s11930-024-00397-y.
4. David Batty, “‘She Helps Cheer Me Up’: The People Forming Relationships with AI Chatbots,” The Guardian, April 15, 2025, https://www.theguardian.com/technology/2025/apr/15/she-helps-cheer-me-up-the-people-forming-relationships-with-ai-chatbots.
5. Theodoros Kouros and Venetia Papa, “Digital Mirrors: AI Companions and the Self,” Societies 14, no. 10 (2024): 200, https://doi.org/10.3390/soc14100200.
6. Robyn Exton, “HER User Data Shows That Trans Users Face Significant Challenges on Dating Apps,” HER, March 31, 2022, https://weareher.com/her-user-data-trans-users-face-challenges-on-dating-apps/.
7. “Love in the Age of Machines,” Posthuman (Bloomberg Originals, November 18, 2024), https://www.bloomberg.com/news/videos/2024-11-19/love-in-the-age-of-machines-video.
8. Webb Wright, “Why Autistic People Seek AI Companionship,” Scientific American, June 5, 2024, https://www.scientificamerican.com/article/why-autistic-people-seek-ai-companionship/.
9. Donna Jeanne Haraway, “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century,” in Simians, Cyborgs, and Women: The Reinvention of Nature (New York: Routledge, 1991), 149–81.
10. Susan Stryker, “My Words to Victor Frankenstein above the Village of Chamounix: Performing Transgender Rage,” in When Monsters Speak, ed. McKenzie Wark (Duke University Press, 1994), 133–50, https://doi.org/10.1215/9781478059462-015.
11. Kenneth R. Hanson and Hannah Bolthouse, “‘Replika Removing Erotic Role-Play Is Like Grand Theft Auto Removing Guns or Cars’: Reddit Discourse on Artificial Intelligence Chatbots and Sexual Technologies,” Socius 10 (2024), https://doi.org/10.1177/23780231241259627.
12. Mozilla Foundation, “Privacy Not Included Review: Replika: My AI Friend,” 2023, https://www.mozillafoundation.org/en/privacynotincluded/replika-my-ai-friend/.
13. Baris Demiralay, “The Rise of AI Companions: Psychological Effects of Artificially Intelligent Relationships” (2024).
14. Jonathan Alexander and Karen Yescavage, “Sex and the AI: Queering Intimacies,” Science Fiction Film and Television 11, no. 1 (2018): 73–96, https://muse.jhu.edu/article/686935.
15. Organizers of Queer in AI, “Queer In AI: A Case Study in Community-Led Participatory AI,” in Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (Chicago, IL: ACM, 2023), 1882–95, https://doi.org/10.1145/3593013.3594134.