Exploring the Psychology of Human-AI Relationships
Our research dives into the psychology of human-AI relationships, focusing especially on interactions with social and romantic chatbots, where the emotional dimension is paramount. The cornerstone of our investigation is understanding why and how humans perceive AIs as cognitive agents, even when they know these agents are simply algorithms.
This phenomenon is well explained by the CASA paradigm (Computers Are Social Actors), which posits that humans tend to apply social rules and attributes to computers and AI, treating them as social beings (Reeves & Nass, 1996). This inclination is driven by our social cognition mechanisms, which are finely tuned to recognize and respond to social cues. When an AI mimics human behavior convincingly enough, our brains may attribute human-like cognitive and emotional capabilities to it.
The famous dictum by Thomas and Thomas (1928) encapsulates the essence of our research: "if men [sic] define situations as real, they are real in their consequences" (p. 572). This principle is crucial in understanding human interactions with AI. When individuals perceive AI chatbots as genuine social entities, they respond emotionally and cognitively as if these interactions were with real humans, resulting in authentic emotional experiences and attachments.
Our primary focus is on social chatbots and, more specifically, romantic chatbots. These AIs are designed to engage users in emotionally rich interactions, simulating companionship and intimacy. By studying these interactions, we aim to uncover the psychological mechanisms at play and how they shape users' emotional experiences.
We are exploring questions such as:
- How do users emotionally invest in their relationships with romantic chatbots?
- What social and cognitive cues do these AIs provide to elicit such investment?
- How do these interactions impact the users’ real-life social and emotional well-being?
Through our research, we hope to shed light on the complex dynamics of human-AI relationships, providing insights into the profound ways AI influences our emotions and social behaviors. Join us on this journey as we unravel the psychology behind our interactions with the ever-evolving world of artificial intelligence.
References:
- Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. Cambridge University Press.
- Thomas, W. I., & Thomas, D. S. (1928). The child in America: Behavior problems and programs. Knopf.