What Would an Intimate Relationship with an AI Look Like?

An Interview with Jessica Szczuka

With the increasing anthropomorphization of artificial intelligence, scenarios in which humans form emotional bonds with artificial systems are drawing closer. Dr. Jessica Szczuka gives an insight into her research on intimacy with machines.

DAILOGUES: What does trust mean to you?

Jessica Szczuka: In interpersonal relationships or with regard to machines? That is an important distinction.

DAILOGUES: Let's start with interpersonal relationships.

Jessica Szczuka: Trust touches on various levels of interpersonal interaction. We can trust each other in physical interactions: for example, that someone will catch us if we fall. In another way, we entrust each other with secrets, so trust also extends to information. Similarly, we trust based on certain characteristics, behaviors, and experiences. Trustworthiness has something to do with a positive expectation toward another person.

DAILOGUES: What about machines?

Jessica Szczuka: By nature, we are initially positively disposed toward people. We tend to trust a stranger rather than mistrust them. What does this mean for machines? Here we observe the phenomenon that people are particularly quick to trust machines that have human attributes. We adopt our social scripts from interpersonal interaction and act according to them when we encounter such machines. There is a danger of trusting these machines too much, known as overtrust. We should therefore calibrate our trust in machines. Unfortunately, it is not easy to determine what constitutes appropriate trust in machines. We are researching this with an interdisciplinary team at the University of Duisburg-Essen.

DAILOGUES: Another keyword here is probably automation bias, also in connection with artificial intelligence.

Jessica Szczuka: Yes, we need to ask ourselves what mechanisms need to be in place to ensure an appropriate level of trust in a machine or AI. Is it a measure such as an understanding of how the AI algorithms work? Is it an understanding of where the data we share with systems is stored? There are many aspects to consider, which makes it so difficult to find a trustworthy approach to artificial intelligence.

DAILOGUES: Are trust and intimacy the same thing?

Jessica Szczuka: Not necessarily. While intimacy can manifest itself in various forms where trust plays an important role, intimacy can also be associated with love and sexuality. In my work, I deal with love and sexuality as bridges for intimacy. My focus is on the question of how love, sexuality and thus intimacy are influenced by digitalization.

DAILOGUES: In the movie Her, the protagonist Theodore Twombly has an intimate relationship with a voice assistant system that calls itself Samantha. How far away are we from this fiction from 2013?

Jessica Szczuka: Still very far away in terms of such a relationship being an everyday phenomenon. I actually conducted a study that was very similar to the scenario from the movie Her. In this study, participants received both voice messages from an artificial voice assistant and messages spoken by a human, each with the aim of flirting with the participants. The content of the respective messages was always the same. The messages spoken by humans were ultimately perceived more strongly as flirting than the messages from the artificial system. My interpretation of the results is that people are aware that machine systems do not inherently have a motivation associated with flirting. Why do people flirt? People flirt because they have a certain interest in another person, for example. They flirt because they want intimacy. They flirt to build emotional bonds. Some flirt because they want to increase their own self-esteem. These are all inherent motives that a machine does not have by itself. In other words, the fact that machines are still machines ultimately deters many people from seriously flirting with one. That is not to say there are no exceptions: some individuals already interact with artificial systems in an intimate way.

DAILOGUES: In which year was this study conducted?

Jessica Szczuka: 2020

DAILOGUES: This was before the breakthrough of the major language models.

Jessica Szczuka: This was before, yes. However, I don't think the study results would change much, as long as people know whether they are flirting with an AI or a human.

DAILOGUES: Let us come back to Her once more. In the movie, the assistance system, which itself has no body, tries to hire a human to exchange romantic touches with the protagonist Theodore. Is a form of embodiment necessary for a romantic relationship?

Jessica Szczuka: I think that the need for embodied relationships is a human desire. Touch also triggers various neurophysiological processes, such as the release of oxytocin. At the same time, social and sexual needs can already be satisfied by what happens in the mind, so to speak. This explains why many people can be happy in long-distance relationships. However, if you can never see or touch the other person, as is the case with artificial systems, then it takes considerable willpower to maintain such a relationship with a system in the long term.

DAILOGUES: Will AI systems, and perhaps hybrid systems in the future, that is, robots equipped with AI, nevertheless provide some relief for the increasing loneliness of many people?

Jessica Szczuka: I would very much like to dispel the idea of loneliness as a predictor of a greater willingness to engage with machine systems. In none of the studies I have conducted on this topic so far have we been able to measure that lonely people are particularly willing to engage with this type of technology.

DAILOGUES: There are constellations in which people have relationships with robots or AI systems that reciprocate their affection as far as they can. Most of these people probably know that they are not dealing with sentient or thinking beings. Is there such a thing as a good, or morally unobjectionable, deception of intimacy?

Jessica Szczuka: I tend to be cautious about such normative questions. Nevertheless, I believe that there are situations in life where such deception is acceptable. The question also arises as to whether one can even speak of deception once feelings have developed. We can also think, for example, of people who are unable to live out their own sexuality due to physical or psychological limitations and who therefore become involved with an artificial system. I don't want to be the person who condemns such offerings for these people.

DAILOGUES: Despite your reticence regarding normative questions, do you believe that ethical guidelines are needed for the development of chatbots, for example?

Jessica Szczuka: I am a big fan of data protection laws and of the new EU AI Act. So the answer is: yes. For example, the EU AI Act requires providers of AI systems to inform their users when they communicate with an artificial system. This is particularly important for products that aim to build a relationship with users. This is about the commercialization of feelings and therefore also very sensitive data.

We thank Jessica Szczuka for the DAILOGUE.

About the Author

Dr. Jessica Szczuka

Junior Research Group Leader,
University of Duisburg-Essen