Saturday, April 25, 2009

Love, Sex and Artificial Intelligence

by Thomas "cmdln" Gideon

I love my computer. I'll admit it. Does that mean I have a deep, meaningful relationship with it? Not so much. I like to think that I've gotten beyond a mere techno fetish into deeper considerations of how programmable, general-purpose computers are able to help us create beauty, discover meaning, and effect change.

Sure, it started more with a fascination for form and style rather than any sort of substance. My relationship with my computer, and computing in general, has taken time to evolve.

I guess it isn't so different from the emotional relationships that characterize my rich social life as a human being. For those intriguing similarities, though, the notion of an intimate relationship with any kind of artificial construct still strikes me as preposterous.

Why is that?

Moore's law, which describes the acceleration of raw computing power as a function of transistor density on a chip, has some researchers in machine intelligence drooling over finally achieving raw computing power comparable to that housed in our humble brain pans. Recent specialization in this field of research has shown promise on the software necessary to transform brute gigaflops into something approaching general intelligence. Despite artificial intelligence having been promised as just beyond the horizon for the past few decades, it actually does seem like we may hit a tipping point within our lifetime.

I still cannot see having an emotionally fulfilling relationship with a synthetic being. More optimistic researchers are betting that intentionally and craftily inspired emotional connections will form a valuable part of the repertoire of human-machine interaction in future systems, both computational and robotic.

Donald Norman's latest book, Emotional Design: Why We Love (or Hate) Everyday Things, goes beyond his earlier efforts to understand the rational basis for design. He explores how emotion can override reason and lead us to make irrational but inwardly satisfying decisions. His work, and others' in the same vein, suggests there is value in exploiting that phenomenon to ease the frustrations many users encounter in existing hardware and software designs. It is not that emotionally designed products are better, but that they consciously tug at our inner chords to get us to put up with their other, less endearing quirks.

That's a bit cynical but you can see the optimistic scenario easily enough. Couple thoughtful, rational design with compassionate emotional design and the potential boggles the mind. Not only would you get greater effectiveness or productivity, but you'd feel good as you used the tools that made those improvements possible.

MIT has been exploring these threads of social technology for quite a bit longer, most notably with the Kismet project. Little more than a robotic caricature of a face, Kismet and its researchers seek to discover some of the core components of our emotional interactions. To hear the researchers talk about Kismet, the results are surprising and compelling.

When presented with a noisy information channel, the human mind is adept at filling in the blanks. We have apparently evolved considerable neural machinery to pull off this feat. In emotional interactions, we may have similar but less well understood abilities. We want to project and fill in the emotional gaps even in the most rudimentary systems.

More recently, tweenbot explored similar social interactions with an equally minimalistic construct. Norman, Kismet and tweenbot suggest that a simulacrum doesn't have to be pitch perfect for us to form an emotional connection.

Of course, if the appeal is made to even baser instincts, there appears to be even more latitude. Well before the media-rich web, enthusiasts of the form flooded Usenet groups with strings of seemingly random characters that, with the right arcane invocations, could be transformed into prurient images to suit all tastes. At the risk of understating things, technology and the sex industry have a long and storied relationship. Many folks have already suggested that key technology innovations, such as the DVD format and high-quality video codecs for online distribution, are the direct result of our monkey sex drives.

Sex and technology is a whole other topic to explore. No doubt there is plenty of research comparable to the emotional technology writings and projects I have cited. We, as a species, don't seem to have a problem with emotionally connecting with our technology, nor do we collectively bat an eyelash at its increasing role in the development of our sexual natures.

I remain skeptical and my objection really crystallized when Helen Madden expressed a simple idea on a panel on which we both participated at a recent science fiction convention.

What if your sex toy could say no?

It would be easy to devolve from that simple question into some pretty heavy and potentially disturbing psychoanalysis. Or to be flip and dismissive. At that moment, in the context of a discussion of love, sex and artificial intelligence, it really captured a latent but necessary leg to the tripod of a satisfying relationship. I've discussed emotional connection and intimacy, but I think neither can get past technologically mediated self-gratification without some degree of agency, of free will.

It seems so obvious in retrospect. It also represents a largely unspoken holy grail of artificial intelligence. When discussing our relationships with other social animals, we completely take it for granted. It isn't even worth mentioning.

In the context of a relationship with a constructed being, it is critical because we haven't yet been able to instill true agency into any of our creations. We are not even sure how to measure it, to know when we have truly achieved it. However, it is only when our creations are capable of evolving beyond their programming, of following independently derived desires, of saying no to us, that they achieve equal footing with the other social agents available to us. Only when there is the risk of rejection is there a sense of satisfaction in successfully developing a healthy emotional, even intimate, relationship, regardless of whether the agent's programming executes in flesh or in silicon.


Thomas "cmdln" Gideon is a self-described hacker, curmudgeon and hacktivist who ponders the intersection of technology and society on his twice-weekly program, The Command Line Podcast. A student of The Hacker Ethic, he is particularly fascinated by its contentions that computers can be used to create beauty and that they have the potential to effect positive social change. He follows a number of related topics of interest, such as the creation and distribution of social media as a form of peer production, the future of computing both as realized in its physical architecture and in the ways we program these forthcoming systems, and how computing relates to our own astonishing capacity for reasoning.

His interest in artificial intelligence combined with his habit of speaking at science fiction conventions led to his being a co-panelist with Helen Madden contemplating the intersection of social relationships, intimacy and machine minds.

Creative Commons License

Love, Sex and Artificial Intelligence by Thomas "cmdln" Gideon is licensed under a Creative Commons Attribution 3.0 United States License.



  1. Greetings, Thomas,

    Welcome to the Grip, and thank you for your thought-provoking post.

    I find that it echoes some of my comments in response to Garce's post on Wednesday. I can't imagine wanting a sex bot that couldn't think for itself and have its own desires and pet peeves. I suppose if AI ever gets to that point, it will be indistinguishable from humanity, and I wouldn't care.

    Meanwhile, Helen's stick figure animations illustrate your other point. Her cartoons are minimally realistic, yet they succeed remarkably well in conveying emotion. We are hardwired to seek emotional responses and to fill in the blanks, as you put it, when the information is incomplete.


  2. Thanks for bringing your thoughts to the Grip, Thomas. I love my computer, too, and sometimes hate it. But that's about the extent of the emotion.

    The 'sex and technology' views this week have been enlightening and fun.

    Take care!


  3. Thanks for coming on our little list. Very interesting post on an interesting subject.

    You say you think it's preposterous for people to relate intimately to an artificial construct. I don't think I agree with that, considering how I've seen people relate to their pets, including non-mammals like fish and turtles.

    I think the real problems will begin if and when people do relate to machines, and I predict a lot of the problems will come from religious people. Why does religion hate evolution so much, compared to, say, the law of gravity? No fundamentalist of any religion denies the law of gravity. You won't find anyone who works in virology who denies evolution by natural selection. But the idea takes man and places him among the animals, rather than at the center of a celestial universe powered by the love of God. This drives religious people crazy, the way Galileo did when he said the earth is not the divine center of the heavens, it just goes around the sun like every other planet.

    The point that you bring up (Helen said) about a sex toy being able to say no, I think is really interesting. That would add the third dimension. Lisabet said something in response to my post, that sexbots would be appealing to men who are afraid of women, or afraid of rejection, because it eliminates the element of independence. That of course is why men patronise prostitutes. Hookers do two things: 1) pay the fee and they don't reject you; 2) when you're finished, they go away. On an emotional level nothing could be more safe and appealing for some men. A sexual Happy Toy. Robots could do that too, but as I explored in a story called "The Doll", that can get dull. A sexbot that can have a headache would be really interesting.


  4. This was so intelligently written that it made my head spin the first time I read it. I really wish I'd had internet connection on Saturday so I could have responded then! But my thoughts on this are that you're right on the nail. People do respond to minimalist recreations of emotion (and research has shown that animals will respond in similar fashion!). We fall in love with anything we can anthropomorphize, be it a doll or cartoon character or a computer simulation. Yet for that emotion to really be love, there has to be the risk of rejection.

    Strangely enough, people already engage in relationships where they feel love for someone but never say so, and thus never run the risk of being rejected. I wonder if these would be the same people who would buy into having a robot lover, but only up until the point where the robot could say no.

    Thank you again for being our guest this week!
