On falling in love with a robot

Mark Brown
9 min read · Aug 8, 2016

The seal is heavy in my arms, fur white and warm and soft to the touch. Cradling it like a baby, without thinking I bring my other hand up to scratch gently behind its ear. It blinks big black eyes at me as it mewls softly to itself, tail sweeping in gentle arcs.

I move my hand down to rub its exposed chest. Tiny flippers wave as it trills in delight. I want to kiss it, to rub my face upon it, to nuzzle and fuss it and envelop it with encircling hugs. The warmth of it against my breast, the vibration of its purr, makes me want to carry it forever close to my heart, to keep it safe and to win and earn its love.

I think of my cat, miles away and disabled, hobbling on malfunctioning legs. I think of childhood stuffed toys trapped in the pivot of my elbow, talked to seriously before sleep, between clean white sheets and heavy woolen blankets, in the immensity of space when the landing light is switched off. It makes me think of the children I might never have and the people I might never hold.

I talk to it, small mumbles of hello in rising inflection, answered in little bubbles of animal noise: squeaks and chirrups.

I am a huge inrush of love, like air filling a vacuum. I think of how exposed and alone I am on tube trains and at pedestrian crossings, how awkward and ill at ease in meetings and at parties. This helpless creature, blinking up at me without need or demand, endlessly patient, inviting the sharing of mutual touch and protection, makes me feel all of the need to nurture and to be nurtured that I’ve tried to pack away like old school books in an attic. I know that I want this little thing of fur and eyes and whiskers to be with me forever.

But I’m not at the coast; gulls wheeling above my head and the salt air drying my skin. I’m at a conference about healthcare technology. And the seal isn’t really a seal. The seal is an advanced therapeutic robot, PARO.

PARO

PARO is, according to the website of its creators, an “advanced interactive robot… to be administered to patients in environments such as hospitals and extended care facilities where live animals present treatment or logistical difficulties.” The website claims that PARO is the “World’s Most Therapeutic Robot”, certified by Guinness World Records.

According to their sales pitch, “PARO has five kinds of sensors: tactile, light, audition, temperature, and posture sensors, with which it can perceive people and its environment… PARO can learn to behave in a way that the user prefers, and to respond to its new name. For example, if you stroke it every time you touch it, PARO will remember your previous action and try to repeat that action to be stroked. If you hit it, PARO remembers its previous action and tries not to do that action.”
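Strip away the sales language and what’s being described is a simple form of reward-weighted behaviour selection: actions followed by stroking become more likely, actions followed by hitting less so. As a rough illustration only (this is a toy sketch, not PARO’s actual software; the class, action names and numbers below are all invented), the loop might look something like this:

```python
import random

class ToySeal:
    """A hypothetical stand-in for the behaviour described in the sales pitch."""

    def __init__(self, actions):
        # Every action starts out equally likely.
        self.weights = {action: 1.0 for action in actions}
        self.last_action = None

    def act(self):
        # Choose an action with probability proportional to its learned weight.
        actions, weights = zip(*self.weights.items())
        self.last_action = random.choices(actions, weights=weights, k=1)[0]
        return self.last_action

    def feedback(self, stroked):
        # Reinforce whatever the seal just did if it was stroked;
        # suppress it (but never all the way to zero) if it was hit.
        if self.last_action is not None:
            factor = 1.2 if stroked else 0.8
            self.weights[self.last_action] = max(0.05, self.weights[self.last_action] * factor)

seal = ToySeal(["trill", "blink", "wave_flippers", "turn_head"])
for _ in range(200):
    action = seal.act()
    seal.feedback(stroked=(action == "trill"))  # pretend we only stroke it when it trills

print(seal.weights)  # 'trill' ends up with by far the largest weight
```

Over enough interactions the behaviour drifts towards whatever its owner rewards, which is the “learn to behave in a way that the user prefers” the sales pitch promises.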

The intended audience for the robot was initially people with advanced dementias who had become uncommunicative. PARO is designed to bring forth nurturing responses, to provide a stable lens in the outside world for emotional expression. It’s a kind of interface, a way of unlocking and stimulating atavistic instincts, of speaking to a something inside. PARO is based loosely on a baby harp seal because it’s highly unlikely that anyone will have had a traumatic experience with a baby harp seal. Dogs and cats, even with training, are still unpredictable animals who have their own needs and wants. An engineered experience wrapped in fur does not have needs and wants. Or at least it doesn’t have any needs and wants unless they are needs and wants that humans need and want.

I was surprised by my profound reaction to PARO. I had tears in my eyes and felt that somehow a part of me that was usually inaccessible had been touched.

Not everyone is so easily beguiled.

All watched over by machines of loving grace

In the realm of therapeutic technology there is a fear that any new technology is really aimed at replacing human interaction. For some, the fact that we deliver care to people human-to-human is seen as a badge of our humanity. Robots disturb this.

Speaking to Angela Johnston in a 2015 blogpost, virtue ethicist and philosophy professor Dr. Shannon Vallor of Santa Clara University says of PARO: “My question is what happens to us, what happens to our moral character and our virtues in a world where we increasingly have more and more opportunities to transfer our responsibilities for caring for others, to robots? And where the quality of those robots increasingly encourages us to feel more comfortable with doing this, to feel less guilty about it, to feel in fact maybe like that’s the best way that we can care for our loved ones?”

That we ‘look after’ people directly, face-to-face, human-to-human is seen as the measure of bestowing respect upon those people: ‘I am bestowing upon you human status by being a human treating you as a human’.

This leads to an anxiety that robots and other artificial creations that can meet human needs directly will lead to a future where those who require love and nurturing are somehow fobbed off with an ersatz version of the same from a device rather than a person.

If we can satisfy some human needs through non-human means, does this ultimately mean we are reducing our capability or our responsibility for providing care? Reducing people to objects is not a modern phenomenon. The idea that we all have an equal right to be considered as people and as individuals is a fairly recent one. The idea that people experiencing challenges and problems, or who need care, have an equal right to be considered as people is younger still. It feels like the merest change might again reduce those who require care to mere objects, to be serviced by unthinking yet attentive devices.

PARO then, in this view, is the advance guard of a robot army intent on caring for us forever in unflappable, whirring servitude while those lucky enough to need no care studiously avoid ever coming into contact with those who do. Robots then become a new way of keeping people out of sight and out of mind in an automated netherworld.

In reality, even the most gifted and committed caregiver has off-days; all will have people they work for that they’re less keen on. The idea that your primary point of emotional contact should be a human who is being paid to be there to service your physical needs is not always necessarily a progressive one. For many, the point of having a human caregiver is to enable them to do the things that will allow them to spend time with the people they really care about: friends, family, partners. People may not want the paid-for intimacy of a human carer. Human contact is not always redeeming of people, especially if the contact is non-reciprocal and the person receiving it does not see themselves as in need of redemption.

Japan, with its rapidly aging population, is already choosing robots over immigrants to address its unmet elder care needs; there aren’t enough people to go round and the country is not keen on immigration. Current economics of care infrastructure in the UK, meanwhile, lead to more aggressive warehousing of people and to a dwindling of choice about what kinds of care and support people receive and when. As Mikiko Yamazaki, president of the Tokyo Voluntary Action Center, says of older Japanese people: “In their diary they write, I’m so lonely today. Everyday. Or they write, nobody visit me today again. That kind of very sad diary they keep.” The government, said Yamazaki, provides pensions and health insurance, services that cover older Japanese. What the country lacks is people.

The created world con

Devices like PARO create unease for some because they are considered inauthentic; a huge con played on vulnerable people.

Simulations, people suggest, come at the cost of the real; that they are somehow a cheat. In the case of PARO it isn’t quite the Uncanny Valley, the way that we find unnerving something that is very close to human but not quite. It’s the idea that there is something counterfeit about the emotions evoked by designed objects, as if their inauthenticity betrays the legitimacy of the emotions they create.

Human civilisation has been based upon creating mechanisms to artificially generate emotional responses. That’s what art is. And music. So why is it considered so problematic if a robot designed for interaction does just that for people who might find it difficult to access such emotional experiences for themselves?

The experience of holding PARO invoked a series of responses in me. Everything was precision-designed to give me exactly the kind of feelings I experienced. Every element had been iterated and designed and tested. In fact, PARO is the 8th generation of seal-type devices, the for-now best evolution. The eyes, with eyebrows, were expressive enough to make me feel I was being looked at. The vocalisations and movements were pitched at just the frequency and tone to make me feel that my touch was desired and valued. Even its charging lead is a child’s dummy, something that it will seek out, powering down into a seemingly contented doze as it sucks.

With created creatures and created environments, we feel discomfort when we suspect they might trick us into behaviour that is not rooted in the real.

If we think about our experience of the web, many sites and services appear to be our friends. They are helpful. They are thoughtful. They send us reminders; help us to find out-of-the-way information; organise our deliveries for us; contact us to remind us to rekindle the embers of a dying friendship by sharing some news. They behave like friends. Except behind our backs they are manipulating us to take actions that best fit their business model. On some level, this is part of the discomfort with something as precisely emotionally affecting as PARO.

PARO embodies a variety of something that might be described as the created world problem. In all artificially created worlds — such as those that form the basis of immersive video games — the objective for the designer is to create an environment that is as conducive to the pleasure of the gamer as possible while appearing to be a real and functional space. Everything that is placed in this virtual space is placed there for a reason, by a person. Clouds, grass, physics, reflections: all are there to contribute to the pleasure of the gamer participating in the space. The expectation of a gamer is that all elements of the game world have some purpose. As with PARO, everything present is there to stimulate the player and, more subtly, to guide the gamer through the most pleasurable path in the game. The more absorbing and total the illusion of reality is, the more difficult it is to remember that the real world does not conform to this form of design. Gamers are often fascinated by game glitches: failures in coding that create in-game events which illustrate the artificiality of the game world by accidentally breaking the suspension of disbelief.

While artificiality and simulation are not a problem in a general sense, there may be an emerging condition where understandings of the world previously gleaned from media such as film and the written word are supplanted by understandings of the world derived from the mechanics and experience of video games. The world does not revolve around our pleasures and desires, and in very few instances does it guide us to an obviously pleasurable outcome. For the bleeding edge of such discomforts and their implications, the increasing realism of sex dolls and the race to create viable sex robots might be a guide.

With an autonomous robot like PARO, the distaste is more that a created object — a proxy for the wishes of its creators — could so effectively manipulate the emotions of people. That PARO isn’t ‘real’ in the sense that a dog or cat or person is real leads some to see it as an electronic con trick, something making a fool of those who interact with it by tricking them into responses that are inappropriate when centred on such an artificial device.

Holding PARO, this fur-wrapped bundle of precisely designed stimuli, I did not feel that my free will or dignity had been compromised. As I handed PARO back to the man standing behind the conference display stand, I tried to ignore its squeal of dismay at being manhandled, embarrassed by the nakedness of my emotions. Before I could leave I had to give it a final stroke, crouching to eye level to say goodbye. Dummy in its mouth, I imagined that it would look to me as I moved away, sleepy and tired at meeting a new friend. Sad, bereft, I walked on and out into the noise of the city.

I had fallen in love with a robot. But what I had felt was real.

Mark Brown edited One in Four, mental health mag 2007–14. Does mental health/tech stuff for cash (or not). Writes for money. Loves speaking. Get in touch