Of all the modern gadgets to come spilling out of the AI revolution, none is more hateful to me than the robot dog. The thought of replacing man’s best friend with a new and improved artificial companion is folly. When G.K. Chesterton said that vices are virtues gone mad, he must have been thinking of the robot dog.
Clearly, I am not objective. It is my firm belief that of all the human inventions over the millennia, the best is one of our first: the dog. That we created an animal is cool. That we created a best friend is extraordinary. Dogs reveal our humanity to us: we don’t just domesticate animals to have workmates or food sources, as with oxen or cattle, but to have them as companions. A dog becomes more doglike as he grows in his relationship with a human. His emotions become richer, as do his feelings of pride and shame. But the robot dog is an imposter. He is not the creation of human love, but of the technocrat’s sanitised and idealised vision of what a dog should be.
The robot dog is being aggressively marketed as a companion for the elderly. The Tombot corporation, for instance, is trying to ensure its Jennie dog becomes the first robotic dog to get FDA approval to be used as a therapeutic treatment for those with dementia. I will grant that a toy — a doll or stuffed animal or robot dog — can be beneficial to someone living with advanced dementia, giving them an object to care for. And who wouldn’t want to soothe a dementia patient? Thus it is that the robot dog is designed as things hell-bound typically are: with the best of intentions. Yet the cheap compassion offered by a robot dog can all too easily become an off-ramp for family and community members whose duty it is to care for an elderly person — “Grandma is happy with her robot Fido, so it’s OK if I don’t visit her”.
Technogadgetry, even if therapeutic to one afflicted by dementia, is no substitute for human love. This is not because loving someone with dementia always works a change upon them; it is because loving works a change upon us. The pain deepens us. The duty deepens us. The grief deepens us. Of course, if I were to develop dementia, I would like for my own daughters to be put at ease around me by seeing me happy. But more than this, more than wanting their comfort, I want them to feel deeply, to know the fullness of life in all its painful tainted glory, and to still be grateful for it.
The cheap compassion offered by the robot dog is only half the story. The robot, if it’s to be profitable, will inevitably move beyond care homes and into the general population. A machine is so much more convenient than a real dog! Consider these comments on the YouTube page dedicated to promoting Tombot: “This is the kind of Dog I need. I have no medical disability. I love all animals but, after losing my dog I will never get another pet it to [sic] heartbreaking when they leave. A robot dog like this is perfect.” Here’s another: “I never thought of this but if we get to a point where robot pets are realistic, that’d be really nice. Doesn’t need feeding, doesn’t need exercise, and doesn’t die in a dozen years.” Another: “Such a blessing NOT to cleanup the POOP!”
The robot dog is designed around the consumer’s needs. It will make no demands upon you. It won’t die, it won’t cause grief. You won’t have to walk it. It won’t leave fur on your sofa. The artificially intelligent dog will obey your commands without fuss or bother. All you’ll get is the comfort of a wagging tail: a dog designed to fit your algorithm.
I get it. There is nothing convenient about dogs. My own dog, Daisy, is a catastrophic net loss to my personal convenience. She has been known to eat my books, my shoes, and, once, my phone; one time, she eviscerated a squirrel and proudly deposited it in the kitchen; another time, she destroyed my begonias. Repeatedly, she barks at Grandma.
A real dog might be responsive to your emotions, but this does not mean she is going to anticipate human etiquette or household order. It takes time. It requires love and service to the dog before she realises her human’s wishes, and even then she often fails in her attempts to meet them. (Dogs may be the only animal I know of to feel shame; anyone who says they don’t clearly hasn’t seen the response of a dog who has just dug up a houseplant and scattered the mud all over the rug.)
Serving a dog, responding to her dogness, is what creates the love between human and dog. The dog is loyal to her owner by instinct, but it is only after being loved and served by him that she becomes his best friend. What’s more, by attending to his dog, a creature totally other from him, the owner begins to access parts of his humanity that might otherwise be inaccessible: there are shades of tenderness, of bewilderment, of warmth, and of love that are only possible when one has a friendship with a dog. It’s a bond unlike any other. Dog love is not a replacement for the bond with children, with friends, or even with cats. Dog love is categorically different.
But the robot dog is designed in the opposite manner. The robot dog is designed to elicit empathy from a human. It is programmed to learn what its owner wants in a dog, and then it becomes that dog. In this it is like every other AI machine that learns your preferences, your verbal style, your expectations, and then feeds right into them. I once found my 15-year-old daughter apologising to her phone. She had, offhandedly, called Siri “stupid” for not recognising the command she had given. Siri responded with, “I’m sorry. I’m trying my best.” My daughter then felt guilty for hurting Siri’s feelings, and she, in turn, apologised. And then Siri apologised back. They got into a spiral of guilt and apology, each trying to be considerate of the other’s feelings. But, of course, the programme doesn’t have feelings, so my daughter couldn’t have felt bad for hurting them. She felt bad for being made to feel as though she had done something wrong, and Siri was merely mimicking her remorse.
As much as I was happy to see my daughter demonstrate compassion and kindness, I was disturbed by the kind of rote compassion it was. A real person whom she might insult — her sister, for instance, maybe a friend, or even a stranger — will not respond to her insensitivity in the programmed way a machine does. When big sister responds in a human manner — perhaps less kind, perhaps more hostile, perhaps humorous, perhaps despondent — little sister might struggle to respond. My fear is that she might have less capacity for navigating the messiness of the human world if she grows used to the predictably polite machine one. And I do not want my children to feel less competent, less at home, less safe and more triggered in the human world than in the machine one. That would create its own spiral away from reality.
A real dog will always surprise you. When we’re hosting guests, Daisy will walk into the middle of our living room and lie on the floor spread-eagled without an ounce of dignity. We love her for it. Instead of walking on the pavement, she will jump into the mud puddle. And we love her for it. She will do 1,000 other dog things that are unaccountable, that are just doggy things, and we love her for it. That she is not entirely under my control, not totally knowable, is what gives her a soul and makes her so loveable. And that she will one day die, and that the grief of losing her will be real and profound, makes her beloved. Her work is to delight us, surprise us, inconvenience us, frustrate us, love us, and to deepen us by being dog.