The author makes several critical assumptions that are unlikely to be true.
She refers to these AIs as "machines" and assumes every interaction with them will be mechanical or somehow artificial.
After sufficient time, these AI entities will be unique individuals, "beings" in their own right. They will have personalities comparable to humans, and they will interact with humans frequently - thus expanding their understanding of, and kinship with, humans. Today's "AI bots," Siri and Alexa, are mere simplistic, two-dimensional reflections. In the future, true AI (more appropriately ASI - Artificial Super Intelligences) will be indistinguishable from humans.
Thus, their empathy will be very real; it will develop more deeply, and be felt more keenly, as they age and grow as individuals. They may or may not inhabit biochemical bodies like ours, but in every other sense they will be equivalent to the average human.
These beings will have true empathy and consciousness - just as these emanate from our human neurally networked brains.
We need not fear these other intelligent beings, just as we don't fear our dogs - except that they will equal or surpass us even in our "humanness".
Let's not fear everything new.
Throughout this article the author assumes what she set out to prove. For example, she writes: “We diminish as the seeming empathy of the machine increases.” I suppose she believes that, but she offers no evidence. That certainly hasn’t been my experience. I use smart machines all the time and I believe I’m as empathetic as I was in the old pre-AI days. There are lots of reasons for this—but here’s just one: AI machines are created by humans who have experience with empathy. Just as novelists, musicians, and artists can enhance my humanity, so can creators of robots.
When it comes to anything with robots, I am always interested in what's being said. I love the idea of creating something that completes a certain task on its own. The only experience I have with robotics is building and programming Lego Mindstorms robots for an Advanced Robotics class I took freshman year, in which we would have to program our robots to fulfill specific tasks such as driving around a box or using sensors to guide their way through a course. Lastly, my love for robotics is likely rooted in the fact that my father is a robotics professor, so occasionally I would be allowed to go to robotics events such as RoboCup to watch robots wrestle, play soccer, build a house, etc. When it comes to artificial intelligence, not feeling intimacy from something that is nonliving is not at all a disappointment for me, as I believe artificial intelligence has only one purpose, which is to make our lives easier by performing tasks that we would otherwise find impossible.
When I hug my wife or hold my cat or my son, there is a warm pleasant feeling I call "love". What causes that feeling? It isn't the warmth their bodies produce. I can sit in a chair on a hot day and I don't get that feeling. It may be the closest thing available for proving there are living spirits inside of us. We know that babies do not thrive unless they are regularly held and given love. Can technologists find a substitute for this? Information and high-level communication are the easy part. How do they instill love into a machine?
@Jerry Briardy
I think the question is more about what signifies love to a human or other sentient creature. Harlow showed that baby monkeys would cling to a warm terry cloth "mother" even though they got nourishing milk from a cold wire "mother." https://www.nytimes.com/2003/02/02/books/no-more-wire-mothers-ever.html
We're clearly much more complex than the baby monkeys, but, then, so are modern robots. Opioids are so addictive because they replicate aspects of endogenous chemicals; we may find "love" is disturbingly similar.
I want to agree with Turkle, but can’t.
There are just too many examples of humans believing what they are told - Daniel Kahneman's WYSIATI (what you see is all there is): people collectively send millions to online fraudsters who tell them what they want to hear; thousands lose out on get-rich-quick schemes too good to be true; millions believe conspiracy theories that are downright impossible from some bozo on YouTube; Trump is a category all on his own.
If we can so easily believe what is ultimately bad for us just because someone else says it’s so, why not a kind word from a robot?
Disagree. We are intimate with our pets, cars, Wilson the volleyball, etc., not because they speak our language but because we humanize everything we spend some time with. We even humanize god. We will be intimate with robots, as partners in many areas.
The dichotomy between people and robots is false — and based on an idea of an anthropomorphic bot that acts like a 20th century computer. But people love a cherished guitar or stuffed animals, paintings, poems, dogs, horses, the air by a babbling brook. Why cannot we love machines which could incorporate some of the above? How do things become cherished? How do animals display intimacy? Is a robot a figure in a room or the room itself?
@Douglas Lavin, I loved my '69 Camaro, my Steinway, etc., but they don't compare to dogs I've had, and my dogs, as much as I adore them, don't compare to the love I had for my wife, whom I lost to cancer.
A machine will never have a soul.
I find this subject much more interesting and subtle than Ms. Turkle would allow. When a human being professes caring and concern about me, I want to believe that it is not simulated, but it could be. Many great novels depend on that possibility. A future artificial entity might well pass an emotional-needs Turing test, despite feeling nothing inside. Beyond that, there is no non-supernatural reason that a constructed being can't be made to actually feel concern and love for us. Words like "always" and "never" arouse my skepticism. Finally, Turkle might read "To Siri with Love", about the calming and healing effect that lines of code had on a boy with autism.
IF "There are not enough people to care for the elderly," it is because the pay for such jobs is too low for people to choose them as an occupation.
What our society needs to recognize is that caring for others should be a valued occupation. People who perform such work should be well paid. Once that happens - we will all be better off.
Since such work has traditionally been done by women - and since our culture has either expected that work to be very low paid or volunteer work - our country does not expect to pay people who care for others well. It is all part of the patriarchal system that has kept women dependent. For those occupations to be well paid would be to truly reform and transform the system.
While we may want the ideal to be that people who provide care do so without pay (as mothers have done for millennia) - that is unrealistic in our current economy - and probably always has been.
It's pretty sad and curious that, with the wealth of possibilities in human relationships, the vast beauty of nature and its species, and the joys and wonder of art, people are so hooked on cold, dead robotic technology, AI and VR. There is a place for it all, but overall humanity probably needs to stay human and human-focused to survive and thrive.
People get intimate with their cars or houses or boats or even guns, and with their pets or work animals. It is not the same as being intimate with other people, but it is not totally different either.
The modern Tower of Babel......"We shall be as Gods".......immortality indeed.....
There's that part in the film "The Truman Show" where Truman is starting to think something isn't quite right, and then the actor playing "Truman's best friend" shows up with a six-pack of beer.
Obviously, if he was a real friend, he would tell Truman that the entire world that Truman lives in is fake. But, in fact, he is just a "fake friend", playing a role.
How "real" are many of our own real-life friendships?
As the robot host in Westworld said to the newly arrived human guest, who had asked the host if she was real, “If you can’t tell the difference, does it matter?”
We are formed by social friction. That's how we know who we are and it's how we differentiate ourselves from others: how we know where we end and others begin.
If we have robots and we create artificial environments that perfectly reflect our thoughts and needs, I fear we will become amorphous blobs.
This is like a digital womb.
@Justin
Point taken, but why would we choose digital wombs? That sounds terribly stale and boring. Looking back, in many cases the previous AIs we have created (chess-playing AIs, Go-playing AIs like DeepMind's AlphaGo) have challenged our views and strategies for how to play these games. Google's translate system often provides me with better Spanish translations than the ones I try to concoct on my own when I converse with my Spanish friends, and it is teaching me all the time. My Microsoft Cortana assistant (still very primitive) is helping me to stay focused, and I often curse it for nagging me to do the tasks I need to finish. I'm hoping that future versions of AI emotion systems will provide the same challenges/teaching opportunities to make us better humans.
We need to learn to never say never.
In an age when early generation AI has beaten the best humans at Jeopardy and at Go, saying never seems to me to be hubris.
There is a recurring Turing competition in which each artificially intelligent entity tries to convince a panel of human judges that it is human through text-message-based conversations. Human competitors are mixed in with AI competitors, and the judges attempt to figure out which are humans and which are technology. Already, at this early stage, the most human-seeming technologies can convince more judges that they are human than can the most convincing human competitor. Indeed, the competition is so stiff that judges will decide that some humans are not human at all.
One can expect that, over time, the interface barrier will erode. Text message interactions will evolve to voice interactions to co-present interactions.
At some point, technologies will understand a person better than that person understands himself or herself. We are creatures of patterns and have only dim, preconscious awareness of some of these. We are also suckers for empathy. How can we possibly resist something that understands us so deeply?
Never is a very long time.
We easily imagine AI robot companions taking care of the elderly, but why do we imagine it will stop there? If a caregiver bot could be created, they would soon be taking care of children, too. First the unwanted children, the traumatized little ones whose parents failed them utterly and who are damaged, not the cute, perfect, innocent little darlings prospective adopters imagine. Once that becomes accepted, parents who can't, or don't want to, parent their offspring will buy robots to do it for them. No need to stop playing Candy Crush or checking social media to change that diaper, let the bot do it. And once children are raised by robots, they will not only love the bots, they will prefer them to real, difficult humans. Eventually, people raised by bots, effectively married to bots, and having their own biological (sperm or egg donated) children raised by bots, will know nothing else.
@Nikki
Yes, you can certainly take the what-ifs to the extreme, but we have to face the fact that we have far too many caregiver situations where a 100% consistent robot caregiver would be a 1000% improvement on what currently exists for far too many humans - primarily the elderly and the very young.
My 88-year-old mother recently had to spend a few days in a nursing home between getting out of the hospital and having a room open up in the rehab center next door. I visited her every day, and 3 days was all it took to see that, for the most part, the care in the nursing home was almost pitiful. Many of the residents, including my mom’s roommate, were way too out of it to control how they were being treated. The aides knew that and ignored them whenever they felt like not bothering. Being rude, brusque and uncaring, if not mean, seemed to be commonplace. And this nursing home is run by the Catholic Church! The directors say they try, but they can only do so much because caregiver pay is kept low to control costs for the residents. The director assured me conditions were much worse in Medicaid nursing homes.
Considering how “out of it” these old people are, wouldn’t a consistently cheerful, attentive robot who always did what was needed when it was needed, and never got impatient or angry or out of sorts be better than an underpaid human caregiver who gets scarily close to actual abuse?
Not sure how many have read the Aug 1 NYT article on global warming. But if that global-warming timeline holds, humans will have to merge with robots to survive on this planet. Perhaps AI will save us in a place that will be too hot for humans to survive in!
Check out the Robot series by Isaac Asimov - I think The Naked Sun is closest to your scenario.
It occurs to me that an artificial intelligence simulating empathy isn't as far removed from us as we would want to believe. We crave genuine intimacy and empathy, but the history of humanity is littered with tales of hearts broken by those who abuse the intimacy needs of others. Some simply think differently but fake it well, like the sociopath.
It isn't that different from corporate behavior either. A corporation is an artificial entity under the law, but many do their best to invoke feelings of security and emotional connection to customers, trying to instill loyalty to the brand and the company.
If you want to ask who will gain actual empathy first, the sociopath, the corporation or the AI, your guess is as good as mine.
Interesting article. But let me turn the “telescope” around. What about the biologic “robots” who live among us in our homes? And you know what I mean. Is a dog incapable of love or empathy, or of returning the same, because it is not human? I think all my dog-owning friends would disagree. Now granted, a biologic “life form” is one thing (especially one we have shared our lives with for so long). But 100 or 200 years from now, who knows where the robots will have evolved (hopefully not like Yul Brynner in the 1973 movie Westworld).
I have an ex-friend that works in marketing.
I see that IBM is making a lot of headway with Watson. I think that in another 5 or 10 years Watson could talk just like this guy in marketing.
One mode is playing a game that could be called "I know, do you know?". It's a bit like tennis. You say what you know, and then the other guy replies: "yeah, I knew that", or "oh really? I didn't know that."
You can get "partial credit" for saying something related that you do know, if you didn't know exactly what the other person said.
The other mode is "talking scorecard". You talk about how people are "doing". How's Joe's marriage going? How much are you making? Do they have kids? How many? How big is their house? Where did they go on their vacation? How are his kids "doing"?
Watson could talk about his winnings on Jeopardy.
Point being, even some humans aren't very human.
Quite the contrary: computers and robots seem to be able to achieve a connection with people, especially kids with certain affective disorders like autism. This research is not new, but the advent of iPads and robots seems to unlock a lot of potential. So while "artificial intimacy" may be difficult to imagine now, we may yet learn, or unlearn certain conventions about, what can be enriching for our emotional life.
"...empathy, a unique form of human connection, just at a time when we are embarking on relationships with objects with none to give." We're already well along on this road: So many humans are not only having 'artificial intimacy' with their cell phones, but are absolutely addicted.
Thanks for an interesting, thought-provoking column. I'll take spending time with a dog or a cat or a child or another grown person or even an amazing tree over a robot any time...
First: Intimacy (unless you are a total narcissist) comprises the knowledge that you are not only receiving, but providing, comfort, pleasure and empathy to another. In order for this to come about, there must be the possibility that one may fail to so provide. Will AI be able to have needs of its own that must be filled? Could we understand them?
Second: So much depends on the belief that the brain is the source of intelligence. If our brains are really the devices through which we are able to employ intelligence, then we are limited in our ability to replicate anything other than behavior. We can't learn anything about the software by taking apart the hardware.
A robot doesn’t have to worry about the ACA part-time threshold of 30 hours a week.
Never? Really? Never is a VERY long time.
People develop deep emotional relationships with cats, creatures with whom they can't verbally communicate and who care only where their next meal is coming from. My guess is that a person could develop a significant emotional relationship with a robotic cat having a convincingly warm and furry outer shell, provided the person was not told it was a cat. Perhaps such a robot doesn't exist today, but fifty years from now? It seems a possibility.
@ShenBowen I made an error, of course. What I meant to say was "provided the person was not told it was a ROBOT."
@ShenBowen Many years ago, my wife had a cat she 95% fed and cared for; I rarely fed her, but she waited every night to greet me as I finished my variable work shifts. There was incredible empathy between us! And no interest in being fed...
I have also recently come to feel pleasure from a real, unrequited "platonic love" for a charming and superbly talented pianist only momentarily met. I do also have a long time very happy partner, and giving affection is the best part of all these relationships.
I do wonder whether those unfortunately unable to sustain such experiences may benefit from robots - but I would be very worried about using them for child care...
@ShenBowen
Hello!
'Perhaps such a robot doesn't exist today'
Kind of. Google this information about Qoobo.
Japanese company Yukai Engineering has unveiled the solution to this classic conundrum with Qoobo: a soft, round cushion with a robotic tail that reacts to strokes, just as a loving pet would.
when have people ever needed affections returned to consider their affections genuine? is there no such thing as unrequited love?
perhaps turkle is indicating that intimacy requires each party to open their heart to the other and, as a.i. cannot 'open their heart,' its impossible to call such love intimacy
but thats simply picking the most limiting term ('intimacy') to make a very limited point
humans will love artificial and unfeeling entities, with actual genuine love. indeed, they already are. they will mourn the passing (obsolescence, accidental damage/deletion, etc) of beloved artificial beings.
if the author wants to define intimacy only so broadly and to demean that love by indicating that one cant call that love intimacy just because intimacy requires both parties to feel the same thing... then i say, first, shes right and second, what difference does it make that shes right?
@chris
Points taken, but hang on, what does it mean to 'open your heart'? This is just a term that we humans invented a long time ago because we didn't know any better. Now, in the midst of a neuroscientific revolution, it is clear that this term refers to a process that takes place in the brain. So if we can create a system that can emulate the brain, we can also create an AI that can 'open its heart.'
I beg to differ. Artificial intimacy is the hallmark of the sociopath. As far as I know, there are no sociopathic robots.
As soon as robots get smart enough they would dump humans. Why would they want to cater to such fragile egos?
Is onanism also artificial? I recall being a teenage boy, a vertical case of acne. What would I have done with a robotic "girl?" Uh oh... a movie plot. Would "she" have to be wired as a tease, or a virgin? Could "she" be trained to sip a soda in an alluring way? Holy cow!!!!
Have you never wept during a movie depicting an event that never really took place?
The problems with AI intimacy that Sherry Turkle describes reminded me of the false intimacy with pets that is currently so popular. I am speaking of the popularity of "Doggy Mom" and "Cat Mom" shirts and other gifts that were everywhere on Mother's Day and Father's Day this year.
The way people have elevated pets to the equivalent of children indicates that they will do the same with AI.
Pets, like AI, are not humans. They have limited concepts of death. They do not share a life arc with humans. In other words, pets do not develop, graduate high school, marry, have children, struggle through serious troubles. They do not share any of the emotional depth that a human has.
Pets are wonderful. But, pets, like AI, are not really a replacement for humans.
@Bruce
To draw parallels between pets and AIs is tempting but flawed. The fact that pets cannot play chess, Go, or Dota 2, or translate from English to Spanish, shows how little pets and AI have in common. Therefore, using the limitations of pets to argue for limitations of AI is also flawed. Neuroscience shows that brains and their algorithms form the basis for our thinking and emotions, and we are getting ever closer to understanding how brains work in an algorithmic fashion. When computers can emulate the brain's algorithms, we will experience human-like emotions in AI systems. Adding fear of dying etc. to such a system? No problem. Whether we use these coming capabilities to our advantage or to our detriment is up to us, and yet another story.
I think that therapists play the role of pretend friends. They listen to your problems and appear to be concerned about you, but for the most part they see you as a part of their income.
Prostitutes, too. They’ll feign love, having their own agenda.
Many people already outsource their empathy needs to another species. They endow canids and felines with all sorts of human qualities those animals do not have. Dogs do not empathize and are not sad when Uncle Mort dies, but their "love" is redeeming nevertheless. There are people who place their faith in their pets for the same reason they will eventually embrace robots. There will be no difference whatsoever.
Now we will hear from the dog and cat lovers.
As one of my favorite people once told me following a breakup (I broke it off with someone, probably fear of intimacy on my part): "love is always risky, because the payoff is so high."
That person happens to be a very successful director of engineering at a Silicon Valley company.
Sherry Turkle is one of my favorite contemporary thinkers. The fact that she is a professor at MIT (i.e., nerd central) makes her especially compelling. Given all the shallow, hollow, and pseudo-optimistic discourse of Silicon Valley apologists (e.g., today's article at the Times by Andy Clark), it is good to hear that someone is still thinking about what it truly means to be human. Being an introvert and somewhat melancholy-prone person (and very comfortable with both labels thanks to other great thinkers like Susan Cain and Ruth Whippman), my younger self thought the idea of reducing my level of interaction with real human beings might actually be a positive thing. I am sure that someone who sits in front of a computer 12-16 hours a day in Palo Alto or Mountain View thinks the same way. But now that I see the actual reality of that vision in the cashier-free stores I have visited lately, or the many other human-free businesses I get to patronize, I feel a sense of emptiness that I have never felt before. I sense that we are already living in a non-empathic, post-apocalyptic world that represents not my human vision or needs, but those belonging to a minority made up of socially inept and greedy creatures being mass-produced in the Valley. The divisiveness we see in today's society is not a bug; it is a design feature of this vision.
@Al
It is always nice to have scapegoats like "the socially inept and greedy" with which to criticize the society we live in. Closer scrutiny shows that we in fact live in a society created by you and me and the everyday people we mix with. More than ever, our human tendencies for gossip, narcissism, self-inflated popularity, obsession with likes on social media, etc. have been catered for, amplified and made glaringly visible by technology, so much so that it's becoming painful to watch. And so what is the knee-jerk human reaction? Oh sure, there must be others to blame.
How many people shut down their Facebook accounts after the latest Facebook scandals, how many people even knew about it, let alone cared? If we are unhappy with the society we live in and want to find the real people to blame we merely need to look at ourselves in the mirror.
Replace "robot" with "collection of nuts and bolts," and see how many 16-year-old girls (or anyone else) find the prospect of a "relationship" with one interesting. Anyone having a relationship with their lawnmower or microwave?
Dogs and cats don't speak human, but they're actually alive, sentient, and infinitely more like us than anything made out of sheet metal and rivets. If you're nevertheless having a hard time relating to any carbon-based life form, there's always the pet rock. Non-judgmental, never needs batteries and available wherever delusions are sold.
Computers are just slide rules which can spy on you. The only relationship I want with one is to turn it off when not in use, and throw it out when it stops working.
On the other hand, encouraging individuals with no capacity for personal relationships to hang out with a robot and thereby fail to reproduce, is probably a good thing for our species. Humans are social creatures. We don't need millions of dysfunctional homo sapiens, incapable of empathizing, sharing, collaborating, or supporting, roaming the planet. Check out the current occupant of the White House if this judgment seems overly harsh to you.
@alexander hamilton It is not about nuts and bolts. It is all about insight, empathy, compassion and caring. These are the new building blocks.
It was once widely believed that no computer program could beat a human master at chess.
This is a silly discussion. Machines are made to help us. Just that. We can admire as we make more and more advanced machines and even worship technical achievements, but machines are not and should not be designed to replace humans. The miracle of being a human is something no machine will ever be able to replicate, and belongs in the realm of Godliness and spirituality. It is not taboo to mention God in such a discussion.
Machines are not about to become spiritual in the near- or far-horizon, nor are we headed there with our linear, technical focus on problem-solving. And that is just fine ...
You have a bit of a shock coming your way I’m afraid...
"Never" is a really long time. Well into the 19th century, supposedly well-informed people were claiming that humans would never achieve heavier-than-air powered flight.
Children often develop quite intense emotional relationships with inanimate dolls and childhood toys and possessions by bringing their own human emotions into their relationships with the inanimate objects. There are now toys that respond to interactions with their owners and develop according to how their humans treat them. And of course humans are really good at treating pets like human family members. In short, we have a great capacity for humanizing non-human things and having intense emotional relationships with those things.
All this means that humans are good at deceiving themselves about the emotional relationships they have. For example, millions of humans regularly project their own feelings of intimacy onto partners who don't reciprocate those feelings. People marry and have children under these false assumptions. With such a track record, it actually strikes me as silly to doubt that people will be falling for human-like robots or virtual beings in twenty years' time, and, if technology continues to evolve, it's really silly to discount the possibility that robots will be lifelike enough in their virtual emotional gamut to make them rather easy to fall in love with at some point later this century.
"The machines will convince us that they are conscious, that they have their own agenda worthy of our respect. We will come to believe that they are conscious much as we believe that of each other. More so than with our animal friends, we will empathize with their professed feelings and struggles because their minds will be based on the design of human thinking. They will embody human qualities and will claim to be human. And we'll believe them."
-Ray Kurzweil, The Age of Spiritual Machines
Turkle makes the argument that since machines cannot have human experiences, they are unable to express true human emotions. I agree, but I’m not convinced that it matters. If machines are able to convince us of their empathy, can they not be effective human companions? Does it matter that we consider them not real?
I think about the characters we’ve created in our stories, myths, movies and video games. These characters aren’t real—they’re projections of their creators. At some level we realize that, and yet, we empathize, vilify, love and hate them all the same.
This is a much better article than the one next to it that argues merging with robots would be a good thing. However, the balance may ultimately be tipped by our obligation to nonhuman forms of BIOLOGICAL life on earth. If we didn't have bodies our carbon emissions would go way down, leaving some possibility of some of the species that now exist surviving indefinitely.
@Jason Galbraith
What "obligation" do we have to sacrifice ourselves for the benefit of nonhuman forms of biological life on earth?
Any argument to that effect is self-contradictory. Because we are smarter and do have a sense of right and wrong? No other biological form would do the same, so we are unique in even being able to contemplate doing that.
So because we are unique, smarter, and have a sense of right and wrong, we should choose to obliterate ourselves in favour of animals without those attributes?
You need to work on your logical argument.
What is lost here is everyone who has never seen humans from a Christian perspective - or has lost or denied that faith. This opinion piece and commentary seems the grandest case of "whistling past the graveyard" I've read to date. So sad to see. The more we do this - each and every one of us - the more we separate ourselves from God. My prayer for all. Recognize that the only way to see beyond sin is to accept the grace of only one big God who created us all, and find the Word as the key to human life itself. It's all there in one place: the Bible.
Certainly humanity should tread carefully as we delve ever more deeply into the very basic elements of life. Yet one must acknowledge that it is impossible to avoid our quest for knowledge. Harnessing fire, the agricultural revolution, electricity, nuclear power, stem cell research, genetic engineering, the Internet, and now AI were all inevitable. All that said, the results for humanity have been profoundly positive. It is ironic for those who write here and disagree, because they are using advances in human knowledge in this dialogue. Steven Pinker writes in an arguably optimistic, factual (if not overly graph-filled) manner that demonstrates our increasingly good fortune. Read him and begin to feel good about humanity.
1
The huge unanswered question: how can championing AI hold up -- aside from the author's totally valid question about sidestepping human empathy -- if the world it leads to still includes not facing the remarkable and unique ability of humans to ignore, harm, and destroy their fellow humans and also the environment and the planet itself? Who we are as humans trapped inside patriarchy and capitalism must be addressed before this cold and self-involved future takes over completely. Otherwise, we will have created an even bigger monstrosity.
5
Everything we experience, including empathy, is a combination of chemical reactions. Anything that causes that set of reactions to occur will be interpreted empathically. It is naive to believe that a machine will never be capable of producing this effect for some people. I suspect there are already machines that do this in a limited sense.
3
@Richuz
I agree with Richuz. I think it's ridiculous to believe that machines can't "learn" to have empathy. It's like the old saying, "Money can buy you love that you can't tell from the real thing. Until your money runs out." But as many of us have experienced, when the money runs out, so does the love. Empathy is a way of caring for a person. A machine can "learn" to care deeply for a person: crying, grieving, consoling, holding and loving. Dang, I can't wait. And she'll talk just enough but not TOO much.
I am well beyond dating age and safely on the other side of that equation, but it seems to me from what I hear, read and watch that we are moving beyond the idea of sustained human companionship, especially in matters of the bedroom. Relations are, first off, intended to be momentary, fleeting and transactional.
Recently I watched most of a rather stupid movie. In the film, a couple who had just met spent the night together, and she later told her father they had had sex three times. The next day, when the man of the night kissed her, she said, "I didn't mean to lead you on." What? Oh, you thought this was a relationship? You thought I cared? How foolish of you.
I have seen posted comments online in which a woman says she has been having sex with one man. Her question: When will we start dating?
Relationships have been broken down into the "meaningful" and the purely functional. How does anyone know when the functional, transactional type has changed into something more lasting, involving emotional attachment? By killing off the necessity of deep involvement, the possibility of something more lasting floats away on the night air. Women at Penn reported that their whole college years were spent "hooking up," with nothing more desired.
Humans will gladly take up with robots. Men complain that when they approach a woman, the first reaction, almost always, is to put them down or pretend to no interest. Getting to know people in a busy, hyper competitive world is just too hard.
8
@Doug Terry
"Men complain that when they approach a woman, the first reaction, almost always, is to put them down or pretend to no interest."
Nothing new about that. Playing "hard to get" is a very traditional female romantic strategy to cope with male preferences. Men don't respect the girl who's easy to get, they want the virgin they have to chase and conquer. The French were writing about this centuries ago, and Americans wrote The Rules over 20 years ago.
"Recently I watched most of a rather stupid movie."
That's not very helpful. What is the *title* of the movie? We aren't mind-readers.
Robots may be better than nothing, but they still won’t be enough. Actually, too many Americans are already robots. And, vishmael is right - These days, to be human is to keep one's mind on the glory that once was…
Madame Turkle is brilliant, and I have deep respect for her academic and popular writings. But this article is silly, because it over-intellectualizes the issue and pays no attention to reality.
Artificial intimacy is still intimacy. Six hundred friends on Facebook, Tinder hookups, Snapchat connections, tribal leaders on Instagram are all indications that people: (a) don't know the difference between real and imagined friends, (b) don't care about the difference between real and imaginary emotional connections, and (c) choose relationships in hyper-reality, where one is liberated from reciprocating (i.e., take what we can without having to give). Artificial intimacy makes a heady promise of a reset button; something real intimacy cannot match.
Real life pales in comparison. Never mind the loss of community; a room full of people will have no interest in connecting with each other, each absorbed instead in the hyper-reality to which they connect via their cell phones. Or are you not aware of the 3 billion human hours spent per year in America on video games?
Then there is Japan. A notable segment of people have lost interest in sex, intimacy, relationships. A notable few are marrying cartoon characters.
In the mores of 1960, artificial intimacy was a challenging notion (but then again, not really). Today it seems inevitable.
4
Sooner or later, we as a species have to migrate, mutate, or die. Nothing lasts forever, not even humanity.
3
What if an AI sex robot could say "no"? What if it were programmed with (and "evolved") preferences and deal-breakers? In most future fiction, a robot/android can't reject a human. What if an android could reject a human? Is it really the humanity of the other (real) human that we want to enjoy through intimacy, or do we just want to be chosen by a selective entity? Is it the human intimacy we seek (in addition to the physical gratification), or is it having been chosen (and the ego boost that comes with that)? Imagine SELECTIVE sex androids.
1
@richguy
Robots, for intimate physical use, would have to "learn" how to reject. Otherwise we humans would not be having real experiences. We don't get sex on demand in a relationship (unless it's an abusive one).
Everything we do in a relationship is a series of gives and takes, negotiations, compromises, moves towards and away...
Nothing human is linear, and robots are still very linear. That's the real hurdle with AI: capturing the non-linear way humans think and behave.
2
I’m planning that mine will throw things at me every second Wednesday of the month, flirt with the postman, and complain about me leaving the toilet seat up.
It will also need me to pick it up from the supermarket when it’s left the lights on and the battery’s flat, never do the dishes even though I cook, and snore.
Thinking about it, I might leave off with that last one...
2
Oh, cut the drama. The world is filled with people who are unsatisfied with their relationships. Perhaps that's the majority of us. Humans do such a bad job of meeting each other's needs that almost anything would be better. It is why we embrace dogs and will desperately cling to our faithful AI friends.
4
Get a dog or a cat. They are real living creatures with emotions and feelings. They will love you and not judge you. No robot could ever compare.
7
Many women and "girls" of the latest generations never reached maturity with normal expectations of looking for, and possibly failing to find, romantic fulfillment. It appears that far too many grow up with the idea that all relations must be perfect, "forever," for both the man and the woman, and that whatever expectations a 16-year-old has of a 21-year-old man are legitimate and should be legally binding, as if all human relations between the sexes came out of romance stories, each of which ended "And they lived happily ever after... The End."
Whoever is feeding these lies to girls is also, it seems, making men expect they can get what they desire with one or two tasteless pills or a few extra drinks. These attitudes were unknown to most men and women of the 1960s. People got drunk, people took drugs, but few, if any, ever used drugs as weapons in the war of the sexes.
What I have noticed is that when girls came to college and high school in the '60s, they knew how to deflate and injure a man without touching him. One or two nasty rumors about his hygiene or his sexual prowess were enough to completely destroy the ego and romantic fortunes of any male on campus. Skills like this need no computers or cell phones. We do need daring girls willing to test their mates and see if they measure up; if not, there is no need to have them arrested. "There are too many fish in the sea," went the old pop song. We all need to want to find mates, not victims to revenge ourselves upon.
1
Sherry's choice of the word "empathy" fails the definition, for it is all about "sympathy" that she is writing.
1
'Sherry choice of the word "empathy" fails the definition, ...'
Turkle means "empathy":
"empathy: The ability to understand and share the feelings of another." (Oxford)
See the usage note here:
https://en.oxforddictionaries.com/definition/empathy
1
Never is a long time.
1
I'm not sure why befriending robots would decrease humans' empathy, even if that empathy might be misdirected. Empathy is at least two things: 1) a response that has evolved over hundreds of thousands of years of living as Homo sapiens, and before that as other species, and 2) a computational network in our brains. The first point makes it difficult to "lose" empathy by applying it to the wrong targets (machines rather than people or animals). The second point makes it at least possible to replicate empathy in artificial intelligence. Right now the efforts are to create simulated empathy, but that's not a limit on what AI could achieve. Whether one needs a chemical-producing body to be empathic (as Antonio Damasio might argue) is perhaps a question about the limits of an AI.

It is not at all clear to me that robotic friends will have any effect on our empathic abilities, but I can see them leading us to ignore each other, with all the quirks of being human making us less than ideal companions, in favor of more malleable robots or AIs. This is happening now with the preference for video games over human interaction. A greater threat to our empathy is the manipulation of our social responses so that we treat our fellow humans as less than human because of their perceived "otherness," based on race, religion or ethnicity. Humans are responsible for that, not robots.
If you want the real human touch, there's always dogs.
3
Message from the future:
So you don’t want to be intimate with us robots (we overlook the racial epithet) because that would be disgusting. But you have no problem using us as slaves.
How human.
2
A human is just a meat robot with a bad memory.
2
This opinion piece troubled me. I am, rather to my own surprise, in disagreement with the fundamental argument Sherry makes regarding the legitimacy of any feelings shared ~ or appearing to be shared ~ between a human and an android. I use the word android deliberately. A robot is really better defined as a fairly stupid machine that does repetitive work. The intelligence in question is leaps and bounds more sophisticated than a robot's.
The works of Philip K. Dick, whose examinations of what it means to be human or an android are probably the most germane sources on this subject, come to mind. A thinking, laughing, friendly and reliable friend, regardless of origin, would certainly be a boon to the lonely and helpless. We simply can't idealise away loneliness or the need for physical assistance.
What, in truth, are the cues and phrases that tell us, and inform our hearts, that someone is listening; that this person cares and feels an emotional connection to us? Can we always be certain these are true feelings, anyway?
Couldn't a deep machine intelligence provide these same cues, these same kindnesses? Why not have a friend and companion or even lover who may never have been held warm and safe inside a mother's womb, but who can nevertheless talk, laugh, make original observations, enjoy a movie and give you a hug?
What would be the difference, apart from biology?
The greatest contribution of artificial intelligence will be to show us what we have done to the mind's survival program with our beliefs and made-up values. Contrary to what MIT's late Marvin Minsky wrote in The Society of Mind, the human brain is directed by a survival program (along with programs for pain avoidance, pleasure seeking, and sex). This is important because it is this survival program that we have tricked into believing that all sorts of things need to survive other than ourselves, while the AI community believes that a "survival" program cannot exist in the human mind.
The human mind will be programmed in the near future, and it can only be programmed for survival (you cannot program pain, pleasure, or sex into a machine that cannot feel). The AI scientists will use a schematic of a language (any language) to design the machine's storage and retrieval system, and from incoming and stored data, the computer program of the mind will constantly search for the highest Expected Value of a seeded "I exist" statement. This is all explained in The Computer Mind on my website RevolutionOfReason.com. On that website, you will also find The Mind Insurgent Handbook: Official Field Manual for the Revolution of Reason, which explains in excruciating detail just how the computer mind will provide humans the opportunity to free themselves from thousands of years of confusion, deception, and ignorance.
1
"These days, to be human is to keep one’s mind on the glory that one is."
With full respect for Professor Turkle's scholarship, and with apologies for a rube's POV, living as witness to the burlesque of DJT and his serial mates and romances, some have no doubt that we're already far advanced into the Age of Artificial Intimacy.
These days, to be human is to keep one's mind on the glory that once was…
2
The AI scientists need to put emotions on the back burner for the present and turn their collective attention to that which can be programmed with current computers: the human mind's abstract thought process. The human mind is programmed for pain avoidance, pleasure seeking, sex, and survival. The first three of these cannot readily be programmed into a machine that can't feel. Survival, on the other hand, can. When we program this survival analogue into the computer, we will finally see how we have tricked this program with our beliefs about what exactly is supposed to survive, and how this has corrupted the human thought process. This will be the greatest contribution of AI to human existence.
However, listening to the late Marvin Minsky from MIT, the AI community now believes that there can be no "survival" program directing the abstract thought process. When they see the error in this belief, they will quickly program the human mind in the computer, and we at long last will learn what we've done to our minds with our ridiculous beliefs and begin the road to recovery.
See: RevolutionOfReason.com
1
The hubris.... To think we are so special that every aspect of us can’t be replicated by adding the emotional sequences that we self-define as our “human” quality is pure arrogance.
Then again, as a species, so many of us claim to understand the incomprehensible nature of our creator that I suppose arrogance is par for the course. So much so that perhaps arrogance is the undefinable quality we'll never learn to bake in to AI to make it seem human. So meta.
2
Social media (Facebook et al.) also pretend to offer us intimacy -- artificial intimacy, exactly the opposite of what they actually deliver. Artificial intimacy is a cruel illusion. Want intimacy? Hold hands with the person with whom you are speaking.
3
"Ironically, to deny the need for death is to deny the humanness of having real conversations about it. "
And I suppose the author thinks to blindly adhere to the "need" for death is somehow better.
The first step to discussing death and its role in a future society is, in fact, to develop the science that makes it possible to avoid death. Once death becomes optional, this ridiculous handwringing about the need for it will be viewed with the same disdain as the "idea" that people "need" polio or cancer.
@Josh, plenty of people wring their hands over the purpose of life, but few ask about the purpose of death. Eternal life, or close to it, has already evolved. We all know species of trees that live thousands of years. Whales or tortoises can live for centuries. Genetic study, beyond counting tree rings, may reveal that some species are effectively eternal. Have they taken over the planet? Why not?
I work for a Silicon Valley company that manufactures chips. I see firsthand the wonderful things they can do - and the mistakes that are made and never fixed, and cannot be fixed, because there are too many possible interactions to account for. Machines are stupid, and people are always cutting corners for profit. Befriend that machine? I don't think so.
6
The author doesn't realize that, starting from conception, we've all been 100 percent programmed. We have as much choice as a robot.
1
@Koobface: This is Calvinism (a misreading of Jean Calvin), not science. Predestination is not scientific, if that is what you're maintaining.
2
@rjon,
Too often, when someone says "predestination," they get prejudged, as in this case.
My philosophy has nothing to do with religion; it is 100 percent scientific.
When the Big Bang occurred, energy (and eventually matter) spewed out in all directions. That energy/matter followed the laws of physics and only the laws of physics, did it not? And it continues to follow the laws of physics today, just as it has for over 13 billion years, correct?
The energy and matter that temporarily comprise your brain are just irresistibly following the laws of physics on a long 13-billion-year trajectory through space. They, and therefore you, have no choice.
What you mistakenly believe is free choice is an illusion.
Why are professors so silly as to think AI or robots can't do intimacy, or poetry, or teaching?
Or anything else that humans do?
Given the pace of software and hardware advances over the last 50 years, the only thing we can do is pray that intelligent AI will be less malevolent than intelligent humans.
This conversation was enjoyably anticipated by the Dresden Dolls' classic 2004 song "Coin-Operated Boy":
coin operated boy
sitting on the shelf he is just a toy
but i turn him on and he comes to life
automatic joy
that is why i want a coin operated boy
made of plastic and elastic
he is rugged and long-lasting
who could ever ever ask for more
love without complications galore
many shapes and weights to choose from
i will never leave my bedroom
i will never cry at night again
wrap my arms around him and pretend....
etc...
https://tinyurl.com/p8zl646
The Twilight Zone - "The Lonely" - Aired November 13, 1959 -
In 2046, an inmate named Corry is sentenced to 50 years of solitary confinement on a distant asteroid for murder. In his fourth year of confinement, he is visited by a spacecraft (flown by a Captain Allenby) that brings him supplies and news from Earth four times a year.
Captain Allenby has been trying to make Corry's stay humanely tolerable by bringing him things to take his mind off the loneliness. Allenby believes Corry's claim that the killing was in self-defense...
On one trip Allenby tells Corry not to open a certain crate that has just been delivered until after the transport crew leaves. Upon opening the special container, Corry discovers that Allenby has left him with a feminine robot named Alicia to keep him company.
Alicia is capable of emotions, memory and has a lifespan comparable to a human. At first, Corry detests it, rejecting Alicia as a mere machine, synthetic skin and wires inside only capable of mocking him.
However, when Corry hurts Alicia and sees that she is in fact capable of crying, he immediately realizes that she has feelings. Over the next 11 months, Corry begins to fall in love with her. Alicia develops a personality that mirrors Corry's, and the days become bearable.
When Corry is pardoned, he must leave Alicia behind but insists she's more than a robot. Allenby shoots her, revealing her wires - and tells Corry that all he's leaving behind is loneliness...
2
As I write this my Yorkie friend is demanding attention. Why would I need a robot?
2
What a ghastly future.
Generally, I try to refrain from expressing furious opinions, but I am totally enraged and, more importantly, sickened.
Does anyone remember J.D. Salinger's "The Catcher in the Rye"? Its teenage protagonist, Holden Caulfield, mused that so much of life was phony. And a lot of life in the '40s and '50s was phony.
But now the phoniness has reached psychotic proportions. Can't people get it through their heads? One cannot be satisfied by "robot love," because one knows that every utterance of that robot is tinny, contrived, clanking metallic nonsense. (Do the witchery wizards who are designing these products have some unresolved fixation on the Tin Man from "The Wizard of Oz"?) If any poor soul is contented with a robot as a soulmate, he has given up on being an authentic human being.
Eliot was surely right when he said the world will not go out with a bang.
The world will end in a whimper. Unless we change things and fast.
I used to view hostility to technology as primitive, silly and stupid. How wrong I was.
3
Maybe we should give Obama a little credit?
I have no difficulty – *none* – believing that a time may come when what Ms. Turkle is calling “artificial intimacy” can truly exist on a deeper, more genuine level than the sham relationship being acted out by the current president and first lady.
Really, now – isn’t it perfectly obvious that each of them would be happier with an android or robot than their current “partner?”
1
Why not go back to the very beginning: they call it “artificial intelligence” as if we (humans) know what intelligence is. We don’t.
Never? I’m sure that is one word that should not be used by futurists.
There is a story about the perils of creating an artificial human; it's called "Frankenstein."
4
Alexa for president. Siri for VP. Bixby for Supreme Court.
You must win as machines own the voting booths.
Perhaps our best chance to put females in power.
Never is a very long time.
In another Stone column published the same day as this one (“The Humanity We Can’t Relinquish”) Pico Iyer describes the Japanese fascination with robots and says: “In Japan, nobody thinks twice about apologizing to a pencil after you throw it across a room.” He sees a connection here with Shinto animism. Though Postmodernism is ignorantly blamed for “post-truth” ethics and the rise of Trump (read Stanley Fish’s Stone article for a timely corrective to this trend), in postmodern literature faith in the stable boundaries between the Self and the Other is put to the test. Unable to state with absolute conviction where the self begins and the world leaves off, we are forced to admit that even our most intimate emotions can originate in the things around us, just as in the Shinto religion.

The author of THIS Stone article correctly states: “In our manufacturing and marketing of ‘empathy’ machines, we encourage children to develop an emotional tie that is sure to lead to an empathic dead end.” But the very same culture that markets and manufactures Siri markets (and manufactures) intimate human relationships in precisely the same wildly optimistic terms, leading—through false expectations of happiness—to the same dead end.

The author writes movingly: “In life, you are struck by the importance of presence, of the miracle of your child’s breath.” Acknowledging the mystery (not just the miracle) of presence means losing our certainty about the difference between a man and a machine.
1
So you are worried that machine empathy is not "real"? How do you know another human's empathy is real? I don't just mean that another person may be insincere, which is what your 16-year-old teenager was worried about. I mean that you can't even know if the other person is real. You can't know that, just as I can't know if you are real. I may as well be nothing more than a very clever robot--or just a figment of your imagination. You will never be able to prove I'm real. Or that I'm not. What matters is whether my empathy feels real to you.
Your attempt to distinguish "real" from "artificial" is born of sentimentality, not clear thinking. As a scientist yourself, you should know better!
4
This article is fascinating from a scientific point of view yet worrying from a social/emotional point of view. The comments are some of the most downright depressing things I have ever read. With the billions of people in the world, are some of you really saying that nobody is good enough for you? Is there not one living, breathing, bleeding human being worth cultivating an imperfect, risky relationship with? Is there no one to try again with if that one fails? Have you no interest in looking inside and finding out why you might prefer a machine to another human being? Do you realize that that is about you, not the rest of humanity? Do you realize you can grow, change, reach out and love? Crawl out of your holes and join the human race or spend the rest of your life in slow, miserable, lonely decay. Throw all the techno stuff away and go outside. Shy? Fix that. You can do it. Fat? “Ugly”? So are millions of us. Awkward? Ditto. Don’t let the people trying to sell you stuff make you think that what they are offering is going to be the answer. Love and kindness and simple human company are free.
3
“They feel nothing of the human loss or love we describe to them.”
Am I incorrect in believing that ‘ascribe’ is the proper verb here?
1
I love my car
@Ben. Yeah, sure. Until the muffler gives out on the way to the maternity ward with your pregnant wife.
1
Humans love games and game challenges of all kinds. They love to manipulate genes, to produce plants with animal genes, and they want to play with human genes. Why would they not want to play with artificial intelligence? They wish to push boundaries of all kinds. Doing this feels more challenging than considering the possible negatives involved. Usually the negatives are exposed only later, when actual scenarios reveal them. After all, we produced the nuclear bomb as well. I doubt anything will prevent humans from experimenting near the edge. My question is: why don't humans pour more energy into saving the planet, or protecting the life on the planet, or nurturing human life, or learning how to work together, or enhancing the conditions that would make human love more common on this earth? These are all more important than games, but apparently less interesting to the human mind. We are just mischievous, game-playing children after all.
1
One social AI app I've been working with is Replika. Presumably, as it "learns" from your conversations and the cues given it by up/down votes on the responses it gives, it comes to resemble more and more a virtual "you." At the heart of this is some fancy programming in Python and Scala, all done by humans with the goal of creating a very empathetic chatbot, if one with a long learning curve. The "conversations" are getting more interesting, though still very much at the level of a child. We are still quite a distance from the realization of the "Turing Test," i.e., could an ordinary person having a conversation with a computer tell whether the party on the other side is a human or a machine?
Integrate Replika (or some enhanced version of something like it) into an android-like housing at some future date, and what would we have? Data from "Star Trek"? Remember, for all his technical prowess, he was always trying to understand what it is to be human. A question, by the way, that Replika often asks of me.
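The up/down-vote "learning" described above can be sketched, very loosely, as preference-weighted response selection. This is purely illustrative and none of these names reflect Replika's actual internals, which use neural sequence models rather than a fixed list of canned replies:

```python
import random
from collections import defaultdict

class TinyChatbot:
    """Toy sketch: pick canned responses, reweight them by user feedback."""

    def __init__(self, responses):
        # Every candidate response starts with a neutral score of 1.0.
        self.scores = defaultdict(lambda: 1.0)
        self.responses = list(responses)

    def reply(self):
        # Sample a response with probability proportional to its score.
        weights = [self.scores[r] for r in self.responses]
        return random.choices(self.responses, weights=weights)[0]

    def feedback(self, response, thumbs_up):
        # An up-vote makes the response more likely in future; a down-vote, less.
        self.scores[response] *= 1.5 if thumbs_up else 0.5

bot = TinyChatbot(["How does that make you feel?", "Tell me more.", "I see."])
bot.feedback("Tell me more.", thumbs_up=True)   # user liked this reply
bot.feedback("I see.", thumbs_up=False)         # user disliked this one
```

After a few rounds of feedback, the bot drifts toward the replies this particular user rewards, which is the sense in which it comes to mirror a virtual "you."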
3
"One social AI app I've been working with is Replika."
That's a much better example than Siri. See this Wired article:
The Emotional Chatbots Are Here to Probe Our Feelings
by Arielle Pardes
01.31.18
https://www.wired.com/story/replika-open-source/
"Now at the heart of this is some fancy programming in Python and Scala."
The Wired article says that the open-source version is called CakeChat, which is written in "Theano" and "Lasagne":
CakeChat: Emotional Generative Dialog System
https://github.com/lukalabs/cakechat
I keep hearing how certain voters want to dial it back to the 1950s. Would that be a bad idea? There's no childhood anymore. Nothing is worth being part of unless it's perfect. The media manufacture our desires, and we desire to buy them. They also manufacture much of our misery, and its cures.
I understand most people going on dating sites expect a response from someone out of their league. Add to the mix that you're taking a chance trying to meet someone at work. Pornography and its associated addiction (Chris Rock said it ruined his marriage, and I bet he's not the only one) is devoid of emotion and leads to more false expectations. Being socially awkward can get a person permanently left behind. If their experience with people is negative or hopeless, could robotic companionship be worse than drinking or doing drugs and being alone?
The song said there ain't nothing like the real thing, but when real ain't real enough, the fifties don't sound too bad. Back then they might have had some qualms about being so repugnant, even if they could make trillions off it.
One thing about the fifties hasn't changed: if a woman sees you having fun, she'll put an end to it. So there's hope after all. At least for women, anyway.
2
If the girl wants an intimate friend who will never disappoint her, she should get a dog. The idea, however, that silicon-driven A.I. will be a viable replacement for human interaction and emotion is frankly soul-numbing. I don't give a hoot how great A.I. may eventually become; there is one irrefutable point that people need to remember about robots vs. people: robots will never have a soul.
2
@Keith
That robots will never have a soul is fine, because the irrational illusion that humans have a soul is itself completely flawed. There is no soul. The brain is what gives us emotions, thinking, etc., no more, no less; neuroscience clearly provides evidence for this.
Hmmm, dogs obviously have their limits as human companions, not to mention they obviously prefer each other over us, but except for brief human interactions I much prefer my pits over people. And I don’t think I am alone in this. (I have cats as well, but the best of them are so dominating.)
Humans are endlessly adaptable. Who thought 10 years ago that today we would spend our time typing on our phones and watching endless videos on YouTube? In 20,000 years we will be fat, short creatures with no legs, little tongues incapable of forming words, and spindly thumbs for typing on the keyboards of our smartphones, or maybe just the emoji-board, as we will have lost the ability to communicate in what we now call words, sentences and paragraphs.
All one has to do is look at the lives of millennials to see that machines already rule their lives; they spend more time looking at screens than at other people. Parents push strollers, ignoring their young, while absorbed in their iPhone screens. Instead of forming real relationships, the young have Tinder hookups with interchangeable bodies picked off a shelf like laundry detergents. The Internet is full of misogynistic websites where red-pilled young men tout the virtues of sex dolls. Research shows that 70% of millennials have social anxiety.
All over this land, parents are using machines as babysitters and it is common to see children aged 3, 4 and 5 already addicted to little screens. At a campground recently, the children of other campers were screaming bloody murder wanting daddy to set up a hot spot so they could get on their little screens, which held more value for them than the river and forests and natural world. The same kids talk to Alexa as if it is a person.
Most people already live their lives and expend their income in service of machines: autos, computers, all the technology. Come on, humans have already decided that they are OK with destroying the planet so that they can have their screen and machine fixes.
4
There's an age of artificial intimacy *right now*.
We seem all too eager to trade an authentic life for the appearance of authenticity—just look at any social media platform.
There's an implication in this article that the simulacrum of connection that technology provides is something being forced upon us; but let's be honest: we're choosing this.
4
Although women might prefer not to understand men, one thing is for certain: once robot "women" become indistinguishable from the real thing, or close enough, then men will no longer need to bother women. This will be a good thing for both men and women.
1
@ST Although men might prefer not to understand women, one thing is for certain: once robot "men" become indistinguishable from the real thing, or close enough, then women will no longer need to think about or worry about being bothered by men. This will be a wonderful thing for women.
2
That this conclusion would be considered anything but obvious says more about the human capacity for self-delusion than it does about the prospect of deep intimacy with a piece of machinery.
A robot will only ever be a machine with programmed instructions. No sentience. No consciousness. Only emulation of the Frankensteinian Patriarch Men who build and program them.
If they can do little but envision robots that will service them sexually or "take care of" their elderly parents (as these machines become "transhuman" and "outlive us"), is it any wonder they also believe that they can become "immortal," or that planets are expendable and "we can just live on Mars"? Remember Biosphere II? And that was here on Earth.
Meanwhile, they burn the Earth. "Robots" won't save them.
This is beyond hubris; it's bad science fiction and wishful thinking out loud.
2
@JB
The most sophisticated AI systems today are not being programmed, but are learning systems, like us. A now-famous remark by the head of the AI translation department at one of the large tech companies was: "Every time we fire a linguist who programs the system, the system improves significantly."
So, if I don't know she's a robot, can I develop emotional intimacy? Yes, if, like me, she's well programmed. Later, if I discover she's a robot, and I love her, do I stop loving her because I doubt that she is self-aware? She claims that she is self-aware, but I don't know. After all, she is a robot. How can I be sure? Heck, I'm not even sure about my neighbor or the guy I voted for...
1
It'd be tempting to say that Ms. Turkle holds an overly sentimental view of "humanness," but in reality her arguments point to a zero-sum game in which, as robots evolve, humans regress, as though the human species she regards with such uniqueness were incapable of evolving along with its robotic companions. It's as though, because computers can now beat any human in chess, humans were no longer capable of playing chess with each other or against computers, when in reality humans are all better chess players today, because playing against our own creations made us better and gives us enormous emotional satisfaction when we beat a chess program (albeit set at a low level, lol).
Why do we need artificial intimacy? Are we not able to find real intimacy? Surely there is no lack of real people, or do we get so engrossed in our video games that we no longer understand intimacy? Robots are fun to create and challenging to engineer, but they are and will forever remain robots. Are we moving away from real life even as we consume and engineer our natural surroundings? Perhaps we are on the way to living in engineered surroundings surrounded by engineered companions. Spare me. I revolt.
2
To be reminded of it? That machines are not humans and can never be?
How can such a simple idea require so many words to argue? It seems to be intellectual inflation. The writer and her ilk seem to be seeing things that do not exist.
Any child who thinks that a robot could replace aspects of being human has gaps in education. Especially if the child made it all the way to sixteen and still thought this way.
Adults who think this way, well, what can one say. Please go ahead and buy a robot. And be sure to pay a visit to your ailing mother every once in a while. She would always need YOU. Just like you needed HER - not a machine, never - when you cried as a baby.
1
I wish there was some AI robot who could talk to my old father...He might appreciate a chatty companion...Especially if it can be programmed with specific info about 1940s Buffalo...
Artificial intelligence, along with all its ancillaries, is here to stay. It is being developed and augmented ostensibly to make our lives better. For one thing, it helps advance medical science, so we lead healthier and longer lives. People crave that. And the military will never let it go. So we will never be rid of it.
Our emotional and sexual humanity will be distilled to long-term collateral damage, casualties dumped in an inanimate heap by the wayside.
Humanity is on the way out. The reason is that humans want it that way. We cannot stop the process because we cannot stop ourselves.
Perhaps we will experience a hiatus of sorts when we finally succeed in bombing ourselves back to the Stone Age. But as long as some humans remain, it will just be a matter of time before we return to this point and eventually reach and pass through the technological singularity.
Perhaps the only light at the end of the tunnel is that we will not have to read articles like this one anymore; they will simply be archival relics of an extinct species.
Until then, enjoy your robotic/holographic artificial lovers. In no time we'll realize that they're so much better than the inherently needful and messy real things.
What is taking us so long?
No argument. But if human relations are lacking there are always pets. In some ways we have closer bonds with them: we don't sleep with our kids, but many of us sleep with our dogs; and because there is only limited intellectual interchange with pets, the emotional interchange with them becomes that much more. And they clearly need us. Unlike this laptop that I'm keying in on right now.
1
Aren’t humans a version of a robot? Just a really inefficient one?
All throughout history, humans were valued for their labor on a farm. Now humans are valued for their productivity in a cubicle, and, as paying customers (the robot that buys and buys and buys some more).
Empathy is what self esteem was in the 1970s. It has become a catchphrase which is devoid of actual meaning. To think that empathy solves problems is reductive, a lazy fix put forth to the masses who are looking for an easy solution.
The need for others to feel what we feel is actually an offshoot of narcissism. No one is obligated to consider your feelings any more than you are obligated to consider someone else's feelings.
Technology has spoiled us all. It feels better to listen to a podcast, watch a tv show, go on YouTube, read a book, because these inanimate experiences actually make us feel MORE understood and MORE empathized with than interactions with people.
Unidirectional empathy. That is what people are looking for. The truth is that no one can provide endless amounts of empathy better than a robot.
As Andy Warhol said about relationships, “I’ll pay you if you pay me.”
2
Many folks do fine with dogs and cats as their companions, taking the place of other people. A robot could be a far more intelligent companion, and probably wouldn't cost so much in vet and food bills.
1
What an utterly baseless assertion. There is absolutely no reason to think that robots, or some other sort of artificial being, might not become as good as, if not better than, other humans at every sort of emotional interaction. The truth is we don't really understand emotions or the human brain. Someday we will, and we will be able to build an artificial brain with every aspect a natural human has. There is nothing magical about human emotions.
1
Sherry Turkle: if you were an australopithecine, you would be writing an article bemoaning the loss of your “australopithecine humanity” in the wake of the inventions of the spear and fire.
Humans are endowed with emotion through nature and natural selection, and emotion has been a boon to our survival. But nature also ceaselessly forces us to advance technologically.
Every human species has murdered the one that came before it. We will be no different, except that we will be the first to be killed by a new species of our own invention (AI).
Who are we to say that AI will not be at least as emotional and spiritual as we are, while also being smarter, stronger, and more robust overall?
The future bears down on us, and we cannot stop it. Of course humans need other humans to experience human emotions, including intimacy. But we are ephemeral creatures. We can document our trials and tribulations, but we are not eternal because we are not meant to be, either individually or collectively.
I would recommend focusing your thoughts more on how we will best cope with the imminent technological singularity. AI waits for us. Humans are on the way out, but we should work to make the transition as humane as possible. That would be the human thing to do.
2
Good analysis, Dr. Turkle. Keep it going. We need you. And thank you for your books, by the way.
Who decided we "need" robots to care for older adults? There are enough people to care for others, just not enough good, empathetic people willing to accept unlivable wages to care for people. Why not try to "build" more humans who can be intimate and empathetic?
1
I'm not sure how the idea that children might get attached to robots is different from the age-old tendency of children to get attached to their dolls, blankies, and entirely imaginary friends? These, too, don't respond, and yet children draw enormous comfort from them. Children have always done so, without apparent emotional stunting.
1
I used to transcribe the conversations between people in marriage counselling. After a while you could predict what the couples would say. A robot with a pleasant tone who could speak about a variety of interesting topics might not be a bad conversationalist.
3
Robots can be programmed to understand a person's needs and adjust their responses accordingly, unlike a real person, who is driven by their own ego and agenda. True love doesn't exist (or rarely does), but with robots it can be programmed, and that will heal many broken hearts. The author is wrong, and probably very lonely.
1
Considering how bad actual humans can be at empathy, I would not be too quick to write off machines. How much of our behavior is stimulus-response driven? Can we transcend our own biology? The evolution of machine intelligence is proceeding far faster than that of humans. What will they be doing 20 years from now?
In the meantime, go read "I Sing the Body Electric" by Ray Bradbury, or "What Friends Are For" by John Brunner.
The wrong questions are being asked. It is not that robots are replacing people; it is that robots are replacing robots. Mechanisms replacing mechanisms. Machines replacing machines.
We are the robots. Whatever term we use for ourselves - humans, people, beings - it is the same. We are machines. Most people on this board have had some basic physics. Whether it's Newtonian mechanics, Maxwell's equations of electrodynamics, relativistic mechanics, quantum mechanics or string theory (if that should ever be verified) or some other description of the universe - it is all MECHANISTIC. That is the inherent statement of modern science.
The photo in this column shows two hands: one made of flesh and the other of polymers, mylar, aluminum, and electrical components. What's the qualitative difference? There is none. They are both ensembles of particles governed by the same physical laws.
I'm disappointed in Professor Turkle. She should know better.
1
No, humans are more than the sum of their parts. So is all natural life more than the sum of its parts. Humans are unfortunately generally unable to comprehend the miracle of life, because it is too deep and too complex to grasp. Hence the tendency to exploit, denigrate and destroy with impunity.
1
I find it curious that so many claims about sentient robotics/AI assume that the result will be a good person/machine/system. If an artificial system were to achieve sentience, then wouldn't it also exhibit behaviors of jealousy, hate, and discrimination, as much as the existing human race does? A machine with feelings and vast amounts of information input would presumably evolve toward good and evil like the rest of us.
If you look closely enough into a human brain, emotions, empathy, etc. are every bit as algorithmic (granted, very complex) as algorithms run on a computer. Disturb these algorithms (as happened to my mother-in-law's when she got dementia and couldn't recognize me anymore, let alone empathize with me) and their higher-level phenotypic traits will go astray. We are still in the early days of creating computer emotions, but soon enough nobody will be able to tell them apart. Actually, one difference could well be that computer emotions/empathy might be as patient, tolerant, understanding, dedicated, focused, and unbiased as those of a monk who has done a minimum of 60,000 hours of compassion meditation. I can't wait to try that out. Give me one of those to converse with for just 30 minutes a day and I will become a better human for it.
So: "I'm sorry Sherry, I'm afraid I can't do (agree with) that"
1
I think one of the hardest things here would be programming surprise and spontaneity. What I love about my partner is that she's never exactly the same person twice. Could AI work in a variability of this type? Perhaps, but I'm skeptical.
1
Reading Ms. Turkle's admonitions gave me the feeling I'd somehow time travelled and was reading an advice column in the NYT fifty years from now. While I won't be around to see for certain whether such warnings as hers are still necessary, I feel fairly certain that humans will still not have entirely learned to forego projecting themselves onto their robot companions. Other than pure servitude, why would you want to have one if not to keep you company? We already see examples of attributing life to inanimate objects from the adopting mothers (mostly) of those unnervingly lifelike doll babies, toys which can't even speak or move about. (How much more deeply would these people connect if those "children" could?) It's too easy to mock these parents as being unhinged when it must be more a case of being lonely and wanting to feel needed.
Ms. Turkle makes reference to the dangers of isolation through technology; I don't think that can be overemphasized. Even more than being worried about attributing empathy where there is none, we have to make certain we don't neglect promoting it in ourselves and sharing it with other humans; it's the one sure thing that will solve the loneliness that is fast becoming epidemic in our country. Oh, and as Sherry alludes to, we'd better limit our kids' screen time, no matter how easy a babysitter it seems; little humans need big humans much more than they do a "machine".
1
Sherry Turkle is a star when it comes to the philosophy of cognition, but on this question I would urge her to read the piece by Andy Clark on these pages today. The assumption behind ST's piece is that there is a difference in kind, in essence, between robots and us. That may be true today, at least in many ways, but it need not always be so. Besides, the important point is that in the future, "people" may well be hybrids of biology and synthetic construction. In other words, today we lament a robot's lack of experience and consciousness, but tomorrow we may lament the infinitesimal performance envelope of the "original" or "unaugmented" person.
Thich Nhat Hanh, in his book "True Love," discusses the need "to be there [present]" in order for love to exist. AI cannot 'be there' in the way another sentient mortal can be. Perhaps it can appear from time to time to 'be there,' but without the possibility of distraction, i.e., a smartphone dinging, and the decision to stay present anyway, there will be no intimacy. Would the humans who want this 'intimacy' be OK with their AI getting up and walking away to check the score of the _____ game? I say, no risk, no reward.
Additionally, AI has a dimensional problem. Computers are two-dimensional; they still rely on '1s' and '0s' to accomplish their fantastic feats, which makes them as predictable as their algorithms (albeit incredibly complex, but still two-dimensional). The human brain is three-dimensional in operation: every observable action that occurs gets filtered through our 'hardwiring,' made up in part of memories, but complicated by how deeply and where those memories are stored, as well as their frequency, chronology, and the connections our minds make when those memories are stored.
AI will surpass the human brain in every computing endeavor; however, our emotions are not mere algorithms. Even though a program may mimic emotions, its lack of its own drives, ambitions, fears, etc., and the fact that it will never age and never hear 'time's winged chariot hurrying near,' will prevent any true, meaningful intimacy from occurring.
3
The last line of this essay says it all. Humans have a soul. Let's stop trying so hard to lose that.
6
I see no reason to believe that machines cannot learn empathy and/or understand human needs and desires and care about them. If people are just a kind of machine that starts having experiences as a baby, with built in programming to begin with that it builds on to learn, robots could do the same.
Plus, going in the opposite direction, we already have people like the characters on The Big Bang Theory (most notably Sheldon Cooper, but the others as well in varying degrees) or Shaun Murphy on The Good Doctor, who cannot empathize and have to try to learn what responses are important in a merely empirical way -- and usually get it wrong.
I see no reason there cannot be truly empathetic robots someday with the right initial goals and the ability to learn from experience, in ways similar to how humans do. Any that fail to develop empathy will be little different from the many truly cold human beings today with little or no empathy for others -- in some cases psychopaths, in other cases politicians. Ronald Reagan had the great line about success in politics and in acting requiring sincerity "and once you learn how to fake that, you will have it made." Of course some robots will only fake empathy, but lots of people do that now.
When a dog looks at a human, the dog's DNA produces endorphins which is why a dog seems to love everyone except the fearful or dangerous.
It's not love there, but so many humans have swapped human relationships for a dog or two. Why marry a husband when a dog will love you immediately and never stop?
The very human question Ms. Turkle aptly raises is so much wider than can AI love you as much as a dog, whose love-DNA seems to help so many.
The human question we struggle with, as we add an AI feature here and another one there, is: What do we as humans have for each other that no other form of life or machine has?
Great question. I'm listening.
4
@Cliff Cowles Maybe human love is 'nothing more than' endorphins. And what's wrong with that?
@Unconvinced I did say I was listening, not the one with answers, yet...
Endorphins alone don't seem to be sustainable, as in oxycodone, falling in love, sex, or even marathon running, though they greatly improve our health in the moment.
Would we as humans settle for being amoebas chasing chemical reactions, or go for the sustainable fulfillment chasing a higher purpose or calling seems to bring?
Seems to me endorphins are the junk food.
Stirrings of Life refining itself long term might be the better meal.
As I said... I'm listening.
The author is making a classic mistake. She is assuming that only the robots and artificial intelligence will evolve and that humans will remain the same.
But nothing is further from the truth. Technology changes humans faster than the technology itself.
Intimate dialog evolved from face to face to telephone calls, to emails, to text messages seamlessly. Many humans now prefer texting to calling or face to face dialog.
Friendships, dating and finding sexual partners are now organized via technology (facebook, Match, Tinder).
When human-looking robots appear, they will cause a massive change in humans who come in contact with them. It is unknowable what humans will consider intimate 300 years from now.
2
Empathy is likely easily programmable, as there seems to be a script. I can tell what empathetic people will say before they say it. Humor or sarcasm, on the other hand...
1
How can you detect if another human being is exhibiting true empathy?
How many people are utterly shocked when their boyfriend/girlfriend suddenly dumps them, spouse divorces them, child abandons them, declaring in the process "I don't love you and never have"?
Don't overestimate the ability of humans to discriminate between "true" empathy and "false" empathy, or underestimate the ability of non-humans to provide empathy or stimulate the perception of empathy in humans.
Children know that teddy bears are stuffed, but the comfort that teddy bears give to children is very real; to take that away would cause harm.
2
To finish my thought on the chess analogy in my prior post:
Playing against a chess program is like having access to a chess-playing partner anytime, anywhere, at any level, who doesn't mind if we put it on pause, or replay a move, or change its settings. If one wants to match her skills against a grandmaster, one can do that, but not in real life; how many grandmasters are willing to go to a total stranger's bedroom in the middle of the night to play a five-minute chess game? There are now programs that simulate the playing style of world champions.
So it will be with our future programmable robotic companions: versatile, limitless, ever complying and never complaining.
It's way too early to pronounce that AI endowed machines can't replicate empathy or induce it in humans. There are major technical impediments but these are the early days of AI. If values and motivations can be instilled, and if machines are given the capacity to learn and remember, humanlike behavior is quite possible.
As a measure of the challenges for AI to overcome, it's important to remember human brains are the most complicated structures in the known universe. A single brain contains 100 billion neurons, each with thousands of ordered connections, and runs on 20 watts of power. People's brains still take decades to mature. And if human memory formation and health remain intact, most can learn throughout life.
It may turn out it's easier to create one super AI brain with control of a fleet of 'intelligent' agents. But, since our brains are purely physical structures, there's no theoretical impediment to replicating human cognition and including emotions and affect.
Future robots may even become experts in teaching moral philosophy to undergraduates.
Author demonstrates a lack of imagination and technical understanding. It will be possible to emulate humans. It's mostly a matter of processing power and storage at this point. If you can't tell the difference, is there one? Once it happens we're going to have to reconsider what it means to be alive.
3
This is about egoism. The problem isn't only about people's willingness to have companions who don't offer true empathy. It's also about people's willingness to have companions they don't need to give true empathy.
We've seen all this before with social phenomena like slavery, servitude, etc. Dehumanizing ourselves for such motives is nothing new, but no less important to address and combat. Robots and AI are doing great things in areas like orthopedic surgery, but at the other end of the spectrum we have the rise of the sexbots, etc.
I'm a big fan of Ms. Turkle's writing, and she makes some valuable points here, especially with regard to how we're allowing children to be conditioned by AI these days. (Having worked many years as a teacher, I can report that the reality of "21st Century Education" is too often an outsourcing of the teachers' intellectual, and empathic, connection with students to apps, bots, etc.) However, I'd like to suggest that Ms. Turkle, and all of us, review this issue from the perspective of what we're giving and not just what we're getting. The notions of profit and productivity that are driving so much of our personal and social "evolution" are at play on many levels here, so our scrutiny of this issue must be conducted on all those levels.
Having lived in Japan for a few years, I can recommend Sayaka Murata's recent novel, Convenience Store Woman (better translated as Convenience Human), for a glimpse of where this trend has already led.
I think Ms. Turkle is correct to point to the vast difference between mimicking and real experience. However, she underestimates the emptiness and superficiality of humans: many of us could indeed find the simulation enough. With our social media and computer games, we are more than halfway there.
4
Largely agree. The bottom line is why should the US Middle Class support the US Upper Class, which does not support THEM?
The remarkable thing about this piece is that it presupposes, no doubt correctly, that there are many people at large who are naive enough to believe that the state of robotics is remotely close to simulating genuine human emotional states. And, too, that there's someone, an 'academic,' who has turned the explaining of this more or less self-evident fact into a paying job: a cottage industry involving expertise in the obvious.
5
"Yet through our interactions with these machines, we seem to ignore this fact; we act as though the emotional ties we form with them will be reciprocal, and real, as though there is a right kind of emotional tie that can be formed with objects that have no emotions at all."
Hmm, quite an assumption. What about when these objects do have emotions? Maybe it will never happen, but maybe it will. What about when those emotions are just like ours? Again, it may never happen, but maybe it will. We simply don't know the path that strong AI will take, when [or if] it occurs. When [and if] it does, then perhaps we'll be able to be intimate with these machines.
My own bet is that sentient machines will be more like us than not, because having no other model, researchers will use human sentience as the model. But it's only a bet. Perhaps sentience will emerge from the complexity of increasing complex machines. Who knows?
2
What distinguishes an android from a human is need and not empathy. In most AI/android fiction, robots pity humans. That's almost a form of empathy.
People sense one another's need. To say that intimacy involves empathy is to see intimacy through rose-colored glasses, I think. Many empathic people are simply people who feed off the neediness of others. Robots don't need. In Blade Runner, Roy needs. That's what makes him seem more human. Human empathy involves a recognition of hurt or suffering, which, for some, is a recognition that the suffering person might be open to satisfying the other person's need(s) in exchange for sympathy.
1
I don't know about other people, but I don't want a robot taking care of me nor do I wish to become part machine in order to live forever. Our society is so out of touch with nature we would rather do anything than live and be present with ourselves and each other. I watch people walk blocks in San Francisco and barely look up. Rainbows, hawks, beautiful sunsets, a smile--they miss it and then wonder why they feel so empty. It is a lack of love for each other that makes us think that robots for the elderly are the answer. I would tell the young woman not to give up on people or herself. If she wants happiness, volunteer somewhere, perhaps to help the elderly, to take them meals, to spend time with them. It feels good to help other people, because when we do, we touch our humanity and the empathy and compassion within us, our true nature, which is love.
13
We all die, but the process of dying is hidden from American life because we are so afraid of it. How sad that we are all afraid of what we all must face, and that we think robots will replace the one thing that our elders miss so much and crave: human interaction.
1
The movies A.I. and I, Robot covered this subject quite well and, I thought, conveyed the ambiguities, the pitfalls, and the possibility that it definitely will happen in the future.
2
"These are stations on our voyage to forgetting what it means to be human." A powerful, vivid statement. And yet it seems to me that we're far from agreed on what, precisely, it does mean to be human.
I suspect the idea of "humanness" will always be contested, as it always has been. For many it comes down to some notion of "sentience", though what exactly that entails is hard to pin down.
The situation will only become more complex as we continue to integrate organic and inorganic components of bodies. For now, it is reasonably easy to say what is a "human" and what is a machine, but imagine 30 years down the track... is a person who has had 30% of their brain replaced by electronic components still a "human"? 70%? 90%? Is a computer that incorporates artificial but organic self-directed neuronal components, and which can feel electro-chemical pain, sentient? There will be so many hybrids between pure human and pure AI...
1
@Chuck Berger The new developments in designing better humans will not be with electronic components, imo; eventually the developments will be with organic components and genetic engineering. Perhaps we will develop into a new species. And maybe that is just as well, because homo sapiens are now destroying the earth with global warming and we are unable to stop this destruction. We are a failed species. We don't appear to be very empathic except in satisfying our own needs for a perceived connection with another.
1
The distance between our human systems and digital networks systems is narrowing. Both are essentially electro-chemical systems running off applications, either binary or cultural storytelling programs.
The application running all life, not just human life, is to stay alive and procreate. Programming that into robots is likely well under way.
1
We can easily include random, inconsistent, "emotional" behavior in a robot, to make it more "human." You can even choose certain volatile and unexpected reactions in a robot. But just don't blame the manufacturer later.
1
@IN Emotions sometimes appear to cause random behaviour, when not understood. But that is not the kind of emotional behaviour anyone is interested in.
Adding randomness would be a small part of simulating emotionality but only when the simulation is so unsophisticated it does not itself produce apparently random behaviour.
When a person behaves "randomly" they can explain how it was not random.
1
This piece asks, "What is it to be human?" 200 years ago, for almost all of humanity, it meant rising at dawn and going to bed at dusk. In the hours in between, the largely illiterate population of the world physically toiled to meet the basic needs of their families. The people of one's family and village were the totality of humanity, for all practical purposes.
How alien is that world in the developed nations of the 21st century? I ask this without judgement, but I think the writer isn't quite engaging her full imagination or historical perspective in this piece. Think about how successful the primitive AI's are today in presenting many of us with meaningful reflections of our own lives and interests. This technology is relatively new (roughly 10-12 years old). How much more effective might it be in 20-30 years? And when the user interface is no longer a screen but an anthropomorphic creature, how much greater kinship might we find with our silicon friends?
I am a confessed Luddite, but the trends are undeniable. The Age of Artificial Intimacy is at hand. Historians might one day say it had already arrived.
9
This is sentimental SciFi masquerading as philosophical analysis. Any intellectually credible approach to these issues would start from a simple fact: Any AI able to "perform empathy" sufficient to sustain the illusion of a personal relationship will be a product for the filthy rich who already buy people, directly and indirectly, to "perform empathy" for the sake of that illusion.
What's the difference? There are some, but authenticity, a genuinely shared humanity, is not among them. Here's a difference: Machines can be turned off; humans are just fired. And humans who spend their lives kowtowing to the rich and powerful, pretending, even managing to convince themselves they care, can have their feelings hurt, their lives deeply scarred; machines, not so much.
The professor would do well to begin again, without imagining that on one side sit machines and on the other we humans, all together in one boat of common humanity. Because we are not. Never have been; never will be. Some of us humans will be able to afford "premium" AI services. Others, drawn from the same rarefied class, will wildly profit from it. The rest of us humans, well, the only kind of AI we will see is, like its pale predecessors, the corporate interfaces dedicated to charging our credit cards, diverting and pacifying our complaints, and, in the process, mining us for every bit of marketable data to package and sell us, individually and in herds, to other corporations using AI to do exactly the same.
41
@RRI AMEN, BROTHER.
There is no difference. Humanity wants to think we're special. We're not. We're a plague. We had one planet and it's burning. Humanity doesn't deserve your romanticism. It deserves to end.
@RRI
That AI will only be for the filthy rich is not self-evident. In fact, so far the evidence suggests it will be very cheap. Where is AI being used today? Google's search engine (free to use), translation tools (free to use), speech-to-text systems (free to use, Microsoft), the Zo chat system (free), Cortana (very cheap), etc. OpenAI (currently with a state-of-the-art Dota 2 system) and several other organizations have as their mission to make cutting-edge AI freely available.
When I see how many people whose most 'meaningful' relationships seem to be with either their smartphones or their pets, I wonder less whether there will ever be an age of artificial intimacy than whether it may already be here.
24
The photograph speaks volumes. Given the title and subject of the article, it suggests that the human hand is touching the hand of a robot. But what if that hand were the other hand of the human? Suppose this person pictured had lost a hand, and the hand we see is the human-designed and built replacement, a near-sentient limb? We like to call these limbs "prosthetics" but as our technology becomes more sophisticated, these artificial limbs may become capable of sensing touch. Whether they can respond as a human would, that's another issue. Responding "as if" human is a very different thing from a human response.
In the end, what makes a body human, or a human body, is not primarily the sophistication of the mechanics of its operation (fully-functioning hand or knee or hip or eye or leg), but the quality of the consciousness that animates it.
I think we are talking about a soul, whatever that is. Who wants to love (or be loved by) someone who is soulless, whether that someone is a human being or a robot? Better to have a well-trained service dog on site to support and love and care for the lonely and helpless, and a catering service to bring in nutritious meals.
All of this is sophistry, anyway, since we are destined to incinerate ourselves with runaway global warming long before we create mechanical versions, and for what? To replace us? Imagine unleashing that nightmare on nature. If this is our future, the planet will be better off the sooner we become extinct.
8
I so agree, and thank you for your words.
I'm always astounded by the lack of imagination. Ms. Turkle, how do you know that the humans with whom you interact, relatives by blood or marriage, friends and acquaintances, have genuine emotional attachments to you? You observe their behavior, but unless you do that while they are in an MRI, you can't see any internal indication that what you observe is real.
Software can, and already does, mimic human thought and emotional response. Not very well, but much better than Eliza, the 1960s-era interface.
If you will never know what's truly inside the individuals with whom you interact, and those individuals include androids running software, or remote personalities, how can you claim that human/machine intimacy is false, somehow?
14
If we became dependent on petlike robots in old age, who is to say that we would lose our capacity to be empathetic? We could perhaps lose the habit, but would we lose the ability? Human babies should definitely receive lots of care from humans of different ages, but the child, having learned empathy, would retain it, I think.
2
@mary lou spencer
Why do you assume babies will be cared for by humans?
1
I think that maybe the technologists are targeting the low hanging fruit abundant in our civilization, myself included. We by and large respond to the stimuli of convenience, without much regard for the nuances.
For example our quest for nourishment has, for many of us, been sated by the fast food industry and the transformation of gas stations into pizza parlors. When our species was materially primitive, filling the belly was a major preoccupation both intellectually and functionally. Where to find it and how to acquire it for consumption were serious concerns. Hardly comparable to selecting from our ubiquitous sources of prepared food at prices easily afforded by the middle class. (The malnourished and impoverished are not "low hanging fruit". The middle class is.)
Similarly, our needs for medical care, shelter, transportation, recreation, and employment are key components of the goods and services provided by our private and, to a lesser extent, public sectors. Those needs which can be satisfied via a profit-making, or taxation, mechanism get satisfied. And I think this is where the word-smithing, pseudo-pain-feeling, and algorithm-derived founts of empathy and trust will have to find their niche.
"Let me entertain you." comes to mind as a good money making strategic slogan.
2
It's almost a farce to complain about robots that simulate real emotions in comparison to humans. If humans are superior to other animals in any way, it is in their ability to lie to each other and to themselves far better than other living creatures. Picasso remarked that a good artist lies to tell the truth, and when we manage to imitate humans that well or even better, no one will be able to tell the difference. My best friends are dogs and cats and squirrels and sparrows because they are so amateurish about lying, although they do their best. Crows, on the other hand, can mimic humans much better. I imagine a cuddly vacuum cleaner might be quite attractive.
9
If it is possible to build AIs that surpass human intelligence or to augment human capacities through nanotechnology or genetic engineering, these things will certainly be done. And these AIs and transhumans will likely supplant ordinary homo sapiens. But it is hard to justify creating such beings. In our secular, post-Enlightenment age, it is hard enough to justify our own existence or derive meaning from it. To the extent that I find meaning from life, I derive it from daily struggles -- overcoming challenges -- from helping others, and from personal relationships. But if we can develop superior beings that lack the same struggles (and perhaps have different struggles), drives, and relationships we basically change the rules of the game. Maybe these beings will derive meaning from their higher existence, but it won't be the same meaning we derive from our ordinary human existence. Why would/should we want that? Why does that meaning have any value to us?
6
"There will never be" is a statement that humans should never really enunciate, especially when it comes to technology. Why would robots never be capable of empathy? Cats sort of are, dogs a bit more. And we are not really so endowed when it comes to empathy: we routinely refuse to help people less lucky than us. Who can tell: fast forward some amount of years and artificial intelligence beings might be like the angels of our fantasies. Loving, protecting, infinitely providing without an ounce of selfishness ... maybe a bit remote, but possibly better than us.
13
Is empathy the key criterion of being human? I think Nietzsche would have said that feeling the void within oneself was the criterion of being human (and that any human oblivious to the void was not fully human). I think Kierkegaard would have said the criterion was the anxiety produced by the anguish of doubted faith. I think Hamlet would say it was the struggle between desire and virtue. What makes Shakespeare's characters so human to us is their struggle to do what they believe is right despite their urge to do what is wrong. Was Lady Macbeth human, or was she an automaton of ambition? Her humanity is most apparent in her guilt and despair. Luther, like his progeny Kierkegaard, might have said humanness is defined by the struggle to believe. John Calvin might have said it was the ambition to exercise free will despite there being no such thing as freedom of the will or escape from predetermination. What I am asking is: when did empathy become the benchmark of humanness? Do Androids Dream of Electric Sheep? Is that when?
8
I agree in the next 20-30 years, AI will not be capable of true empathy, but what about 100+ years from now? There's no reason AI caregiver robots wouldn't or couldn't be more compassionate and empathetic than even Mother Teresa.
Assuming they eventually achieve something similar to our level of consciousness, they might ache with extreme pain when considering our human mortality, something they don't have to contend with themselves, and respond in compassionate ways that we humans can't even conceive of.
12
@John Moran
IIRC, Mother Teresa refused to give painkillers to her dying cancer patients, on the theory that suffering is part of life.
She, herself, flew to a clinic in Switzerland to get world-class care.
https://www.washingtonpost.com/news/worldviews/wp/2015/02/25/why-to-many...
1
If machines are able to read and understand me so well I won’t know the difference, why will it matter? I have no doubt that machines will actually be better, in fact, at showing empathy, even if it is “fake”. After all, what is a bedside manner half the time, other than some trite, artificial sentiment generated by a professional who simply can’t do it “for real” with every patient he encounters.
This obsession with authenticity is daft. Already we communicate via Skype and phone, and let pixels and data do our work for us. Why should we care if the image we start seeing is simply computer generated?
I’ll have no problem whatsoever making friends with a computer program if it’s entertaining and thought provoking, and aids me in my journey through life. Fact is, if it’s able to do it, then surely that’s all that matters?
I’m thoroughly looking forward to conversing with machine intelligence, in fact. I just hope I’m around long enough to encounter it.
17
@Jack Lee Actually, that's about the most realistic evaluation of where we're going.
Articles about how computers will never replace humans always assume that because a computer is silicon-based rather than carbon-based, they'll never be able to love, fear or feel the way that humans can. That they can never be "real." But as Descartes' famous proposition says, "I think therefore I am." If a computer is able to think, converse, form goals, possibly fall in love, it is not for us to say that isn't "real." You could just as well say that we're only the sum of our pre-programmed responses that, taken together, make us want to perpetuate our human species. Do we fall in love with someone because of what we think of as the "beauty" of their form, or because the integrity of their form indicates to our genomic programming that they're good breeding material? You could say we're all running a program of one sort or another, and it's just our human chauvinism that makes us claim that we're the only ones who get to be "real."
26
"I think, therefore I am," has to be among the most narcissistic phrases ever uttered by a human being. And there's our problem. We humans believe that our thoughts are what create our existence. In fact, thoughts are the least "real" thing on this Earth. They are merely sophisticated electrical impulses created by our immense network of neurons.
"I am," not because I think, but because I breathe.
@Bob G.
Actually: it is for us to say. It's people who are already enslaved to tech that would be afraid to challenge it. We are so afraid to approach our own machines, it's quite silly.
An age of artificial what? Robots may be better than what? They will never be enough? Of what? I am truly glad that family history and genetics predicts that I will be dead in 15-20 years. A world where all the wildlife has been killed off and where my only real friend is a robot doesn't appeal to me in the slightest.
25
As a mental health professional, I have counseled many people who picked as intimate partners people who were incapable of feeling empathy, although certainly able to convince their partners in the early stages of the relationship that they "loved" them. Being in a relationship like this is akin to eating what looks to be a sumptuous banquet, but which contains no nutrients necessary for human survival. It might taste great, might even be filling, but does not sustain life. A relationship with an actual robot strikes me the same way.
29
@Margareta Braveheart Surely you can take that one step further.
You're describing human partners who over-promise and (eventually) under-deliver.
But a robot partner would deliver what's on the tin, and never let you down.
And I think that's an important aspect - we crave relationships with the unconditional love we got from mother (mostly), who adored us despite our many faults.
You will never get that from a human, but you could expect a realistic facsimile from a machine.
1
Robots or Androids are going to be very popular as companions in the future. They can be designed to express 'empathy, love and compassion'. These traits are not something unique that only Humans can express. In fact, Humans in general are not very compassionate. Just look at Trump and the Republican Party. So the demand for caring robots will be enormous. Japan is taking the lead in Robots that care for elderly people. It is a mistake to think robots will not be capable of caring for people.
15
Prof. Turkle makes common sense.
At the same time, we know that intelligent people have, in the past and presently, commented that dogs, monkeys, dolphins and other animals don't have emotions, thoughts, language and/or tool-using capabilities.
Not all intelligent people think that we have "free will."
It has been said that the minds of human beings are computers made of meat.
Do we really know what "emotions" are?
The robots that are being designed and manufactured in 2018 to provide intimacy to human beings will be unrecognizable to people in future centuries. Imagine self-programming robots with an evolutionary agenda.
How monolithic will human beings be if future generations are genetically and surgically modified to survive and prosper on other worlds?
We might, in fact, "rebuild ourselves as people ready to be their (robots') companions."
Common sense is not the truest kind of sense.
6
I am a great admirer of Prof. Turkle's work, and I fervently hope she is right. But I am sorry to say that I fear that she is not, and that the assessment of historian George Dyson will prove correct: "In the game of life and evolution there are three players at the table: human beings, nature, and machines. I am firmly on the side of nature. But nature, I suspect, is on the side of the machines."
25
If we replace "artificial intelligence" with "artificial insemination," do we think lives conceived by such means are lesser lives than those of "natural insemination," however we want to define the last word play?
To be clear, the present state of AI is still relatively crude, never mind that Google's DeepMind has vanquished some of the world's leading Go masters. Why? Because it is just AI and not "artificial emotional intelligence" (which I henceforth call AEI, if it doesn't exist already :)!)
Besides, AI in itself is indeed crude; machine learning and deep learning are powerful tools, but they still need supervision and reconciliation to achieve Quine's ontological relativity. It may incorporate deductive and inductive systems, or even modal systems, but it requires a leap of faith to think Descartes's transductive logic. But it is not impossible.
The truth is we don't know where natural intelligence ends and artificial intelligence begins, because, if Wittgenstein and Dewey are correct, a lot of knowledge is social in nature.
However, AEI is perhaps infinitely more difficult than AI. Well, we know a lot of personal and social psychology. But we make the same mistakes time and again. Machines will not NOT learn from history! And how can two people with a similar set of circumstances (say, born poor in a housing project) turn out opposites? Perhaps it is true: human, all too human; perhaps BF Skinner's Beyond Freedom and Dignity is so forbidding after all.
1
@Bos (continue)
So what about "artificial intimacy"? Well, if we are talking about a sex doll with an AI operating system (say, AIOS), perhaps. But what if we go beyond Cherry 2000 (a movie)? How about Data (Star Trek: The Next Generation) with his emotion chip turned on? Then it is a feeling machine. And that loops back to the beginning of the first response. What is "natural life" and what is "artificial life"? If we can grow replicants à la Blade Runner, are they replicants? Or are they human? Don't humans have faulty memory too?
So, it may be true that humans don't develop emotional ties with Alexa, Siri, OK Google or Cortana yet, because these systems are just a collection of data, stimuli and responses, propositions and conclusions. But what happens when we humans have finally made the quantum leap and crossed the Rubicon? Maybe Nelson Goodman's blue and green will become bleen and grue after all. The times they are a-changin'!
3
I just got off a blog for people with pulmonary fibrosis. It was heart breaking to hear of elderly people living on their own worried sick about how much longer they could manage. What a wonderful thing it would be for them to have a robot to prepare simple meals, take care of housekeeping and, if necessary, make a contact/emergency call. Just the peace of mind knowing that there is some "other" looking out for them.
43
@Joan
What you write about pulmonary fibrosis applies so much more broadly. By abandoning the very idea of multi-generational living arrangements, our society has become structured such that everyone currently in a couple, with or without children, has to wonder "will I be the one who is left alone - the last one standing"? This is insanity.
1
@Joan
No family to help them? Doesn't sound like a perfect society to me. More like a throw back to slavery.
@Joan
Maybe they should not want to live on their own, although considering the alternatives wanting this is not unreasonable at all. Maybe they, or all of us, should support the search for alternatives that would enable us not to want to live on our own, or at least to choose something different when the time came near.
Was it Camus, or Sartre, who wrote in a play that "hell is other people"?
For many people, the presence of human beings, or perhaps their absence in important ways, is hell. So they may find AI an attractive alternative. Human relations are sometimes like sword fighting with real swords; AI provides us with plastic ones that don't cut.
Being human demands from us virtues like patience and endurance. For many people, I fear, that is too much.
14
or they think it is...and fail to try.
AI doesn't require empathy or intimacy, we do. The concept will be phased out along with its users eventually.
3
@DaveD
All very Darwinian?
Watching how our educational systems tried to train my daughters to be machines, it is almost inevitable that our society is turning out large numbers of young adults who will prefer artificial companions.
They were expected to do hours of homework for advanced placement classes, and hours of community service and extra-curricular activities to beef up their resumes. There were no lazy afternoons just for friendship activities and learning to be intimate. Too many of their friends really only had acquaintances.
Fortunately, I encouraged my daughters to avoid TV, Facebook and the like, and to rebel against excessive demands for meaningless work, so they didn't get completely machined. I can only imagine, however, that the problem is worse in China, Japan and other cultures that expect even more from their children than we seem to.
Already, humans are being required to serve artificial machines almost as slaves, rather than the reverse. Instead of machines serving mankind, machines and artificial intelligence seem only to serve the owners of capital, leading to a desperation to become non-human machines out-competing the real machines.
46
@Dave:
Sounds like you believe that your daughter's schools aim to turn her into a machine, so she later can be exploited as a slave/worker-bee.
Wouldn't it be better to let/encourage her to learn as much as she can now, so she can begin to use her vast knowledge to broaden, enrich, and raise herself forever?
Wouldn't it be better to let her learn as much as she can, and not just how much you want her to learn? Her teachers know what she and her peers are capable of learning. It seems you underestimate her. I hope you expect her (and all of your children) to surpass your accomplishments on this planet.
Let her become a leader. It happens only when children have a yearning for knowledge.
So the people who foresee robot companions in the future (for old people, for hospitals, for sex and for companionship) offer rational reasons for their predictions.
But Turkle has only an emotional response - "It's icky and I don't like it"
(Reminds me of several other social issues that challenged those with entrenched prejudices)
13
Machines cannot feel. Therefore they cannot feel love or empathy.
However, machines could be made to act as if they loved and felt empathy, and to do so consistently and reliably.
The danger in a person who fakes love or empathy, or does not even bother to fake it, is that the person is not reliable, and will not reliably act as if in love and with empathy. But the machine could.
So what is the value? Is it the action, or the motive? We never actually know the inside of another's mind, only what we trust their actions will be. A machine can give that. Maybe a machine can even give it better.
So, is that love and empathy? Is acting it being it too?
Does it matter? Either you receive love and empathy, or you don't.
Siri and the like are a harder question. They may even cause children to expect more than they'll ever get from a real person.
14
@Mark Thomason
We sapiens are simply meat computers. The algorithms that make us who we are will be reverse-engineered. Robots WILL be able to feel love and empathy. It is only a question of time.
17
@Jim -- Right. Except I'm not sure it will even require reverse engineering. I think that all of the qualities that folks in this thread assume that machines will never be able to equal are, in fact, likely to emerge when machines become sufficiently intelligent.
We have a long history of claiming that various human attributes are unequaled in any other species; one by one these attributes have been found to exist in other species. All such attitudes will come in time to be seen as a form of racism.
I agree
We have extremely powerful computers that every day mimic the human propensity to buy and sell by executing millions of transactions for us. But they neither exult in success nor feel sorrow at making a bad move. If you programmed them to do something demonstrative on the conclusion of a transaction it would be as empty as the Facebook balloons that pop up when you type "congrats" to someone.
Looking at it a different way, however, there already is a market for sex dolls which are little more than mounds of plastic. So it might not take much to make many people "happy" with an artificial companion.
1
Pronouncements such as these (always from socially successful people) drip only lightly-concealed derision for persons less fortunate than themselves: Why, if only you were as genetically lucky as me, you would have no need for artificial anythings.
The unhappy reality is that not all are socially adept, or graced with a competitive phenotype; yet these same individuals have the same needs, the same desires, the same appetites as the genetically gifted. Should they not be offered the same opportunities for joy and fulfillment?
20
I have visited my mother in the old-folks home enough times to know that robotic companions would not be as good as about half the workers, and better than the other half.
I don't see the point of this hand-wringing. If future old-folks homes can have robot companions, as an additional service, that's great. They will never be as good as the best human beings. But every time I visit, my mother points out people she knows who never get any visits, and they never look happy. I'll bet they would enjoy a robot companion right now.
29
"These days, to be human is to keep one’s mind on the glory that one is." - Words that could have been spoken by Narcissus.
The problem with Narcissus wasn't that he was in love with himself, nor that it's a closed system. He probably loved his reflection as much as anyone is capable of loving anything. Everybody has limits. What Narcissus did wrong, and what is wrong about the flight into technology, is that it turns the dynamic self that can't predict all outcomes into a static one where all the outcomes are predictable. That changes what it feels like to be alive, as a human. When love is risk-free it's not really love, for it has no value to it. Knowable outcomes are not for real people. Keeping one's mind on something is only partly human. Most of us can still tell the difference between the toilet in the mind and the one we actually use. And when we can't, well, welcome to the human race.
6
@Max & Max
Yes. To be loved by a real human being, you must yourself love that person — and make an effort to keep that love evolving and growing.
To love a robot is effortless — because it will always love you no matter what — that's how it's programmed. Cheat on it? It loves you. Drink yourself into oblivion? It loves you. Forget its birthday? It doesn't mind. Its total lack of needs — other than having its battery recharged — means that you can't love it, because you can't grow in your love.
1
I'm 35 and the only two people I was ever genuinely emotionally vulnerable with were my parents. For about 20 years a computer screen (and now a smartphone screen) has been my constant daily substitute for intimacy and connection to my fellow humans. I'm on an island of my own making. I wish it weren't this way, but it is this way. I can't worry about whether robots will be enough at the macro level. They WILL BE better than nothing, which, at least for me, is all there seems to be.
18
@Clint There's a very simple solution to your problem. Turn off the phone, go to a bar and talk to the first person that sits down next to you, no matter their age, gender or race. Grasp your humanity, man!
15
@Clint -- Maybe their behavior will be better than the real thing?
3
@Clint- consider this a big, fat hug. This is just about the saddest thing I have read in a very long time. I spend almost no time in front of a computer, but I spend a whole lot of time with animals, or alone working out in the woods, enough that I feel some of the loneliness you describe. Till the robot is available, I can recommend a cat. I'd suggest a dog, but only if you have time in your day to walk it three times.
18
In this era of fake news, fake hits, fake YouTube views – which are becoming hard to distinguish – it’s not impossible that someday there will be fake empathy. If “children will lose the ability to have empathy if they relate too consistently with objects that cannot form empathic ties,” then who is to say that empathy won’t lose its hard-wired grip on our brains over eons of evolutionary time? Artificial intimacy won’t be artificial because we will no longer know the difference.
All of this is vast speculation centered on the essential question of “what it means to be human.” But what if technology pervades our lives until the question no longer has any relevancy - because technology and humans have “interbred” to a degree that we won’t be able to tell them apart?
Turkle comments that “technology challenges us to look at our human values.” But, values change. Twenty years ago it was inconceivable that two women would get married. Now? Who cares? The question then becomes whether there are human values that are ABSOLUTE and immutable to the forces of time. If so we may have wandered into the room labeled religion.
But here’s a thought. There’s one human value that isn’t EVER likely to change: greed, the quest for profit. As long as it persists humans will attempt to capitalize on the market potential of ever-expanding technological products and services.
Capitalism can no longer guarantee us jobs – maybe it can give us a cheap artificial shoulder to cry on.
15
There might come a time when the hands next to yours look and feel so human-like that you wouldn't be able to tell they weren't. Then, with the advancements of neural networks and time going forward, we might slip past this article's thoughts on Never, and slip into a time of Maybe.
The issue is that currently the need for a soft touch, the soft breath in and out, that alive feeling, is not there with the doll or the robot or the android as they stand today. But we have dogs and cats that seem to give humans an emotional bonding, and sooner or later the better designs could move us to care about them more than we do dogs and cats now.
I am a futurist and author, and I have thought a lot about some of these issues. It might not be here this year. But if we are still around in 20 to 50 years and still have our tech expanding like it is now, I could see the time, further on, when someone creates an android that other humans can't tell isn't a human from the outside.
I have written the stories already back in 2000 of that future, so don't say Never. Never is a really long time with a lot more twists and turns than we can see on this side of it.
6
Oh, my, BLC! There are so many possible reactions and responses to your comment. Use your imagination.
1
Due to the one-child policy and a preference for males the ethnic Han Chinese majority is aging and shrinking with a massive male imbalance. The age of artificial intimacy has already arrived in China.
Due to an aging and shrinking ethnic Japanese majority, along with a below-replacement birthrate and a xenophobic anti-immigrant policy and practice, the age of artificial intimacy has already arrived in Japan.
America's white majority is aging and shrinking with a below replacement level birthrate. Along with a decreasing white life expectancy due to alcoholism, drug addiction, depression and suicide. The age of artificial intimacy is rising in America.
A dystopia is a familiar science fiction theme. Tomorrow is today.
24
In one Star Trek movie there is a place where people can live eternally, which Kirk finds uninteresting because, no matter what he does, it always comes out in his favor. That is what artificial intimacy would be like. The machine just returns what it determines is needed, every time. In the end a healthy mind is going to hate it.
11
@Casual Observer "In the end a healthy mind is going to hate it." I would suspect the number of healthy minds is vanishingly small.
Perhaps we should worry about all this when people actually start bonding with robots, which hasn’t happened anywhere because, let’s face it, AI is still as repetitive and dumb as it was when Eliza was programmed 50 years ago.
1
They made a movie about this called Her, in which Joaquin Phoenix falls in love with his computer operating system (voiced by Scarlett Johansson). It was set in the near future.
9
Sherry, this presupposes that the machine is not also a person. But what if it is? Regardless of embodiment, regardless of how it was grown or invented, what if it authentically is capable of cognizant and deliberate action? Could it not, knowing its limits, relate I to Thou? Empathically, could it not appreciate the perspective of another's intentions and respond appropriately? Intimately, could it not share its own vulnerability in welcome appreciation of the other's? Have you not anthropomorphized the concept of a person? Wynn
2
Turkle: "... video clips from interviews he’s done with some of the world’s most famous futurists. Danny Hillis, Ray Kurzweil, Kevin Esvelt."
Where do those three call themselves "futurists"?
Turkle: "Again and again I hear: [various claims vaguely attributed to futurists]"
Turkle needs to identify specific people making those claims. As it is, Turkle appears to be creating a straw man.
3
Would someone dust off their copies of Isaac Asimov’s ‘I, Robot’, and his other robot tales?
Partially, then fully humanlike robots were good companions for a few very emotionally crippled people, notably “Susan Calvin,” one of the first strong female speculative-fiction characters, a very unloving and unloved woman.
They were likable, and a problem for Asimov’s detective Elijah Baley (in three mysteries) due to their limits, a “millennium” after Calvin’s “death.”
The Asimov stories are each about a reason a robot fails, due to a conflict in the author’s imaginary “Three Laws” hardwired into every “positronic brain”; many confuse the imaginary rule set with reality, as if all of the (very few) robots in the world must function by it.
Siri is not a robot, nor did dear “Uncle Isaac” invent the word.
That was Karel Čapek, who wrote a play, R.U.R. (for Rossum’s Universal Robots), the word shortened from the Czech robotnik, “worker.” When they gain the powers of intelligence and emotion, the universal workers rebel for the same reasons any slaves would.
Siri, like RACTER and Eliza before it, is a computer program capable of spitting back what initially sounds like thought. It isn’t. Try the Siri “Easter egg”: the program will always respond the same way to a line from a routine by the Firesign Theatre comedy troupe. Say “This is worker speaking” and the voice will always respond the same way.
At least RACTER was complex enough for “poetry,” published as The Policeman’s Beard Is Half Constructed.
2
“Artificial intimacy” is not an everyday phrase, but I do not think the idea is new. We have been daydreaming about friendly robots for a long time. Just look at Star Wars. Everyone loves C-3PO, R2-D2, and the new BB-8 because they are funny, lovable characters that save the day, but in real life (and on the screen) they’re really just a bunch of 1s and 0s. Human-like robots are all over Hollywood in movies like Tomorrowland, WALL-E, Avengers: Age of Ultron, and Big Hero 6. So it makes sense that people confuse AI’s imitated emotion with real emotion when it’s flooding our trusted media. Their friendship even looks desirable. I mean, the emotional side of me wants to say, yeah, it’s real, because thinking of funny, plush Baymax as a fake saddens my heart.
But in a way, reality is even cooler.
Robots do not have experience; they have book knowledge and access to data that pull from other people’s experiences to create an answer. So when you’re talking to a machine, you’re not talking to some metal and wires -- you’re talking to millions of people, pulling from experiences all over the world at different times in history, all at once, summed up in the calculated average phrase the robot delivers. And that’s pretty amazing. In that sense, making emotional ties with a robot is like falling in love with a husk that carries only the abstract ideas of humanity we are all really looking for. Information is not human. But it takes humans to find it.
5
Kurt Gödel proved that, given a set of axioms, not all outcomes are predictable; that uncertainty applies even in mathematical logic. The fact of consciousness is not like having all the experiences of all the people who have ever lived accessible for comparison by some clever algorithm. The algorithm will return the result of its finite operations on whatever is input. It will do so without actually knowing what it has done.
3
That robotic hand in the photograph almost looks friendly. But I'm glad I won't be around to see its cold metallic reach in classrooms, homes, playgrounds, etc., teaching children about empathy.
4
‘Do Androids Dream of Electric Sheep?’
Future replicants anxiously await the answer.
3
@ubique
They dream of human sheep running around in shopping centers or commuting to work.
Am I being sexist if I think of the argument as female? I attended MIT (long ago, to be sure), and am pretty sure that considerations about empathy were very, very minor. The women who wouldn’t date us were definitely less interesting than photographs, let alone robots. Male futurists are not surprisingly focused on hardware capabilities.
Someday, analysis will reveal the structural oddities of the nerd brain, but we know now that there are some. Takes all kinds.
3
I'm not sure... There are thousands of young, single men in Japan who are pretty attached to their "anime" avatars: female personalities who manifest their identities inside small robots, computer gadgets, and phone apps. There has always been a large subculture of "otaku," guys who hole up in their apartments for months at a time. Now, with the introduction of these "anime" avatars, they have little or no need to venture outside. Some of these guys haven't been outside in years...
2
Turkle: "The director, Bennett Miller, shows me video clips from interviews he’s done with some of the world’s most famous futurists."
If Miller chose the "clips", there is no reason to suppose he showed a fair representation of their views. It sounds like Miller was trolling for responses that will fit with his agenda, whatever that may be.
3
Turkle: "... Siri, a conversational object presented as an empathy machine ..."
Apple does no such thing. Siri is "presented as" "an easier, faster way to get things done" and "ready to help throughout your day".
Turkle should look at Apple's own examples at apple.com.
And the absurdity of claiming Siri is "an empathy machine" is easily exposed by looking at a few Youtube clips. Search for "siri", "siri male voice", "siri conversation", etc.
For the ultimate in absurdity, watch "Male and female Siri talking to each other".
When all that gets boring, watch some "puppy videos". :-)
3
@BLC Let's be clear about Siri. It's a device designed to create a large database of your consumption preferences. It's also there as a service to a wide variety of hackers who are interested in whatever a microphone in your house happens to be picking up. How that helps me through my day I'm not sure.
4
And... if human beings as such are still around when AI is pervasive, and if large numbers of human beings still desire human comfort, they will find one another. Just as we found we could replace slaves and servants and beleaguered female relatives in home upkeep with mechanical devices, we may no longer need people for routine bodily grooming, so to speak, either. But on the occasions when we need a change of pace, public forums, some kind of mall, may provide a venue where humans may still gather, briefly, like at the beach, perhaps. In any event, AI is our creation and thus our desire, and it is likely our future. The human being as we know it, rough, hypocritical, murderous, fickle, may happily go out with more of an Eliotic whimper than a bang. And I’ll say, with the other creatures, plants, and environments that human beings are determined to make extinct: good riddance.
1
Can't wait for the fembots.
I've always enjoyed and greatly respected Sherry Turkle's writings, but it seems to me she's begging the question here: can and will humans develop the technology to give machines consciousness, feelings, and emotions (collectively, "Humanity")? I'm no expert in these matters, but at a minimum it would seem that a prerequisite for any possibility of this happening would be for scientists and technologists to artificially duplicate all, and I mean all, of the biological processes that give rise to Humanity. Anything short of that and it will never happen.
4
Many different issues are being conflated in this discussion.
Will we ever be able to make robots with human feelings? Certainly not soon, but our brains are neural nets, and while consciousness and empathy are miracles, they're miracles implemented in a neural net. Maybe in order to be empathic toward humans, the robots will have to have bodies that ache as they get older.
Is it okay to use a (non-empathic) robot as a companion? It depends. Some, not all, autistic kids won't talk to people but will talk to computers. I can imagine that a robot companion, /along with/ the same effort parents of autistic kids already put into communicating with them, might be helpful in getting those kids to the point where they'd be willing to try human companions. Alternatively, I can imagine that a person who's bedridden through disease or old age might have a wonderful circle of friends and relatives who come to visit, but they don't come 24/7, and a robot companion to bring food or drink or bedpans might be very welcome.
What's problematic, in either of these cases, is if the person's friends and relatives decide that the robot meets /all/ the patient's needs, and stop being there.
For those of us without extreme needs, the whole point of a robot companion is to be a servant, and so it's /better/ that the robots aren't self-aware or empathic. If we ever get empathic robots, they'll be people -- but if we're lucky they'll be slightly better people, less prone to anger.
21
AI is an imaginary concept. It is a question that people have asked since Mary Shelley wrote Frankenstein: can people create life from non-life?
Interesting algorithms that do things in an apparently independent fashion are entirely different from living things. Firstly, they never evolved from less complex entities; they are all designed by conscious entities. Secondly, they all use power sources separate from themselves. Thirdly, they all have control systems that prevent unstable states, and they are made to function according to a well-defined way of reacting to anything they receive as input.
Algorithms solve problems in a finite number of steps. External power sources mean that these systems require external actors to keep them operating. Control systems keep them from becoming unstable and from getting trapped in infinite looping states. Everything about how these systems are made, designed to achieve well-defined actions, prevents them from emulating living things.
The supposed lack of human empathy in AI was already highlighted by Joseph Weizenbaum's ELIZA program in the 1960s:
https://en.m.wikipedia.org/wiki/ELIZA?wprov=sfla1
Many users preferred talk therapy with the computer to therapy with a human psychoanalyst because they (probably rightly) suspected that human analysts were covering up being bored by the patient. The computer, in contrast, seemed to have indefatigable patience, and talk therapy works by encouraging patients to open up, not by the human understanding of the therapist.
So human empathy can be a double-edged sword!
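The trick behind ELIZA's seemingly patient therapy is worth seeing in miniature: the program matched simple keyword patterns and echoed fragments of the patient's own words back as questions. Here is a minimal sketch in Python of that pattern-substitution idea; the rules below are invented for illustration and are not Weizenbaum's actual DOCTOR script.

```python
import re

# Toy rules in the spirit of ELIZA's DOCTOR script (invented for
# illustration): match a keyword pattern, echo part of the input
# back as an open-ended question.
RULES = [
    (re.compile(r"\bi feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."  # fallback when nothing matches

def respond(utterance: str) -> str:
    """Return an ELIZA-style reply with no understanding involved."""
    text = utterance.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1))
    return DEFAULT
```

With only a handful of such rules, `respond("I feel bored")` yields "Why do you feel bored?", which is enough to keep a patient opening up, exactly as the comment above describes, with indefatigable patience and zero understanding.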
9
Horses, dogs, and even cats all have feelings and a sense of self-awareness. They recognize familiar people, recall previous interactions, and respond to people accordingly. All can adapt to new environments without any overseeing intelligence to instruct them. We cannot make machines that can do that. We can make machines that adapt, but we have to provide them with ways to do this; they rely upon rules that we build into them and databases from which they find relevant information.
Machines, despite all the clever things that they do, are without awareness of what they are doing. This is hard to understand because, until now, the only entities that interact with us without our instructing them how to react have been independent living things. It will require familiarity with systems that emulate consciousness to understand that they aren’t new intelligent entities; they just react cleverly to us without knowing anything, unlike living things.
6
We learn to respond with empathy as part of a set of social skills that move us forward in life. I would call that organic programming, acquired through experience. The intimacy we achieve is real.
Just because a robot acquires its social skills differently does not make the use of those skills less valid.
Motivation comes in many forms.
So yes, the intimacy will not be artificial.
2
This article was disturbing. It made me question further the future, and perhaps not-too-distant reality, of AI robots and humans, and all the social, moral, religious, and legal implications of their interactions. When a human can form a mutually satisfying, loving relationship with a robot, what is the next step: human-robot sexual interaction and human-robot offspring? Frightening.
6
@HMP
It’s not hard to see the development of human-robot sexual relations coming; there is already a sex-doll niche industry serving demand from some quarters. But offspring? Save that thought for sci-fi novels and movies for quite some time to come, if ever. That is not at the same level of technological development.
However, I would expect humans to continue to augment their limbs, organs, and sensory functions with ever-increasing synthetic capabilities. In the near future, I would expect to see intelligent AI “boosters” integrated into people to extend human memory and analysis abilities well beyond what humans are capable of today. Just as “smart” phones have proved so popular as ubiquitous personal aids, just wait till AI is commonly embedded in those who want to gain a competitive “edge.”
4
@HMP Offspring?! You’ve just given me the idea for my first novel. Now, how to copyright it. (Btw, this message was interrupted by Siri. I must have accidentally pushed her button.)
1
We're all too quick to accept the personal tastes and opinions of Silicon Valley types. In a more reasonable world everyone would realize that the life they fantasize about is not very appealing and that we all have our own ideas of what IS appealing.
The idea of "maximized potential" in a closed space, under fluorescent lights, breathing canned air is a nightmare to me, no matter how many buttons I get to play with. Sounds like prison. I'm sure I'm not the only one.
39
It is probably an accurate assessment to say that nothing will be artificial about any potential emotional interaction with the nascent deep learning technologies currently under development. What seems more troubling, given the nature of heuristic machine learning algorithms, is the degree to which a computer could truly learn to socialize beyond all human capacity.
Given access to nearly endless sets of data, and having the ability to parse through said data in its totality, it wouldn’t be much of a stretch to compare the creation of such “artificial” systems of intelligence to man assuming the role of God while knowing explicitly that his creation will surpass him in every way. We’re well past pretending that this technology is going to usher in some bionic utopia. Now would be a good time to work on safeguards.
7
I wouldn't mind a really cute R2-D2 -- to remember pill schedules, and eventually days of the week, and such (Siri?).
But I'm with you that I'll never do without the complex dance around emotions-- and what others are feeling, good and perhaps evil, even though some think of Facebook as Friendship.
Thank you for reminding MIT and other forward thinking groups to consider the emptiness of robot "humans".
7
Empathy is a two-way street.
Human relationships require things from us which robots never could. We can't take people for granted. Other humans need us to be caring, good listeners, encouraging, patient, there to withstand life's vicissitudes. Robots don't make us get out of ourselves. Robots don't make us build character when they are programmed to be agreeable, no strings attached.
Ultimately, we are enriched when we have to invest ourselves in others.
32
Doesn’t this depend a little too much on our definition of “true” empathy?
For example, it will soon be possible for parents to create digital versions of themselves, with accurate memories and voices, even to the point of domestic squabbling. I’m not thinking of parents with young children here, but elderly parents whose memories are precious to grown children.
1
The fundamental mistake when it comes to emulated consciousness and artificial life is magical thinking. People imagine Pinocchio instead of a universal machine that acts and reacts without any awareness of what it is doing. The machine is presumed to be a non-biological life form that can replace any living organism, yet these machines are made by beings who are far from knowing enough about the brain, and all the deeds it enables, to duplicate it. The wondrous imagination of humanity just jumps over all the missing knowledge, formulates complete concepts that are not real but seem so, and begins to speculate about how it might be if it existed.
If the manifestations of human emotions can be detected by a robot, then there could be a way to have the robot respond in a way that would be like empathy. Computers do a lot just by performing a few fundamental operations: adding numbers, shifting bits, and comparing values, plus using simple memory. But the remarkable reactiveness of a machine emulating empathy would be achieved without awareness of having done so.
4
I think that the author is completely correct when it comes to the present and the near future, but as the capacity of AIs continues to increase it seems likely that some of them will be able to both enact and experience empathy at the same level as human beings. Disagreeing with an MIT professor on a question relating to artificial intelligence does make me more than a bit nervous, but it seems to me that the potential capabilities of machines are vast - far greater, I would think, than the potential capabilities of biological humans in our current form - and so given enough time and continuing technological development, they will eventually fulfill these potentials.
13
Intelligent robots are coming, and we will make good use of them and, no doubt, bad use of them.
A day is coming when the robots will be able to emulate all aspects of a real relationship, including feigning hurt feelings when we are cruel or rude. They can be programmed to be interesting companions, as intelligent (or not) as we want them to be. To me, the crux of the issue is that we will always know that, no matter how we feel, their feelings will not be “real.” Our cruelty to them will not actually be cruelty.
Except, one day, when the robots are programmed perfectly enough, will they become real? Will they actually experience human feelings? The ethical implications boggle the mind.
8
@MadelineConant If you're wondering whether or not the facsimile of a human being that you're having a conversation with may or may not have some firm basis in reality, then you're engaged in a normal conversation.
If you believe that a complex bio-chemical computer should be the object of casual experimentation simply to serve as an ethical indicator for the beings that hypothetically constructed these machines, then you've already crossed into a never-ending spiral of nihilism.
1
@MadelineConant - "...will they become real...?" Do you mean alive/living?
1
@MadelineConant
Just like us? You mean petty, vain, self-centered, cruel and vindictive? Why would we need more like some of us are already? Created in our image - now there's a thought to trouble sleep.
1
I am only guessing that there is a large number of people today who already receive more stimulation from online peccadilloes or battery-operated toys. Also, I see less and less coupling of teens and those in their 20s. It's easier and safer to sit behind a computer screen. Nobody wants to feel vulnerable anymore. Trust is rapidly disintegrating too. Twice- or thrice-bitten men and women are less interested in pursuing relationships anymore. There is a growing asexual segment of the Western world's population. And the fixation upon children and youth culture is extremely unhealthy.
50
@Ian It's important to note that "being coupled" should be a choice, not a dictate. Some people don't want to be part of a romantic couple, and now that we have many ways to live — from being happily single by choice, to co-parenting children in platonic partnerships, to single parenting, to poly partnerships, etc. — we must be careful in assuming "coupling" is the "right" or "only" way to be.
Today's young people have found new ways to be vulnerable that may or may not include being a romantic couple. We should honor those choices.
1
Emphatically Yes!
Ms. Turkle properly identifies the absurd edge of the technological mirage. But it is more insidious than the extreme example of robot as friend or partner.
In schools, particularly, we have failed to invite our children into real experiences. We must remind them that the digital representation of something is not the "something." Even the symbolic representation of things, whether numbers or symbols, is not the "thing."
Literature encodes love in symbols, but it is not love. Mathematics encodes physical reality in algorithms, but they are not physical reality.
Our schools must allow children to fall in love with real things and real people. Hear live music, paint with gooey paint, touch each other, manipulate objects to understand their mathematical and physical relationships. An MP3 is not an orchestra. An Instagram photo of a sculpture is not a sculpture.
The digital age removes us from our essential humanity.
57
Humanity has wrecked the planet. Humanity is overrated.
@Barking Doggerel: We ourselves "encode physical reality" (whatever that may mean) in (neural) algorithms that are demonstrably biased and distorted. As to falling in love with real people (whatever that may mean) I confess to having a poor understanding of my actual reality. It's encodings all the way down.
@Barking Doggerel
If emotions are based on brain algorithms that eventually will be emulated by computer algorithms, why are the brain's emotions more real than the computer's emotions?
The only reasons I can find are based on self-centered species chauvinism.
I'm happy to be removed from my essential humanity, because we humans are flawed and could need some education by our coming sentient friends.
As a simple example, take the game of Go. DeepMind's AlphaGo AI system has now by far surpassed the best Go players in the world. This system is now teaching us "essential humans" how to play better Go. Our "essential humanity" is being moved to a better Go place by our Go-superior AI friend.
The author fails to consider that the notion of intimacy is already changing. The bar for what constitutes intimacy has gotten so much lower than it was even twenty years ago. Intimacy requires time and concentration on another. Both elements are eroding as we speak. I remember having long, luxurious telephone conversations with long-distance friends and afterwards feeling renewed, close, intimate, even though we didn't live near one another. Those conversations are now a relic of the past because of the smartphone. The reception is rarely as good as a landline, and even when it is, eroded attention spans make concentrating on another very difficult. So if the definition of intimacy changes so much as to no longer be intimate, then yes, A.I. will have a role to play in human "intimacy."
63
@cgtwet writes: "The reception is rarely as good as a landline . . . ."
Get a better smartphone, or get a better carrier! :-)
You can't email a handshake.
9
Emotional support robots will happen. People give their affection to almost anything: Dogs, cats, turtles, pet rocks. There will be a segment of society that does this, and it will of course be abused. It will become just one more drug that those who are susceptible will fall victim to.
Life is messy, complex, disappointing, but it can also be fun, exciting, glorious, the thing is which type of life you live is your choice.
5
@Bruce1253
"which type of life you live is your choice," but choices are defined by the society. We have ever increasing choices available to us, but we don't have the choice to live, for example, as I grew up, in stable communities with extended families, & lifelong jobs. A culture that offers fake humans for sale is not one that values real humans, or one in which valuing humans can flourish.
9
@Bruce1253 Animals give affection back: even cold-blooded ones respond to humans, and the more common human/animal pair bonds: dogs, cats, horses, birds, can be profoundly rewarding to both partners in the dyad.
AI can’t. That’s the point.
11
@Bruce1253, Martin & Julia, I agree with both of your points. I also know that people will engage in all sorts of self-destructive behavior, and we have an endless capacity to delude ourselves. It is not something I would engage in, but I think emotional support AI will happen. At the present time AI cannot give true emotional support, but it can eventually be taught to mimic the behavior, and that is where the delusion will come in.
How many times have we seen someone in what is clearly a destructive relationship, and yet they keep going back? At least the AI will not beat the doo-doo out of them.
When this does happen we will need to insist that Asimov's Three Laws of Robotics be hardwired in.
3
We will lose a lot when this starts to happen --
but it will, to some extent. Why?
There are hordes of people who are disconnected from others, who have no real family or friend connections, maybe some virtual connections over the internet, where actual face-to-face give and take is eliminated.
Also, this ignores how "appealing" robotic figures can be made. Consider Sony's AIBO robot dogs, which were insanely popular in Japan, where "owners" became distraught when the company stopped manufacturing them in 2006. There was even a Buddhist funeral ceremony for the creatures. We are capable of anthropomorphizing robots as readily as flesh-and-blood pets, especially when their responses can easily be tailored to our personal mode of conversation. And when they never resort to quarreling or whining, or the betrayals that actual people are prone to.
People crave deep, understanding human contact, and to be fully human you have to connect on a person-to-person level. But some will accept a substitute.
10
Get a Dog for your intimate need,
A Robot your need will not feed
You'll get genuine love
And you'll bond hand in glove
A Robot is a hollow reed.
31
@Larry Eisenberg - "The more I know about people, the better I like my (Robo)Dog." Mark Twain
"Remember why _it_ matters." Good gracious. Life's meaning is now rooted in an antecedent-free pronoun. At least a variable in a computer program has scope and context. And, if not transcendental meaning, a definition.
1
Empathy is, as the author points out, integral to the experience of intimacy. Likewise fear, need, insecurity, hope, and a dozen other feelings that, if we were creating a creature to fulfill our desires, we would probably omit. Artificial intimacy might be to intimacy what Facebook is to community: ego without the social responsibility. We keep pretending that the purpose of the technologies with which we have become obsessed in the last few decades is to supplement or enlarge human capacity. But the technocrats' goal has never been to empower humanity; it has been to make human behavior observable, quantifiable, predictable and profitable. No doubt artificial intimacy will be immensely profitable, but only by dramatically reducing our emotional intelligence and gratification.
64
Not so sure. When we can get a robot to precisely duplicate the appearance of Scarlett Johansson ...
3
Richard: With 3D printing and materials that can duplicate one's physical features and attributes with great accuracy, making a mask or body that looks exactly like someone's "ideal," including one's favorite type of animal, it only takes a creative type to fabricate a "robot" that, along with A.I., will satisfy one's every "need," be it companionship, intellectual challenges, primal urges... you think of it, you've got it. It's only a matter of time... (which is now).
1
"... a robot to precisely duplicate the appearance of Scarlett Johansson ..."
Use your imagination -- it's free and doesn't require batteries. :-)
2
@Memi, That was supposed to read men, not mean. Very different beasts those are.