Deepfakes — Believe at Your Own Risk

Nov 22, 2019 · 158 comments
Robb Lovell (Vancouver, Canada)
This is not an ethical thing to do. It's on par with inventing the atomic bomb. Given very real human vulnerabilities to emotional content, cult-based brainwashing, and tribal belief despite cognitive dissonance, this is not something we should do as a society, as engineers, or as scientists. It's just wrong on so many levels.
Wajahat Ali (Islamabad)
We are already made fools of by our politicians, and now this can wreak havoc.
Mark A. Newell (Mendocino, Calif.)
So now the truth can be whatever you can dream up? –To be reported, as: truth?– Yes, it's true. Now, you don't have to be a Russian kleptocrat in order to throw an election. Used to be, a deepfake could be easily pulled off by poor Russian writers who spoke English as a second language. And look where we are now... The election is the only thing to save this democracy, and look: all we see is danger, precisely this danger of bad agents and their new tools of deception, on major platforms. Proper legislation would help, if only those institutions of democracy were up and running, and not under attack... My suggestion would be to have a law that requires a side panel on every political ad, regardless of media type (let's just start there). That panel would offer some kind of easily familiar plausibility account, a credibility rating, something, anything to remind the reader that information is only as good as the source. That what you are reading might be fake. Wouldn't that be great? Laws that require the integrity of reporting to be rated for its truthful content. Laws that require that sources be rated for reliability. Laws to enforce oversight. A law that says: foreign agents are outlaws if they print ANYTHING political on Facebook, ever. It is a shame, but we do need laws to protect the truth; and we, the people, and the legislators that represent us, had better get cracking at creating those laws... quickly. This story is of a Pandora's box that has already been opened.
Meg (Minneapolis)
Nearly as disturbing as the 20-somethings' apparent glee at developing these videos was the NYT reporter's failure to make these guys answer WHY they are doing this and whether they have ever thought about the ramifications. When Barstow came closest to asking, the CEO just rambled on about seeing a life coach. No follow-up questions. Barstow looked like a kid in a candy store the whole report; at the very least, be neutral and ask the glaringly obvious questions.
Lon Newman (Park Falls, WI)
There is a repeated statement in this enlightening story that these are "idealistic" young people. I didn't see any idealism on any level at all. From the CEO through the engineers, they seemed to act as techno geniuses and moral idiots. Asked any question about consideration of ethics, consequences, morality, or control, their humanoid screens went blank. "We need to be first! Woo hoo!"
Robert (Cooper City FL)
Nihilistic fakers looking to cash out early. It's a cowardly new world we've entered with chuckling techno-clowns who apparently know nothing other than a fat payday.
Hɛktər (Τροίας)
Remember those old cartoons of the big buff guy kicking sand into the face of the skinny dweeb? Well, instead of the skinny dweeb getting buff as well, he became a coder. And those guys, they seem all nice and affable... but they are not. There are some deep psychological issues there that are playing out in sociopathic ways. This is just one of those examples.
m. k. jaks (toronto)
"Are we ready for this?" Come on. You know we're not. Our democratic institutions can't move fast enough so technology will swallow us whole and we'll be all in a dither while foreign powers take over Western democracy.
fgros (NY)
There must be a mechanism to track the individual or collective enterprise(s) that use this technology, and a regulatory scheme that has authority to define and punish use of the technology for nefarious purposes. I don't discount the difficulty of imposing restrictions on the use of this technology. We have a substantial portion of our population that supports a political party that has been stripped of moral conscience and ethical standards and will do anything to retain and wield authority. Witness this party's reluctance to acknowledge the disinformation campaign operated by Russia.
Alec (Kingston)
The problem with deep fakes is that discovering them is going to get harder and harder, by design. Right now the best thing for creating images and video that don’t exist are generative adversarial networks (GANs), which are trained by pitting the generating network against another that detects fakes, and they iterate until they are both so incredibly good at their job that it is hard or impossible for a human to tell whether the result is generated or not. So the problem of distinguishing a deep fake is baked into the method of creating one.
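The adversarial setup Alec describes can be sketched in a few lines. This is a deliberately toy illustration, not a production deepfake model: the "generator" here is just a line a*z + b trying to mimic a 1-D Gaussian, and every name and hyperparameter is invented for the example. But the alternating update loop, generator versus detector iterating against each other, is the same structure a GAN uses, and it shows why the method bakes detector-evasion into generation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Toy 1-D GAN: generator g(z) = a*z + b tries to mimic samples from
# N(4, 1); discriminator D(x) = sigmoid(w*x + c) tries to tell real
# samples from generated ones. They are trained in alternation.
a, b = 1.0, 0.0          # generator parameters
w, c = 0.1, 0.0          # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    s_r, s_f = w * real + c, w * fake + c
    g_r = -(1.0 - sigmoid(s_r))          # grad of -log D(real) w.r.t. s
    g_f = sigmoid(s_f)                   # grad of -log(1 - D(fake)) w.r.t. s
    w -= lr * np.mean(g_r * real + g_f * fake)
    c -= lr * np.mean(g_r + g_f)

    # Generator step: push D(fake) toward 1, i.e. fool the detector.
    s_f = w * fake + c
    g_gen = -(1.0 - sigmoid(s_f))        # grad of -log D(fake) w.r.t. s
    a -= lr * np.mean(g_gen * w * z)
    b -= lr * np.mean(g_gen * w)

# After the adversarial loop, the generator's output distribution
# should have drifted from mean 0 toward the real data's mean of 4.
fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 10000) + b))
```

By the end, the fakes sit roughly on top of the real distribution, and the discriminator that helped train them is exactly the kind of detector they have learned to beat, which is Alec's point.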
William Perrigo (Germany (U.S. Citizen))
This brings a new era to our doorstep where mandatory video authentication should be legislated. It would be like the Bitcoin of video! Freedom of the press would be linked to a mandatory record of the data's origin.
Darren (PA)
It's all fun and games until it happens to these developers, or until it destroys the things they care about. I agree with the commenter who suggested that it would be better if they worked on technology to detect and de-weaponize deep fakes. Instead, they aim to perfect this destructive tool.
scott (california)
I don't understand why we don't make it a requirement for all political speech videos to have a digital signature that can be checked to authenticate its author. Why is video different?
W (Minneapolis, MN)
@scott The only real digital signature that exists today is Copyright Registration. You deposit a copy of the video at the U.S. Copyright Office, and receive a receipt called a Certificate of Registration. Then if someone doctors your video, you can sue them in civil court for copyright infringement. According to the methods described in this video, the Deepfake could not be copyrighted by anyone. That's because a software program cannot produce an original work, and its output is thus not statutory subject matter under the Copyright Act. A software program can do only one thing: what it's been told to do. It doesn't matter if the software program is A.I. The only thing A.I. means is that the software has been written through repetitive training exercises. It merely means that it's written in a different way from other programs. But it is incapable of originality.
Mark (Mexico)
Don’t need it. Already, you can watch someone on tv actually say before your very eyes that yes, there was a quid pro quo, get over it, and then deny he said that the very next day with a straight face — and people will believe the denial. George Orwell’s 1984 looks almost benign compared with the current state of affairs.
Richard Mays (Queens NY)
This is sick, depraved, and has no benevolent real world applications. Obviously, these young engineers are oblivious to the evil they are facilitating. They are being exploited by an oligarchy that wants to advance the agenda of “1984.” Not only can’t you fool “Mother Nature”, you shouldn’t. Combine this technology with sentient A.I. and the reason for humans, going forward, has just been made obsolete. “They know not what they do.” Now we won’t either. Sad and horrifying.
sam (ngai)
Use your talent to make evil, or to fight it; take your pick. Mind you, it will bite you later.
Ames (NYC)
I guess the humiliating fake porns of famous women, including AOC, produced to destroy them, is not a serious-enough problem? Young men, consider your children, other people, our democracy, as you open this door, laughing and telling yourselves you're doing something essential to save the world.
GariRae (California)
The issue is the sociopathic personalities of those people who create deep fakes... or those Americans who profited from producing, in 2015-2016, the conspiracies they knew would resonate with the Right. These people make LOTS of money with no compunction concerning the damage to America and Americans caused by their lucrative lies.
Mo (Toronto)
Crazy.. the same company just published this: https://www.youtube.com/watch?v=i7QNUZWS6VE A combination of audio + video deepfake. The future will be wild.
Cliff R. Loriot, PhD (Winston-Salem, NC)
I see in this the potential to fake the assassination of a political figure and then have them come back to life.
Val (California)
In 2016 there were YouTube videos of Hillary Clinton having seizures while speaking to people. The technology is already in use.
Virginia (CA)
This dovetails with the “fake news” accusations in a very disturbing way. I hope the major networks take this seriously. It would be far too easy for the Devin Nunes “CNN lies” story to end with him seeming to be vindicated by a deepfake.
Stephen Pearcy (Aiken, SC)
Won't this give cover to Trump, who could claim that every video, especially the ones with lies, is a deep fake? You know he will.
Frank (NC)
It's terrifying that false words can be put into anyone's mouth. The engineers will be very proud once this has been perfected. However, even more terrifying is the technology already perfected by Trump: spewing lies and hate, then declaring it fake news reported by liberal media to misrepresent his love of America. Pandora's box was opened long ago.
Steve (Western Massachusetts)
Why bother with elaborate faked videos? Our President tells demonstrably false lies every day and a fair number of people believe them.
Polaris (New York)
In the time when the Shakespeare plays were written, there was the term “gull” for anyone who could be easily fooled. There are a lot of gulls ready to believe anything in America. That’s how Trump got elected, and Republicans in general.
Susan (Vermont)
Creates a lot of horrible problems; solves none. Why are "good people" working on this?
Mac (Toronto)
@Susan I see where you are coming from. However, there are going to be people (good or bad) working on this regardless. Aren't you glad it is this team, sharing it with the world and building counter-technologies, rather than bad actors who are going to start sharing undisclosed false videos?
Steve Fankuchen (Oakland, CA)
In reality, people like to kid themselves that the bad guys are not just as smart as the good guys and that the bad guys are not often more motivated. Couple that with Silicon Valley's mantra, "disruption equals progress" (the marketing snake oil that sold us Donald Trump as President), and we have the perfect storm of a cover for the huge dangers of the internet-connected society. "Fake believe" at least needs people to fall for the hustlers, the lies, the "fake news," and the "alternative facts." However, the next steps, the armed drones, do not even need your cooperation to make you a victim. Nor will guns made by 3-D printers with off-the-shelf internet patterns. Lies and publicized private truths will, before long, become the least of people's problems with the internet.
Chris (SW PA)
There is no need to go to such lengths. The vast majority of humans are already under someone else's control through a kind of cult manipulation method.
Mick (los angeles)
This is the worst, of many terrible ideas, to come from tech. The project behind this video is so disturbing and will cause so much damage to what's left of our society, but equally disturbing is seeing a bunch of 20-year-old engineers giggling and cooing at their Frankenstein creations like little children. These projects should be overseen by some kind of ethics supervisors who actually consider the deep moral, ethical, and societal implications of their efforts. We already can't handle social media, which pales in comparison to the danger of deepfake technology. Of course I'm aware deepfakes are already out there on the fringes, but the sanctioned acceleration of this by big tech may be the final straw that breaks all belief in objective reality. Great job, boys!
m. k. jaks (toronto)
@Mick Totally agree with you, but China's engineers could easily reproduce the same thing and pump it at the unsuspecting American public. We've allowed the tech boys to take control of democracy. How are we going to get it back? Our democracy doesn't have the institutions or the structures and, conveniently, Reagan's mantra of "big government is bad" has played into the hands of big business and big technologies big time -- to the peril of our Western world.
Jon (Is)
It's a bunch of 20-something engineers, probably not that smart, tweaking the structure and parameters of a deep model using a toolkit they probably didn't write. What they are doing is not very deep, despite the name. Plenty of freely available toolkits are out there that are easy to tweak, so if they don't do it, someone else will; and even if they do, someone else already is.
Erich Hayner (Oakland, CA)
Thank you, from all of us who can't afford yet another pay-for-play resource for what we consider to be vital and important content. We're not asking for free Game of Thrones, you know. Isn't paying over $300 a year to the NYT enough to have this program included? I very much resent the plethora of streaming content that costs us more and more each season. I would watch commercials in order to see this, but I'm guessing that's not profitable enough.
Virginia (CA)
@Erich Hayner Were you intending to send this to the customer service inbox?
Joseph Finsterwald (Cambridge, MA)
The solution to this problem is actually pretty simple. When a video is released, it can be cryptographically hashed. The value of the hash could be embedded in metadata that is digitally included in the video file. Confirming the authenticity of a file would then be a simple operation: rehash the video and compare the result to the metadata hash. This method is used to prevent exactly the same thing in open-sourced software. It's also used in message authentication codes, to prevent a third party from intercepting a message in transit and changing its content. Simple.
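Finsterwald's hash-and-compare scheme can be sketched with Python's standard library. One caveat worth adding: a digest stored alongside the file only proves integrity if the metadata itself is trusted, since a forger who edits the video can simply recompute and replace the hash. That is why open-source projects publish digests over a separate trusted channel, or sign them. The byte strings and field names below are stand-ins invented for the example.

```python
import hashlib

def hash_video(data: bytes) -> str:
    """Return the SHA-256 digest of the raw video bytes."""
    return hashlib.sha256(data).hexdigest()

def authentic(data: bytes, metadata: dict) -> bool:
    """Re-hash the video and compare against the digest stored in metadata."""
    return hash_video(data) == metadata.get("sha256")

# Publisher: release the video with its digest embedded in metadata.
original = b"\x00\x01frame-data..."            # stand-in for real video bytes
metadata = {"sha256": hash_video(original)}

# Consumer: an untouched file checks out; any edit breaks the hash.
assert authentic(original, metadata)
assert not authentic(original + b"tampered", metadata)
```

The comparison itself is trivial; the hard part, as the caveat above suggests, is distributing the reference digest in a way an attacker cannot also rewrite.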
mary (connecticut)
Human beings are ill-equipped for this new technological revolution that created a Digital Society. Clearly, we have overestimated our society's moral, social, and emotional intelligence. We lack the much-needed skill called critical thinking: "A critical thinker is able to identify the main contention in an issue, look for evidence that supports or opposes that contention, and assess the strength of the reasoning, while a thinker may base their belief solely on faith or personal opinion." Hence most are easily led, and this has morphed into a time in which our democracy is in peril. The only resolution is an educated population, which, it is all too evident, those few who hold power over the many do not want and never will advocate for.
RamS (New York)
There's no doubt that one end point of digital technology is the complete inability to distinguish what is real from what is not. This is not some kind of a bug but rather a feature in our evolution. Human perception of reality (and I would say organismal perception of the environment) has always been selective, choosing what we see and what we don't, and then evolution literally playing judge, jury, and executioner. The use of mind-altering substances reveals how much of our mind is programmed by cultural and developmental biases. With the ability to alter reality (our environment) to help us comes this extreme: imagine the potential competitive advantages for those able to work with an effective reality (i.e., a perception of reality that works for them). Don't worry, we'll either sort ourselves out or we won't survive the upcoming bottleneck (aka the Great Filter).
Davey Boy (NJ)
The dilemma is that technology evolves rapidly, but the morals involved in how to use new technology don’t evolve at all. In the 1980s, people could never have imagined today’s technology. Yet, the ancient Greek philosophers would feel perfectly comfortable discussing morals and ethics if they could somehow be transported in time 2500 years to the present. Mankind’s inability to evolve its moral thinking is the fatal flaw . . .
Connie L (Chicago)
While there are so many ways for this technology to do damage, maybe people need a reason not to rush to judgment based on a random video or audio byte. Even 'real' video only tells part of a story. It's easy enough to use normally edited video and audio segments to sway an audience one way or another. For that matter, that's true of the written word, too. The difference is, you need to work to digest printed text, but video is processed in an instant. Don't get me wrong - this is scary - just thinking about a possible silver lining.
Jayne De Sesa (Paris)
The seven social sins include: — Knowledge without character. — Commerce without morality. — Science without humanity. Which one best fits deep fake operations?
Freestyler (Highland Park, NJ)
Hand in hand with whatever technology will or will not be able to do to undermine our democracy is the ongoing war waged by the political right on all forms of public education....especially against any form of liberal arts education.
Matt (America)
Why do you insist on advertising for these shows so many days in advance? Is there an incentive to build anticipation? I can tell you that it's very frustrating to click on it and not have it available. Save your resources and let us know about these things when they are actually available.
Topher S (St. Louis, MO)
Just wait until countries experience violent uprisings and masses of people are manipulated on a national level by these deep fakes. Less educated and less sophisticated areas are especially susceptible but not exclusively. Our intelligence and technology will be our undoing. Well, that and our pride as a species and delusional confidence in our individual ability to judge reality. We may be clever apes, but we're still apes with a form of ape brain. A brain susceptible to bias, influence, and primal reaction. Unfortunately so many refuse to accept that, instead believing we are creatures separate from our closest animal relatives and with a special form of agency. Creatures who too often put faith in their "gut" and think their flavor of magic being made them special and guides them.
Austin Ouellette (Denver, CO)
AI engineers are the dumbest of all extremely smart people. Recently leaked documents from China show that they used highly sophisticated artificial intelligence systems to identify millions of targets for internment camps. Like the plot from Captain America: Winter Soldier with concentration camps instead of heli-carriers. We need less artificial intelligence, and more human thoughtfulness.
scott (california)
@Austin Ouellette if you pay attention, engineers have been warning and finding ways to stop this for at least the last decade. But the public never seems to "get it" until they experience it.
Mac Nicol (Toronto, ON)
Hello, I work at Dessa, the company featured in this edition of The Weekly. The team is not taking this technology lightly; they invested time into building a deepfake detector, producing record-breaking results. You can check it out here: https://www.dessa.com/post/deepfake-detection-that-actually-works
Jena (NC)
@Mac Nicol Watching this episode right now, and apparently you are not listening to what people are complaining about: in a world that is becoming very illiberal, you are complicating things for the side of democracy, not the authoritarians.
Sad in Missouri (Chesterfield, MO)
@Mac Nicol Do reputable news agencies, Facebook, YouTube, Fox "News," etc. use your tool for vetting videos? If not, your tool is for all intents and purposes useless. What is your update cycle? If it is like most current software, it will need daily updates because it is never really ready for prime time. Newer deepfakes will bypass your outdated tool. Here is a scenario for you: a fake video is made of some despot claiming the time has come to right historic wrongs. The video is widely broadcast. The leader of the targeted nation asks his tech-savvy 21-year-old son to use your tool to determine whether the video is real. The son, who has no tech knowledge other than texting and playing video games, does not know how to use the tool but nonetheless responds yes, the video is real. Emotions take over, war ensues, and thousands of people die. Your tool has no control over ignorance or emotion. If your only product were the detection tool, you would have credibility. The fact that you all appeared on TV joking, smirking, and laughing about deepfakes tells a very different story.
Kevin (Broomall Pa)
I think this is not great. Why post a training video for lies?
Afp (Cleveland)
Maybe block chain could be used to authenticate the source of a material.
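One way to read the "blockchain" suggestion is a minimal hash chain: each published record carries the hash of the previous one, so editing any past entry breaks every later link and is immediately detectable. The sketch below is a toy illustration under that assumption; the records and field names are invented for the example, and a real provenance system would add signatures and distribution.

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    chain.append({"record": record, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def chain_valid(chain):
    """Re-derive every hash; any edited block breaks the links."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"record": block["record"], "prev": block["prev"]},
                          sort_keys=True)
        if block["prev"] != prev:
            return False
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"video": "speech.mp4", "sha256": "abc123", "source": "Campaign HQ"})
add_block(chain, {"video": "debate.mp4", "sha256": "def456", "source": "Network pool"})
assert chain_valid(chain)

chain[0]["record"]["source"] = "Unknown"   # tamper with history
assert not chain_valid(chain)
```

Tamper-evidence is the easy part; as other commenters note, the open question is who maintains the chain and why viewers would trust it.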
Kevin Banker (Red Bank, NJ)
I love how the dog seems surprised by the actor's fake announcement. Maybe we can train dogs to detect deepfake videos.
Pray for Help (Connect to the Light)
What do Russian disinformation campaigns look like, and how can we protect our elections? [Brookings]
--The Russian goal, using information warfare, is to create a society that confuses fact and fiction.
--The digital tools: bots, trolls, micro-targeted disinformation.
--Old strategies with new digital tools: more disinformation; fake websites working together as a network; fake personalities using Twitter, Facebook, Google, YouTube, etc. We're not paying enough attention to algorithmic manipulation.
--More frightening is the use of artificial intelligence to enhance the tools of political warfare, where AI-driven attacks are harder to detect.
--AI-driven disinformation will better target specific audiences and will predict/manipulate human responses; soon, we won't be able to tell the difference between automated and human entities.
--Deepfake video/audio that appears convincingly real is being used to mislead and deceive us. Debunking this content will be like playing whack-a-mole.
--We can inoculate the US against political warfare, disinformation, and cyber-attacks.
--Step one: develop a strategy to deter political warfare. Currently we don't have one, because we dissolved the capabilities we had during the Cold War.
--Step two: be more critical consumers of information, recognizing that the information we consume is not neutral but often manipulated by malicious actors. As citizens, we have a responsibility to be more discerning and aware.
RamS (New York)
@Pray for Help Yeah, but there is a relatively simple technological solution which I'm sure will be ignored by the powers that be and also misused by end users, like (weak) passwords. Deep fakes require digitisation; of course, we can take something digital and pipe it into an analog signal, but the answer is routine encryption at the source and end-user levels and working only with trusted sources. For instance, let's say I decide the NY Times is a trusted source; then we use public key encryption to ensure that everything I watch comes from the NY Times. Someone can't make a deep fake and say it comes from the NY Times without the encryption. It may lead to balkanisation depending on how we trust sources, but it'll root out the deep fakes problem. It could create a market for information brokers (i.e., glorified editors). But if we were savvy about how we used encryption, we could solve the deep fake problem as well as the general social media problem of lack of editorial content. Public key encryption will ensure that the end user is anonymous/not tracked. But if the media companies choose to make it non-public-key, or if users don't learn to trust sources with discernment, it'll be like the situation we have with passwords. No way around the fact that education is the answer to a lot of the world's problems.
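The trusted-source idea sketched by RamS (and echoed by other commenters asking for signed political video) boils down to: verify that a clip really came from the outlet it claims. A real deployment would use asymmetric signatures, where the outlet signs with a private key and anyone verifies with its published public key (Ed25519 is a common choice). The toy below substitutes an HMAC with a hypothetical shared key purely because it ships with Python's standard library; the key name and byte strings are invented for the example.

```python
import hashlib
import hmac

# Hypothetical shared signing key. In a real system this would be the
# outlet's asymmetric private key, never shared; verifiers would hold
# only the corresponding public key.
NEWSROOM_KEY = b"hypothetical-newsroom-signing-key"

def sign(video: bytes) -> str:
    """Produce an authentication tag the outlet publishes with the clip."""
    return hmac.new(NEWSROOM_KEY, video, hashlib.sha256).hexdigest()

def came_from_source(video: bytes, tag: str) -> bool:
    """Recompute the tag; constant-time compare against the published one."""
    return hmac.compare_digest(sign(video), tag)

clip = b"raw video bytes..."
tag = sign(clip)

assert came_from_source(clip, tag)                    # genuine clip verifies
assert not came_from_source(clip + b"deepfake", tag)  # altered clip fails
```

A deepfake could still exist, but it could not carry a valid tag from the claimed source, which is the property RamS is after.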
Steve Fankuchen (Oakland, CA)
Why the surprise? Oh, I forgot. People like to kid themselves that the bad guys are not just as smart as the good guys and not often more motivated. And Silicon Valley's mantra, "disruption equals progress," marketing snake oil that sold us Trump as President, serves as a cover for the huge dangers of the internet-connected society. "Fake believe" at least needs people to fall for that stuff. Armed drones, on the other hand, do not even need your cooperation to make you a victim. Nor will guns made by 3-D printers with off-the-shelf internet patterns.
T. Rivers (Thong Lo, Krungteph)
Every video, every statement needs to be digitally signed with a private key to PROVE who created it, that they attest to the people purported to be in it, and to the veracity of the statements therein. That video or ad cannot be aired or published until the key is verified. If there were consequences for people lying day in and day out, we wouldn't be suffering fools like Trump, Nunes, Jordan, Meadows, McConnell, Pompeo and the like.
VJ (Los Angeles)
Sad to see human intelligence being wasted on the development of deepfake technology. I feel that it is crossing the boundary of human ethics. It can only do us more harm than good for generations to come. I wish those engineers would invest their precious resources in the betterment of this planet.
Markus (Jasper, WY)
"AI" is a misnomer. It is a hot buzzword these days, but nonetheless "artificial intelligence" does not exist. Highly complex computer programming (by humans) perhaps, but "AI", nope.
StuartM (-)
It's a well worn old chestnut by now for sure, but as the man said: they were so preoccupied with whether or not they could, they didn't stop to think if they should. We're doomed. One way or another, from the sky, from a spark, from the sea, from our screens; and we'll even be able to say we told ourselves so.
José R Nevrón (Scarsdale, NY)
This is scary, and they think it is funny. These people only care about how much they can fool others, without thinking of the consequences. This video reminds me of the Cambridge Analytica Brexit/Trump scandal. These young people remind me of Brittany Kaiser. She only cared about proving how smart she was, as well.
Matt (Mexico)
@José R Nevrón It is clearly an awareness piece. Here is their work on detectors: https://www.dessa.com/post/deepfake-detection-that-actually-works And referenced in the NY times today: https://www.nytimes.com/2019/11/24/technology/tech-companies-deepfakes.html
Robb Lovell (Vancouver, Canada)
@Matt Working on both detection and production doesn't absolve you. Unethical behavior matched with ethical behavior doesn't make it ethical. A right does not correct a wrong. You still need to avoid doing wrong.
Ronn (Seoul)
If the trust placed in the institutions of government and news is destroyed, then there can be no democracy; there can only be space for the autocrat who can "fix things" and lead with their self-imposed version of reality. This "deepfake" is Cassandra singing in a coal mine like a canary. It will make recent mis-events pale in comparison. The only questions are when and who will do this, and what can be done to neutralize it.
Ed (Colorado)
"Do they risk introducing a tool that can forever be used to cloud the truth?" We've already got plenty of tools that are used to cloud the truth. One is called writing. Another is called speaking. And three that amplify the first two are called Twitter, Facebook, and YouTube.
James (CA)
We need an authentication banner on all video that purports to be news, and all deepfakes should be required to provide a link in the banner to their source before being allowed to broadcast on any platform.
Peter (CT)
@James Hey, I’ve got an idea: fake authentication banners!
E. Smith (NYC)
"You can fool all of the people some of the time and some of the people all of the time, but you can't fool all of the people all of the time." Hopefully, these words are still true.
TimMcG (Virginia)
It is disturbing to me to see this bunch of tech-savvy youngsters giggling about something with such a seriously high potential for abuse.
Jet Phillips (Northern California)
A bunch of guys. It’s always guys. Guess what, there’s a whole world of us that really don’t care what the men are doing.
Topher S (St. Louis, MO)
And do you think that women in similar positions can't or aren't doing something similar? Are you still hanging on to the myth that with women in control, abuses don't happen?
Jordan (Minneapolis)
More likely than a million people being fooled by a video is that deepfakes' existence will be used to sow doubt in reality. Politicians will say something insane and then say it was a fake.
Eric (Texas)
Deep fakes are not the real problem. Technology can solve the problem of identifying a deep fake with the development of media capture devices which encrypt and authenticate. Media will become 'authenticated' or 'not authenticated'. The real problem is people wanting to believe something which they know or should suspect is not true. Trump has told thousands of lies and yet millions either don't care he is lying or 'believe' he is telling the truth. The pizzagate conspiracy theory that Hillary was running a human trafficking and child sex ring from the basement of a pizza parlor was utterly unbelievable and yet..
Topher S (St. Louis, MO)
Look at India, where rumors -- words in a Facebook post -- have reputedly led to riots and murder. Now imagine deepfakes working at a national level, especially in less sophisticated areas of the globe. Imagine the disruption and chaos they could unleash, first on a national level then globally. That's not to say places like the US aren't ripe for such manipulation. As David Brooks pointed out on PBS, there's a huge segment of the population who agree Trump did the things of which he accused, but they still support him because he's sticking it to people they hate. Many don't hide that they're itching to engage in open conflict. Deepfakes are a perfect excuse.
RamS (New York)
@Topher S Huh? You're saying significant portions of the populace (I'm talking more than 1%) are interested in open physical warfare with their family, friends, and neighbours? I'm cynical about the US but I doubt it. Don't buy the media narrative. They're selling eyeballs. Talk to people who have opposite views. Ask them if they're willing to pick up a gun to kill you, attack you physically, etc. etc. But if it does happen, it'll be red states vs. blue states. It won't be on an individual level any more than it is already.
Robb Lovell (Vancouver, Canada)
@RamS You left out the players that are actively trying to disrupt the civil structures of democratic government. Sowing dissent to destroy democracy is a goal in and of itself for mafias, cults, criminals, and authoritarian governments that profit off the disorganization of the structures that work against them. Yes, the media is competing for eyeballs to get money directly, but others are competing for eyeballs to create the environment in which they can make money and control people. War, profiteering, and disruption are the point.
Peter Aretin (Boulder, Colorado)
You need to be a magician to know how the tricks are done.
Darrie (Nyc)
Instead of using their brains on fake videos, why don't they focus their energy on more meaningful solutions to problems in the world? Don't AI engineers take a class on social responsibility, or do some community service, during college? It might divert their attention from wasting time on meaningless fake videos!
stevevelo (Milwaukee, WI)
Ummmmm... “undermine your faith in what you see and hear”. Sorry to be the bearer of bad tidings, but that happened a looooong time ago.
Carl (Philadelphia)
You can avoid having a problem with a deep fake by NOT getting your news from Facebook!
CathyK (Oregon)
Nothing is safe and nothing is real which is why I want to be paid for any advertisement I am forced to see or any snippets of articles that pop onto my internet feed.
the doctor (allentown, pa)
Watching these bright kids laugh and giggle over the perfection of a technology that would enable bad actors and sundry extremists to create the reality of their choice on video was depressing. It was as if these kids were gushing about scoring a date with a hot girl: a total absence of introspection. Scary.
John Swift (Portland. Oregon)
If I were a politician, I would simply control all videos of myself by posting the original, verified by me, on my approved website. That way, any faked video that surfaces can be compared to my original copy. Sure, fakes will appear, but mine will be the real footage of the event. It's the same idea as when a politician runs an ad and says, "I approved this ad." Lies and fakes will always be around, like cockroaches.
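[The comparison this commenter describes can be done with an ordinary cryptographic hash: the campaign publishes a fingerprint of the original footage, and anyone can recompute it over their own copy. A minimal sketch; the filename is hypothetical.]

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 fingerprint of a video file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# The campaign publishes fingerprint("speech_2019-11-22.mp4") next to the
# footage. Any doctored or re-encoded copy produces a different digest, so
# a mismatch flags the copy as not the verified original.
```

[Note the limitation: this proves a copy matches the posted original, but a legitimately re-encoded clip will also mismatch, so it distinguishes "bit-identical" from "anything else," not "real" from "fake."]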
Virginia (CA)
@John Swift The hypothetical politician could take the video, doctor it, and post that as the “real” one. Then if anyone had seen the speech and raised an objection, or had unmodified video, they would seem like the fakers. If the politician is the one who wants two, three or more faces, the game goes up a level. It’s no longer about “sympathetic” versus “unsympathetic” news outlets at that point. We got used to video being “real.” Oh well.
Susan Anderson (Boston)
Have a quick look at this video, which shows how easy these fakes are. Takes no time and trouble, it's all laid out for you. https://www.youtube.com/watch?v=C8AxAvh3-ck [I hesitated to mention this is Samantha Bee: I hope if you don't like her you'll look anyway. It's a staggering demonstration!]
Logan (Ohio)
This is really not so new. I created a deep fake nearly five years ago. Alas, no one was convinced. "The Prez vs. Screecher" https://www.youtube.com/watch?v=HmIyKSm5hg4
PHR (Williamsburg, VA)
And so, why is Andrew Yang all but ignored by the media?
A (Reader)
Because he suggests we get into an arms race on AI with China. He literally said that in the last debate. He wants our Defense Department to hire huge groups of people like these men, making sure they understand and develop every possible way to deceive and 1984 the world, and in doing so still unleash all the things we can’t stand, just to be the first to invent it. It’s completely tone deaf: we FIRST need a national conversation on regulating all of this. We haven’t as a nation passed much of anything on bioethics - cloning, for example. The American public wants regulation and national ethics conversations BEFORE rushing into FUNDING it so we can take it to the nth degree more quickly - that’s not what we want first. We want a special task force at the presidential level that figures out where we stand as a nation and passes policy regulation aligned with our ethics as a nation. We are ten years behind Europe on that.
Tom J (Berwyn, IL)
Once again a group of educated, cocky young white men play around with technology that will eventually make life more miserable for most of the planet, but they will be rich.
KR (Rochester NY)
It’s always males who seem to check their morals and stewardship for humanity at the door in pursuit of doing something new.
Gil (Toronto)
I subscribe to the NYT. Why can't I watch the video here?
Marc (Colorado)
The NY Times is inexplicably avoiding the responsible outrage and social indictment due these nefarious digital rodents. Their product is designed to do one thing and one thing only: deliberately commit fraud. This manufacturer is designing a product to commit, and to assist others in the commission of, crime. And they are blatantly advertising it as such. I would cheerfully see these grinning jackals locked away in remote silence. They are no less guilty of the resulting conspiracy, fraud or treason than the dealer with a pocket full of heroin who hasn't made his first sale of the day is guilty of possession with intent to distribute ... than a flea market gun hawker shouting "murder tools for sale" is complicit in the resulting homicide. The overall threat to civil society that all anonymous media has become is an issue I've been waiting for the NY Times to engage.
Michael (Brooklyn)
They should work on how the public and the media can detect and tag as false an ultrarealistic fake video.
Economist (Boulder, CO)
Video chutzpah and politics. What a combo. Throw in a little sex, mix it up with some drugs and we’re right back in Hollywood where it all started.
Susan Orlins (Washington DC)
Mainstream media must be on the alert.
Moe (Def)
The sheep already believe most anything Fake News publishes, and most anything politicians tell them through pearly white, lying, orthodontically corrected teeth with a charismatic smile. This technology will make the book “Animal Farm” come true with a vengeance! Or just stop watching TV and only use the internet for banking... if that?
dove (kingston n.j.)
If we're reading about "deepfakes", it's already too late. Some enterprising mind, however, will invent a deepfake detector. Whatever it is that makes the deepfake concept work will have something about it that can be used to expose the fake. And so on... and so on.
Miles Coltrane (New York City)
First we need to better educate people to be critical thinkers. I’m not naïve; this is not likely the solution. We also then need a validation/attribution process for media companies and platforms. I.e.: if a video does not have a verified check mark, then let the viewer beware. Advertising, editorial, commentary, and opinion content should all carry a clear attribution to their originating source. This needs to be as easy to read as a nutrition label.
Susan Anderson (Boston)
@Miles Coltrane These deep fakes are proof against fact checking. They absolutely look real.
papageno (washington DC)
don't they ever stop and think "wow maybe this is a bad idea"?
Anne Penglase (New York)
The terror of technology. Could reality extinction wipe out humanity before a physical extinction? 😨
Kevin Brock (Waynesville, NC)
And I remember the deepfake April Fool’s stories on NPR. Like the one about Starbucks planning a coffee pipeline from Seattle to the east coast.
Adams Wofford (Durham, NC)
It must be outlawed. We will believe fakes and we will come to disbelieve things filmed in reality that the subject finds embarrassing. People will not know what to believe. Democracy would not function. There are those who desire this outcome.
Dr. Sam (Dallas Texas)
In a democracy of lemmings - this will be a game changer for the all media. The people will need to verify their own facts and stand for their own beliefs for a change...
Barry (Stone Mountain)
Extremely frightening. Attempts to optimize deepfake detectors can help for a while. However, since all videos are now digital, it is just a matter of time before the fakes are not just indistinguishable from the real, but are identical to real. Digitally identical. Except for the fact that what is portrayed never really happened. I hope this scares you as much as it scares me. We will all be unplugging sometime ahead.
T Mo (Florida)
Truth will become more elusive and therefore that much more valuable.
Dan (Stowe, VT)
The Democrats will look at deepfakes as something that needs to be regulated and stopped. The Republicans will see deepfakes as leverage to confuse and manipulate voters to their will. I have zero doubt that Republicans are already investing in this for the 2020 presidential election. And the Democrats will cry outrage and unfairness while millions of low-information voters will already have been duped.
LarleyF (Colorado)
Forward motion is inevitable. I find it understandable that these young men are driven to create. It’s quite unsettling, however, that they seem oblivious to the villainous side of things. Not a mention. They actually don’t seem mature enough to understand consequences. Sigh. Hold on to your hat.
Aurora (Vermont)
What does this technology bring to the table that's positive? Because every possible use I can think of for this technology is negative. You could literally frame people for a felony. It's bad enough that we have a president who has created, with the help of other Republicans and Fox news, a fake reality in America. Now drop this technology into that mix and what will we render? I'm not even going to say it.
Aurora (Vermont)
@D - That's not a positive. Especially in light of Russian interference in our 2016 election.
poslug (Cambridge)
Given how many are taken in by blatant fakes (Nigerian princes with free money or a grandson needing bucks in the EU), you have to wonder why the effort unless it is courting nefarious big money (not Nigerian princes in this case but still very bad). It would be helpful if the telephone carriers did more to block those endless fake calls many of which are not the hard to block international ones.
McGloin (Brooklyn)
The Party of Trump says "truth is not truth," "truth is whatever we say it is," and loves President Pathological Liar. Every morning they wake up and learn the new lies from Fox News and repeat them all day. They like being in on the lies and in on the corruption. They don't even need "deep fakes" to "believe" what they want to believe. They believe that Hillary ran a child sex ring out of a pizzeria. Technology like this in the hands of people who think truth is merely an illusion to be manipulated, led by a president who regularly calls for political violence against critics and news organizations, without mention of due process, is truly terrifying. Democrats believe in reason, logic, math, and science. They entertain the opinions of the Right and check them against the numbers. They wait until they have definitive evidence before pushing their views, and even then, many refuse to even say the obvious. Medicare for All, for example, has incredible amounts of evidence from a dozen countries that get better care for half the cost. The evidence says that M4A costs a little more than corporate insurance in the first decade and half as much in every decade after. The Right's health plan is to end the ACA and replace it with some secret plan that doesn't actually exist. The Right base is not going to change. The Left base is not going to change. Moderates have to pick a side. I would suggest the side that believes in truth, logic, science, peace, and investing in America.
Joseph Wildey (New Hampshire)
Why are there no women featured in this video? This lack of diversity may explain why this technology, to date, has often been used to create videos featuring women in nonconsensual adult film scenarios.
VW (Paris, France)
Only men in this video...all so dedicated to "create humans" (non biological ones). A coincidence? I don't think so.
Moondance (NYT)
A world where one cannot tell fact from fiction? Without the currency of truth we are lost as a species. Science is being challenged not in exploration of new information that leads to new truths; it is being exploited to undermine our existence. Climate change is "fake" because religious fanatics believe the world comes with an expiration date: let’s use it up ‘till there is no more. Good people naively believe the world will always be good. Bad people will always use that thinking to exploit and imperil the world. How do we communicate if there are no agreed-upon facts? We have lost our way. Will we ever find our way back? This seems like the end of civilization. When behemoths like Facebook have meetings with this administration in secret, it confirms that Zuckerberg was purposeful about permitting fake political ads, not just for money but to move his personal and political agenda. He has exploited the fake news. When we permit a foreigner, Murdoch, to run most news outlets, print and TV, with fake news, we have permitted him to drive the agenda. People who are empty yearn for power and wealth at all cost. You can try to fill that emptiness with money and power over others, but the emptiness will never be filled. Only human contact, human joy, can fill such emptiness. McConnell, the most powerful man in the USA, must be proud his state is #50 in education. Ignorance is valued in Washington.
Andre (Germany)
Let that sink in for a moment: With the emergence of this technology, bad actors, criminals and conspiracy peddlers will be in a position to plausibly deny any video proof as "deep fake" and get away with it (at least in the opinion of supporters). An actual deep fake doesn't even need to be involved! Truth is no more! It could be the beginning of the end of civilization as we know it. There are dark times ahead. I fear for the future of my children.
Charles Sager (Ottawa, Canada)
The maniacal geniuses behind this work would find themselves billionaires in Russia. That they are doing this work within the United States of America should be enough to find them behind bars. The work they do seems to entertain them and might entertain the masses too were it not for the fact that the work is intended to deceive the masses. The resultant misinformation is both dangerous and subversive. Civil libertarians would scream that, of course, the creators of this content have a right to produce whatever they want to in a free society. But would any society that is willfully manipulated in such a way remain free for long? I can’t imagine a scenario where it would.
Philip Brown (Australia)
When you cannot believe your own eyes and ears unless you are touching the person or object, you are well onto the path to madness. The saying "because something can be done does not mean that it should be done" applies perfectly to this. In a world of ignorant and credulous people, a 90% believable fake would be sufficient, let alone the 100% that these naive young men are seeking. You might believe that you can avoid "fake imagery" by staying off social media, but this technology will be front and centre on FOX the day after it goes "rogue". Done well enough, even real news channels will buy it. All technological "swords" are double-edged, but this has far greater potential for evil than for good.
mwells (Philadelphia)
The people who are developing these tools are amoral at best. These engineers know that their tools will cause great damage to democratic culture and they just don't care.
E. Smith (NYC)
Correct. This is a prime example of intelligence without ethics or forethought.
Rebecca (NJ)
Please make full versions available to teachers. I would love to use this to educate my students.
New Milford (New Milford, CT)
There is no way to stop this. This is the why vs. the why not. Does anyone see why this would be beneficial to society? I don't. We need to ask why.
Misterbianco (Pennsylvania)
This reflects on the advice they gave us in the Navy fifty years ago: “Don’t believe anything you hear, and half of what you see.” That was intended to limit spread of misinformation; today’s technologies are designed to advance it.
DeepSix (Jupiter, Florida)
At a point in time there was no technology for nuclear weapon development. Once the technology was acquired, we built the bomb. Looking back, should we have ignored the technology, knowing its effect on civilization? The young engineers are working at the cusp of new technology and AI. There is no stopping it now.
Steve (SW Mich)
Over 15 years ago I worked in a training branch of a defense agency. I was most involved with course design, and we had IT types who worked their magic to craft digital versions of these courses. Then, it was Adobe Premiere and Photoshop, which were pretty powerful, but still there were limits. Now we are at a point where "if you can dream it, it can be built". I hope we get to a point, though, where everyone sees these digital creations as just that... creations; knowing what they are seeing is nothing more than a bunch of pixels and sound files, masterfully put together by people with powerful software.
Sebastian (Copenhagen)
These young engineers are completely oblivious to the moral and ethical implications of their actions, which is in itself frightening and troubling. However, while the ramifications could be disastrous for democracy and public discourse, a secondary positive effect might arise from the cinders. In a future where deep fakes are prevalent, news media and governments would have to be held to a much higher degree of scrutiny when producing, delivering and disseminating information. One could imagine a technology capable of cryptographically protecting the source file from being tampered with, or some other tech-savvy means of certification. It might just be that in this future, producing a provably truthful video will be more expensive than a deep-fake one. In a weird way this might just be the solution to the rampant issue of fake media and disingenuous news outlets online.
Chris (CT)
I'd say this is possibly the beginning of the end of the internet, but I also think its end has already begun. When media and "the media" can be manipulated so convincingly to alter the truth we perceive with our senses, it is time to look elsewhere.
Ilya Shlyakhter (Cambridge, MA)
Digital cameras could digitally sign the pictures they take (both content and metadata like time/location), so that any alterations could be quickly detected. The technology for this has long existed; why isn't it being used?
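[The mechanism this commenter proposes — sign the content together with its metadata at capture time, so any later alteration is detectable — can be sketched in a few lines. A real camera would use a per-device private key and an asymmetric signature (the approach the Content Authenticity Initiative has discussed); the HMAC below is only a standard-library stand-in, and the key and metadata fields are illustrative.]

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret, standing in for a camera's private key.
DEVICE_KEY = b"per-device secret baked in at manufacture"

def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    """Sign the image content together with metadata (time, location, etc.)."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    """Return True only if neither the pixels nor the metadata were altered."""
    expected = sign_capture(image_bytes, metadata)
    return hmac.compare_digest(expected, signature)
```

[Changing a single pixel or a single metadata field invalidates the signature. The hard parts in practice are key management and tamper-resistant hardware, which is one plausible answer to the commenter's question of why this isn't yet widespread.]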
Richard (Massachusetts)
This is great! Things are about to get more exciting and interesting! I can see the machine intelligence technology having to get better to detect deepfakes. An arms race! And don't forget, the targets of deepfakes can also create their own counterfakes and direct those at their aggressors. Ultimately, it will force into existence new ways of authenticating videos and recordings, and new markets as well. The new laws that will eventually spring up to regulate this area of technology will be insufficient from their very inception, but technological countermeasures will compensate.
Alexia (RI)
How is this different from making movies, really? The fact that anyone has the power to publish is the problem; larger platforms and government will have to come to an agreement. Awareness and education will be important.
Richard Fried (Boston)
This is why we need a new agency staffed with brilliant thinkers. For every new technology we need new ideas. For example: we did not need stop signs and traffic lights when horses were our transportation system. A horse will not run into a wall; a car will. Our home electrical system is safe because we follow well-thought-out rules. Sometimes we ban a technology and sometimes we construct rules for its use. We must engage people who are qualified to work on these problems.
Jen (Kansas)
This could potentially infiltrate so many places. These videos will eventually be so realistic that no one will be able to tell if they’re real or fake. They won’t be used just on Facebook or Twitter. This can change everyone’s view of video, and people will stop believing what they see. The impact will be far reaching. It will affect daily information from the news, education, politics and even criminal investigations. It’s frightening.
denise (SF,NM)
If you have friends who voted for Trump, you may find the majority are active on Facebook. I am not really sure this hasn’t already been done. It’s the only thing that could explain their view of an alternate political reality.
PT (Melbourne, FL)
Genies rarely go back into the bottle. That is a lesson we must learn over and over, and yet we make false steps. The fact is that the march of technology is unstoppable. There is too much to gain by being the first to do something new; an army of researchers and tinkerers are at this, and count me in. And another important lesson is that no secret can be contained forever. Deepfakes will happen/are happening, and will become indistinguishable from reality. If Hollywood can raise the dead to perform in film, it is coming to a hacker near you soon. Of course, there are also teams working hard to reliably detect deepfakes. These exploit quirks of only the known techniques for creating them, and cannot foresee all avenues. This is analogous to detecting computer viruses (or indeed all forms of life)... there is no master algorithm that must be employed that we can search for. I thus predict this will become a new weapon, and as it evolves, we will remain a step or two behind, just as we are not able to control the spread of nuclear weapons effectively. I won't even go into artificial general intelligence - which some (credible voices) believe will be our final undoing.
Jerry (NYC)
The only antidote to Deep Fake technology is 100% full time centralized surveillance that can authenticate requests for intel, and can verify the whereabouts of anyone and substantiate the actual situations at any time for anyone to any properly authenticated requests.
Dick Winant (San Carlos, CA)
I agree with the other comments that the technology is a frightful prospect, a legitimate concern. However, I thought the video was poorly done, sloppy. I watched it twice and learned nothing from it.
Eric (Texas)
There is a technological solution to this problem that does not involve an AI arms race. Right now we use the internet to conduct secure banking transactions. This involves encryption and authentication. The same principles will allow audio and video to be trusted. Suppose that NBC was 100% trusted. If we then went to the NBC web site, we could be confident we were viewing 100% trusted information. Trusted cameras and audio capture devices will become computers which securely encrypt and authenticate what they capture. Suppose YouTube adopted an encryption and authentication methodology that only allowed videos to be posted that were captured with trusted video capture devices providing encryption and authentication. Problem solved. I am not aware of any video capture equipment in use today that provides for encryption and authentication, but I would be very surprised if it is not now being rapidly developed. At some point YouTube will display a message when you view a video: "this video was made with a trusted device," or not.
betty durso (philly area)
Facebook is too big and has too much influence over how elections turn out, how we consume products, and how our identities are sold in the marketplace. I agree with Elizabeth Warren that they urgently need to be reined in. So many regulations necessary to our wellbeing have gone by the wayside due to the stranglehold big corporations have on our government. We have over time lost the right to clean air, clean water, and healthy food. Now comes the loss of our right to be forgotten by these greedy algorithms, opening the door to scams and outright thievery. It's time, America, to fight back against ostensibly too-big-to-fail corporations who have our democracy in their grip and don't have our welfare in their plans.
Eric S (Vancouver WA)
It would be impossible to prevent the development of this fake technology now; the cat is out of the bag. We might hope to be able to control it, but that will be a challenge. It is good that through articles like this, we are becoming aware of its existence, so that at least we can consider that very convincing evidence of a person or situation may be false. Pandora's box is already open.
mouseone (Portland Maine)
Stay off social media for anything but keeping in touch socially with friends and family. Don't get news or information from social media. It might be easier to use social media to get news, but it has the danger of creating a world that doesn't even exist that informs our decisions about the real world. Just say no.
David Evans (Vermont)
Sadly, we've already lost the battle against disinformation campaigns. Silicon Valley and the Swamp in Washington have been unwilling to make even the most basic efforts to save us, besides banning a few of the worst offenders, who simply go to another social network to promote their views. With deepfakes, maybe we have a chance. Halsey Minor, who founded CNET, has announced the VideoCoin Network, a decentralized video encoding, storage, and content distribution system built on the blockchain. While his goal appears to be to undercut Amazon video services pricing by using idle datacenter servers, the most interesting aspect of the technology to me is how he talks about embedding video on the blockchain. What I take that to mean is that the blockchain will preserve the “chain of evidence” from video sources like our phones and cameras, all the way through to the video player on your device. Imagine a suite of services that enables one to view what they want online without creating a self-serving misinformation-based echo chamber. Services like SocialFixer and AdBlock are at the forefront, but progress is slow; there is simply too much money to be made by the publishers and producers. I'd gladly pay so that I don't have to see misleading ads, listen to liars, or enrich the trolls. 1993-era web developer me would be ashamed, but that’s the price of progress.
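[The “chain of evidence” idea this commenter describes can be illustrated with a toy hash chain. This is not VideoCoin's actual design, just a sketch of the principle: each video segment's hash commits to everything before it, so altering, dropping, or reordering any segment invalidates every later link.]

```python
import hashlib

def chain_segments(segments):
    """Hash-chain a list of video segments (byte strings).

    Each link is SHA-256(previous_link + segment), so every link
    commits to all prior content.
    """
    prev = b"\x00" * 32  # fixed genesis value
    links = []
    for seg in segments:
        prev = hashlib.sha256(prev + seg).digest()
        links.append(prev)
    return links

def verify_chain(segments, published_links):
    """Recompute the chain and compare it to the published links."""
    return chain_segments(segments) == published_links
```

[A blockchain adds to this the property that the published links themselves are hard to rewrite after the fact; the chaining alone only detects tampering relative to a trusted copy of the links.]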
mlbex (California)
Will the age of video disclosure come to an abrupt end when the ability to create deep fakes becomes widespread? We've all seen how videos from ubiquitous cell phones, or body cams on police officers, have highlighted behavior that used to be hidden. How long can we trust those videos? When will the availability of this technology change that calculus so that no one will believe a video of someone pitching a fit in public, or of police officers beating an innocent person? On the lighter side, how about videos of a cat making friends with a rat, or a turtle and a hippo helping each other after a flood? We might look back on the past decade as a brief moment when recording technology was widespread but the availability of deep fakes was not, and when you could trust the video content that you saw online.
Anthony May (San Francisco)
With friends like these developing the tools that have little to no use other than to lie to people as convincingly as possible, who needs actual enemies of the state pulling our strings?
barbara (nyc)
It is not amusing. Is it almost worth stepping out of technology when it only serves itself?
Jerry Bruns (Camarillo, Ca)
What we fear and believe we create. AI evolves per the intentions of developers.
APatriot (USA)
Given the desire of political entities and governments to control what we believe, this is troubling.
Wendy K (Brick, New Jersey)
While this possibility must be exciting for the developers, it should scare everyone. It is truly a Pandora’s Box!
jrinsc (South Carolina)
Just imagine what Vladimir Putin and his Internet Research Agency will do with these ultra-realistic deep fakes in the future. Suddenly a video will emerge with Hunter Biden giving wads of cash to Vice President Biden and joking about how they stopped a corruption investigation, or President Trump dismissing any audio or video evidence of his wrongdoing as a deep fake. When listening to these young, idealistic, and utterly naive engineers, I'm reminded of J. Robert Oppenheimer's infamous quote, "When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb." And deep fakes are a way to potentially destroy democracy throughout the world if we're not very, very careful.
Brendan (Ireland)
@jrinsc A fake video that fooled millions - already been done. Remember the staged "chemical attack" video from Syria? US democracy is already under attack - and not just from Russia.
alecto (montreal)
You can bet the farm that Russia is already working on this.
Sean edwards (Charlottesville, VA)
@jrinsc it’s already started. See https://interestingengineering.com/deepfaked-voice-of-ceo-used-to-steal-almost-250000-from-company for example of successful use of fake audio to steal serious cash. Next we can expect people to manipulate the stock market. And ultimately these new weapons will supercharge the disinformation attacks our adversaries are pursuing to undermine democracy, as you correctly point out.
APH (Here)
Not if you don't watch it: one of the many reasons I avoid virtually any moving images and spend my time experiencing actual rather than simulated life. Highly recommended.
Joan (Florida)
Watch and listen, critically. If you're a voter, simply reading a campaign website or even left, middle and right news isn't enough to help you become an informed voter. Facial expressions and body language give you lots of information about candidates. It takes time, but voting is a right that shouldn't be taken flippantly.
Don (TX)
This is deeply troubling. Characterizing it as a "Pandora's Box" made me think of the Atomic bomb, for some reason. I'd be a lot more impressed if these guys worked on methods to discover a "deep fake".
Carl (Philadelphia)
One way to avoid a deep fake is to not get your news from Facebook. I don’t subscribe to Facebook and don’t understand people who use this service as their primary source for news.
sharon (worcester county, ma)
@Carl And when televised political ads are "deep-faked", then what? How do we stop this from hitting main stream media when it's next to impossible to determine reality from seemingly real propaganda? This is a dangerous technology. What purpose does it serve other than to misinform?
Darrie (Nyc)
@Carl Apart from Facebook, there are other platforms people go to as well to look at these fake sources.
mlbex (California)
@Carl : It isn't just Facebook, it is images taken on cell phones and police body cams everywhere. They are an integral part of mainstream news. That will all come to an abrupt end if we can't tell the fakes from the real thing.