This is why you get your news from sources that practice within ethical journalistic standards.
On the other hand, I'm for anything that will make me look 24 again.
Seeing is believing? Not any more.
Folks, your heads would spin if you knew what the NSA and DARPA are up to.
As a tech person, I understand that this is just algorithms manipulating pixels. Does everybody else still have the imagination to see and believe that this is so? That it isn't real?
It is a bad sign that one of the first uses of this consumer technology is a rightwing attack on the Obamas. Since the Civil War, white supremacist groups have used conspiracy theories and the like to attack black Americans.
In theory, the software is just a tool which can be used for good or bad. In practice, that is not the case. Outside of the movies, primarily bad actors are using this. Why?
A search on YouTube for any political topic will find countless rightwing conspiracy theory videos. The Trump Republicans who comment on this site already believe much of such stuff.
Today, for instance, a search for Michelle Obama on YouTube finds among its top choices several Infowars videos telling us she is really a man.
The YouTube videos will get worse now that this technology is available. The Trump crowd will find it even more difficult to know what is real.
And YouTube, which cannot even stop such stuff now, will be completely outgunned. YouTube is publishing, promoting, and even endorsing such stuff but cannot stop it.
We're entering a time where people will be required to trust more, just when trust seems to be the very thing that is eroding, and everyone's buying guns.
It is so odd. I buy the notion that religious belief is on its way out, but now that video is no longer incontrovertible evidence, the question about belief is more secular: Do you believe the good that you see or the evil?
Trusting is something you can actively do to make the world a better place. The lack of trust can nip good in the very bud. But how do we strike a balance between letting evil run roughshod and assuming the best from people first?
Although I've been away from my religious faith for a long time, I remember this, and I hope you can see the profundity of it: "Love beareth all things, believeth all things, hopeth all things, endureth all things." To embody that love is better than a gun, but you need faith for it, maybe not religious faith, but faith nonetheless.
I am curious as to why you believe this.
gaaah wrote:
"We're entering a time where people will be required to trust more, ..."
I believe one can copyright photographs of themselves and sue fake-video companies for violations. Winning a lawsuit against a faker, or an a.i. faker-app designer, for copyright infringement and slander or defamation of character would put a stop to this immature new boy toy.
Orwell could write a great dystopia based on this technology. Soon all digital records will be suspect, all digital memories will be infinitely malleable. Hold on tight to your paper, folks.
To make a generalized comment--there are a lot of people in the nation who DON'T read the Times or any other major newspaper. They tend to believe what they see and hear, and trying to tell them the video they are now watching is FAKE is going to be very difficult, especially when it is something they are AGAINST. Arresting people with guns, seeing Black people shoot cops, so many possibilities, so little time. Pandora's box is open now, so any ROGUE nation will have access to this. If they faked people out on the election, what will fake videos do when they show up in the nightly news?
Sadly, we don't have a nation of Einsteins and the rich and powerful seem to be largely CROOKS. Have a nice future, and while you are waiting for it, read "1984".
I recently watched the 1977 movie "Capricorn One" again. It tells the fictional story of a manned mission to Mars that was faked. Before that, in the realm of radio, there was "The War of the Worlds" panic. Of course, it has been possible to create doctored photographs for almost as long as it has been possible to make photographs. Meanwhile, there is a similar reality-altering phenomenon that has tainted literary accounts going back to the earliest forms of writing.
Clearly, the potential for manipulating reality has been with us for a long time. It is somewhat comforting to realize that we have managed to come this far despite all the manipulation that has occurred, or could have occurred. Of course, it is hard to know to what extent history has been forever altered by the manipulations that have occurred, to say nothing of the untold individual lives that may have been touched, for good or for ill, by such manipulations.
One possible benefit of this technology is that it may actually tend to dull the blackmailer's tools. When embarrassing photos and videos can be so easily faked, don't they lose some of their power to embarrass? After all, they may well be fake.
More than ever, we need to be teaching people critical thinking skills. Thinking people, more than ever, need to be seeking out reliable sources of information.
A glaring question the author doesn't answer:
How does Reddit (or anyone for that matter) know if a clip is fake or not?
The FakeApp creator is yet another morally ambiguous techie creating tech that can be misused for social harm and then claiming no responsibility. When will these guys (and it’s usually guys) grow up?
On the flip side, to the extent the technology becomes effectively seamless—i.e. that there is no way to detect digital fingerprints distinguishing real from fake videos—people who are subjects of authentic compromising videos will have plausible deniability to claim that they are merely “deepfakes.” And they’ll get away with it.
Exponential advance being inherent to tech, in the near future the ability of a layperson to identify fake video will end. Fake-video detection deployed by Facebook and Twitter will simply challenge those who live for such challenges. The idea of educating the American public to become more discriminating is a pipe dream of a few surviving Enlightenment innocents. The solution is admittedly radical: a comprehensive effort to flood the Internet with fake video. In a year that might teach skepticism. Excess and disgust as an emetic for the poison.
AI needs to be regulated. If a hacker can take an AI tool released by Google and create FakeApp, the same tool can be used by China, Russia, North Korea and Iran for more insidious purposes. It can also be used by businesses in unethical ways to increase their bottom line. This is a lesson that we need to learn well before it is too late. This may be the new reality, but it needs to be used carefully, for good and not for evil.
It is already a problem that AI is not transparent and that we do not know why AI makes the decisions it does after we feed it lots of data. We need to develop AI so that it can communicate why it does what it does, and so that it can determine what foreign AI is doing. This is difficult to do, but if FakeApp can be used to change video without our being able to determine that it was done, we have given the machine independent processes that we cannot track. We should be scared of this, very scared, until we can make AI transparent.
This is the new reality. It's not going away, so we have to learn to deal with it quickly. This development completely changes the relationship between politics and media. Now, we not only can't trust what we read online or in the media, we can't even trust what we see and hear. People are going to have to think much harder about what they're seeing and hearing. How many people are up to that task?
yep. nope. None of these are even remotely okay. Maybe there are better fakes, but Mr. Roose, none of them look like you for more than a moment.
To quote you earlier, you'd have to be blind to mistake any of them for you.
Last night, my wife tried to take a picture of me for use in a gag Facebook app that can tweak your image. "No thanks," I said with a chuckle. "I prefer to not have my image on random third-party servers, thank you very much. Paranoia."
...
You'd better believe this article was just shared with her!
The Genie has been out of the bottle for a long time. This is just the latest twist to an old story.
The less you do online the better. Offline is the new luxury.
The software could be used in a "documentary film"--you know, where they are trying to tell you some TRUTH? Some people thought the HS shooting survivors were ACTORS--and there wasn't any video--yet. Once you take away trust, all is pretty much over.
This is really scary. We already have a political movement based on the wonders of lying, that supports a pathological liar for president. Soon Trump will be saying that the videos of him admitting to crimes are fake videos.
How will we be able to know anything is true, when anything can be shown on video?
In 1997 "The Commissar Vanishes" was published.
The book was about Stalin's use of doctored images to rewrite Soviet History. Examples included two images side by side, one the original, and one having someone, like Trotsky, completely erased from the photo. This was done at the highest government levels. It was difficult, but possible.
While the sick people who think turning Michelle Obama into pornography is fun are very disturbing, what really worries me is what governments, including our own, will do with this technology now. The desire to rewrite events did not die with Stalin.
Imagine changing faces on CCTV videos to frame people for crimes. Or surveillance footage used to put a candidate in a compromising position and scupper his or her campaign.
Imagine, for example, if this tech had existed in 1991. Would someone have found a way to alter the video of the Rodney King beating and create a parallel and false narrative?
The implications are truly awesome--and I mean that in the original sense of that word: creating a sense of awe, which sometimes can come from real fear.
"... what really worries me is what governments, including our own, will do with this technology now. ..."
YOUR government may have paid for this technology. The NSA likely has black-box projects such as this, to do what the Russians are doing. It isn't ONLY Russia that has spies and nefarious people. Google was hardly off the ground before Google Earth was available--just put in an address and voila--someone's home, pets and children in plain--but dated--street-view access. Great law-enforcement tool. Great dictatorship tool. Great police-state tool. Do you really believe that Google paid for that? Maybe Russia, since one of the founders--Sergey Brin--is/was a Russian national.
'Blame the person, not the technology.' This argument seems very similar to 'Guns don't kill people, people kill people.' One may apply it to any technology that has been used to hurt or kill.
I expect that more subtle fakes will be more damaging. Once you can engineer a person's voice to say anything you only need to change the movements of their mouth to wreak havoc. The sky's the limit for unethical and nasty people with money.
At some point we will not be sure what is real and what is not.
And that will only increase the political divide in the country.
Completely overblown threat. They seem rather obvious fakes to me, and any video expert worth his salt should be able to spot a counterfeit, not unlike how experts detect counterfeit jewelry, antiques, and collectibles. Do you know what is a more dangerous threat? This term "nonconsensual pornography." Tell me: is it nonconsensual pornography if you picture a person naked in your head? What if you draw someone naked? If it's unethical, then shouldn't it be a crime? Congratulations, you are now only one step removed from creating thought crime.
First of all, most people aren't video experts. The "Mueller Time" video I mentioned earlier is only obvious because you already have enough information (Trump is not in jail) to know it's not true. Jewelers and other experts have a great deal of prior knowledge on which they base their decisions.
When the video is about people and events that the viewer has no prior knowledge of, I can easily imagine them being fooled.
"... how experts detect counterfeit jewelry, antiques, and collectibles. ..."
HOGWASH. Museums are chock-full of fakes, and it takes more than a "good eye." I guess anything goes in TX.
Now Hollywood can redo the whitewashed movies of the past by putting more non-Caucasian people's faces on white actors. And now we can make Academy Award-winning actors into porn performers too. Let's face it, the public has wanted this for a long time. We've had Photoshop, body doubles, stunt artists, green-screen visual effects, and cool makeup technology that makes celebrities look completely different in movies. Why not become short-movie makers ourselves and recast stuff from the past? I think this might fuel more creativity by allowing average people to become artists. And more content creators might try to create content that is actually memorable. Half of the Oscar-nominated movies aren't seen by general audiences anymore. Only those with big ad budgets get seen. Maybe it's not a bad idea to let people create content without asking for too much money from consumers who already have so much media content to choose from. If you allow people to buy art, you allow them to do anything with it. So allowing this technology lets anyone recast short clips for fun. If we can have guns, we can have this technology.
I think you failed to notice the COMPUTING POWER required for just some basic misuse. However, governments with Cray computing capability will waste no time making use of this--probably going on as we speak. It has been downloaded 100,000 times already.
Personal technology is all about making people stupider and lazier.
Social media is all about making civilized people into tribal animals.
AI combines both of these wonderful technologies.
It's a win-win situation...for the creators and users of these systems. It's a lose-lose situation for everyone else.
This is very scary. We have a (non-)Reality TV star as president, who won’t allow our own national security agencies to stop Russian attempts to interfere in our upcoming 2018 elections.
We already know that they successfully used Fake News to interfere in our last election. With the addition of this new, powerful technology and the demonstrated gullibility of many of our citizens, the Russians may succeed in changing the outcome of this next election.
An election that could allow the American people to have a true voice in changing the current corruption of our democratic norms and institutions—by Trump and the unscrupulous backing of the Republican Party. Or an election that could solidify Trump’s power for the indefinite future.
Thanks a lot for sending tens of thousands of people out looking for the fake Obama video and inspiring the creation of even more of them. Seriously, did you really need to use that particular example? And start your piece with a salacious description of it? Couldn't you have just said "video of a famous person" and left the name out completely and the details to the imagination? New York Times, what journalistic purpose does it serve to take the image of the first lady and defile it that way?
Perhaps this need not be seen as entirely bad. The world got along for many years without the existence of video evidence, so if we once again come to a place in which a video is no more real than, say, an oil painting, I am not sure that damage need be all that great. What will be strange is the transitional period in which young people don't treat video as being more real than animation but older people still do.
This is an extremely important topic. Not only can you now impersonate someone's face but using Adobe's VoCo software you can get them to say anything you want if you have about 1.5 hours of original voice audio. This is just what we as consumers have access to. Think about the capabilities and computing power that governments must have. We desperately need a new, secure, and digital means to confirm identity and validate content before it's too late. This is the kind of stuff that could easily start major civil unrest if not outright wars.
"This is the kind of stuff that could easily start major civil unrest if not outright wars."
I agree. The Arab Spring was coordinated via social media. At the time, it seemed a marvel. With the Internet, the entire world would have access to the truth, rather than only what the media (or government) told us. It has fulfilled that promise to a significant degree. Some smart people also predicted that the overwhelming amount of available information would lead to people filtering information according to their confirmation bias. True, again.
As this tech rapidly advances, it will be increasingly difficult to discern real from fake. We need tech-savvy people speeding ahead with countermeasures/solutions.
There has to be a will, though. After all the election-meddling by Russia, Tillerson says it's doubtful we can prevent more of it. Where are the young visionaries America is known for? They're probably stifled by the old men running our government.
Every bit of information will have to have blockchain-like security associated with it. Everything must be viewed as capable of having a "man in the middle" attack performed on it.
The problem here is that we have quantum-leaped our ability to override our senses with physical stimuli; simultaneously, we are backsliding on our critical thinking skills.
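The integrity idea in this comment can be sketched with nothing fancier than a keyed hash from Python's standard library. This is only an illustration of tamper detection, not a real distribution scheme; the key handling is a placeholder (a real system would use public-key signatures, so viewers could verify without holding a secret):

```python
import hashlib
import hmac

def sign_content(content: bytes, key: bytes) -> str:
    """Produce a tamper-evident tag for a blob of content."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, key: bytes, tag: str) -> bool:
    """Return False for content altered in transit,
    e.g. by a man-in-the-middle."""
    return hmac.compare_digest(sign_content(content, key), tag)

key = b"publisher-secret"  # stand-in for real key management
original = b"frame bytes of an authentic video"
tag = sign_content(original, key)

print(verify_content(original, key, tag))            # True: untouched
print(verify_content(b"doctored frames", key, tag))  # False: altered
```

A public-key analogue (e.g. Ed25519 signatures) would let anyone check a video's provenance while only the publisher can produce valid tags.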
Scary way to either frame people for crimes, or create phony alibis, but doctored photos have also been around for a long, long time. As for fake news, we've had that for millennia: it's called religion.
trump has found his plausible deniability for upcoming unreleased russian vids.
Dr Strangelove 2 may be coming
How I learned to stop worrying and love technology
Someone needs to come up with machine learning that FB, YouTube and Twitter can deploy that automatically detects these and other fakes and imprints them with the word FAKE in large, red letters.
Correct, which turns fake video into just another form of spam. Taking the treatment of spam as a model (because it's already familiar to the providers), they could just stuff it away in a "fake video" quarantine, so you could watch it if you really want, knowing it was flagged as fake.
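Treating fake video like spam reduces, at the routing layer, to a simple threshold rule. A minimal sketch, assuming a hypothetical upstream detector has already produced a fake-probability score (the detector is the hard part; this shows only the quarantine plumbing, and all names are illustrative):

```python
def route_video(video_id: str, fake_score: float, threshold: float = 0.8) -> str:
    """Route a video the way mail providers route spam: confidently
    flagged items go to a labeled quarantine the user can still open,
    everything else goes to the normal feed."""
    if fake_score >= threshold:
        return f"quarantine/flagged-fake/{video_id}"
    return f"feed/{video_id}"

print(route_video("clip-001", 0.95))  # quarantine/flagged-fake/clip-001
print(route_video("clip-002", 0.10))  # feed/clip-002
```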
What could go wrong? Read the last 50 years or so of sci-fi novels about technology and that would be a good start. We are already living "1984" and worse, and we have a government that GOES ALONG WITH THIS. Politicians will love this stuff because they too can use it--elections coming soon. The net is filled with movie celebrities in porn poses already, so why wonder what will happen. What will happen is what will happen with any technology that can be manipulated by ordinary people.
Google developed this program, which likely means some government agency is behind it--NSA or some blacker group. Keep in mind Google Earth, where you can put in any address and voila--a person's house, dog and kids right there. Now it will be guns don't kill people--APPS DO.
I look forward to the FakeApp creator developing an app for detecting FakeApp fakes. And then, of course, FakeApp2, to foil FakeApp1 detection. The endpoint will be cloud services scanning all the videos we watch to filter out fakes (with varying success), like they currently do for spam. The cost will be some entity constantly reviewing what we watch. But we already seem to accept that.
This is exactly how these videos are made. The deep learning neural networks used are called GANs, short for generative adversarial networks. Two networks or "intelligences" are pitted against each other: one tries to produce a believable video undetectable as a fake, while the other is trained to call its bluff.
While I don't know if they relied on GANs in their approach, I believe the University of Washington has produced lip-sync video of Barack Obama that is nearly undetectable as a simulation.
http://www.washington.edu/news/2017/07/11/lip-syncing-obama-new-tools-tu...
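The adversarial tug-of-war described above can be shown at toy scale. The sketch below is not a deepfake pipeline: it is a one-parameter "generator" learning to match a 1-D Gaussian against a tiny logistic "discriminator," just to make the alternating two-player updates concrete. All numbers and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: samples from N(4, 1). The generator tries to mimic it.
def real_samples(n):
    return rng.normal(4.0, 1.0, n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

mu = 0.0        # generator G(z) = mu + z: a single parameter
w, b = 0.0, 0.0 # discriminator D(x) = sigmoid(w*x + b)
lr = 0.05

for step in range(2000):
    x_real = real_samples(32)
    z = rng.normal(0.0, 1.0, 32)
    x_fake = mu + z

    # --- Discriminator step: logistic regression, real=1 vs. fake=0 ---
    d_real = sigmoid(w * x_real + b)
    d_fake = sigmoid(w * x_fake + b)
    # Gradient ascent on mean log D(real) + mean log(1 - D(fake))
    w += lr * (np.mean((1 - d_real) * x_real) - np.mean(d_fake * x_fake))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # --- Generator step: move mu so fakes score higher under D ---
    d_fake = sigmoid(w * (mu + z) + b)
    # d/dmu of mean log D(G(z)) = mean((1 - D) * w)
    mu += lr * np.mean((1 - d_fake) * w)

print(f"generator mean after training: {mu:.2f}  (real mean is 4.0)")
```

Real face-swap models replace the scalar `mu` with millions of network weights and the logistic unit with a deep classifier, but the loop has the same shape: the forger and the detector improve each other.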
My skim of the paper:
http://grail.cs.washington.edu/projects/AudioToObama/siggraph17_obama.pdf
Did not find any reference to adversarial networks, but the general technique is intriguing:
(in Java)
https://deeplearning4j.org/generative-adversarial-network
(in TensorFlow)
https://medium.freecodecamp.org/an-intuitive-introduction-to-generative-...
This is how the president will convince his base that the Moscow hotel videos of him cavorting with, well, you know, are fake.
My husband was so upset by this technology, but the videos I looked at were so obviously fakes, because if you have seen the person before, you recognize their unique way of expressing themselves. The Ryan Gosling video did not capture the way he really expresses himself in an interview. What am I missing?
Yet again. Technology may be used for good, but will almost always be used for ill as well. This is why it pays to be highly skeptical about anything said by our tech evangelists.
The only real defense against fake news/videos is education and a healthy dose of skepticism. I'm not optimistic.
While fake content creation is nothing new and the onus would only continue to be on the consumer to determine merit, I struggle to envision a scenario where this is a net benefit to humanity. To me the question is, would we want to collectively restrict or proscribe certain advancements presuming they would be a net negative. It is telling that the creators and developers don’t want their names used as they already understand that the risk far outweighs any benefit (of which the most convincing benefit seems that we’ll soon have funnier Nick Cage videos).
Eventually, this and other fake-creating programs might force people to finally think about what they see or read. In almost every case, if the viewer just thinks, "Is this reasonably true, or reasonably fake?" it's easy to make the correct determination. And then libel laws need to be changed to make it easier to prosecute, or sue, the people who make the fakes, as well as those who spread them.
Is it easy? With AI and the unlimited computing power the cloud offers, it might be nearly impossible to tell what is true and what is false.
This is just new technology--give it a few years and throw some real processing power at it. Then we might never again know whether what we see in the digital world is real.
"... it's easy to make the correct determination. ..."
That is how we got trump, right? People determined HRC was the fake and trump was the real thing. SURE.
This is frightening in its implications - the American people are gullible enough as it is, without seeing videos that totally manipulate the truth.
We are abandoning a common reality for life in fictional bubbles. This technology, combined with the increasing spread of just plain old facial recognition software, does not bode well for our freedoms.
How about some society-wide discussion addressing whether we really want to go down this path?
I will be surprised if the all too many ‘scientists’ who submit falsified data to journals in the hopes of getting citations and grants and promotions will not soon add these technologies to their repertoire.
I expect that soon even the foolish will no longer trust what they read or see.
On the positive side, I see locomotion and gait added to this and then our children being able to see how they would move and look if they had Kobe Bryant’s or Bruce Lee’s movements.
A new generation may grow up mimicking extraordinarily well the movements and gaits of the gifted. And of course, so will the evil.
I assert the correct term is "disinformation," not misinformation.
These are, or would be, purposefully created fakes, not unintentional errors. An attorney may caution a reporter to select the term misinformation because intent to deceive is not known.
The intent may not be to harm the reputation of one or more personages, but merely to amuse. That is the "it's just a joke" excuse.
Other AI developers attempting to discern disinformation in news media are continually challenged by subtlety of meaning, such as sarcasm.
We have fully stepped through the looking glass into a world where it is faster and easier to create a lie than identify the truth. In my estimation this is far worse than gray goo because it is already here.
It's been true for a long time that "A lie can travel halfway round the world while the truth is putting on its shoes."
The Information Apocalypse is coming. Sorry if that sounds dramatic, but I think we are barely scratching the surface of the damage that falsified news can do.
It’s frightening to contemplate what hackers and trolls could do with this technology. Or online dating sites! Maybe developers could figure out a way to permanently tag videos created with FakeApp - like a watermark. Also, strong, enforceable libel and privacy laws would be helpful, as would a deep, national conversation about ownership of one’s online identity.
It's hard to read articles like this and not think that American civil society is doomed. Trust in institutions is so low now, I don't see how we will ever be able to correct the lies. It really is becoming 1984.
The more technology advances the shallower it gets. Hopefully it will all evaporate soon and we can go back to drinking from a deep well instead of having to lick condensation off of rocks.
Starting with pigment on the walls of a cave, every new graphic technology has been adopted for pornography soon after its invention, driving improvements that ultimately benefit the non-porn users.
This is a gold mine for criminal defense lawyers. That's not my client -- it's a deep fake!
How is anyone today shocked, SHOCKED, and disappointed that any new tech will find porn among its earliest adopters? This has been an immutable law since the days of daguerreotypes.
Not just porn, though that is OBVIOUS. We recently had the "It's Mueller Time" video, and while it was pretty good, this software suggests some really disturbing possibilities, both for and against law enforcement. What is to prevent GOVERNMENT from manufacturing this? I can certainly see trump using it against his enemies.
One thing we could do is to amend libel law. Define the production of any fake video intended to be misidentified as person X as libel against person X. Likewise, labeling video to claim that it depicts person X, when it does not, would be libel against person X.
Prosecuting someone under that law would require finding who made the video and/or who distributed it, and demonstrating their intent. Nonetheless, that would give a powerful tool to all of us to defend ourselves against a new kind of slander.
I think the recent GOP budget is requesting LESS for law enforcement than before. The courts will be packed from here to Mars.
Fantastic piece. I am amazed and worried about what this technology can create and do. In fact, just this weekend I stumbled upon a Snapchat feature similar to this that scared me. The very final filter option currently scans your camera roll for faces of friends to merge/layer over top of your own. I don't recall giving Snapchat the option to cull and view my photos in that way, and some of the layers or filters were really well done. On par with the Chris Pratt or Jimmy Kimmel fusions posted here. This image technology is quietly being adopted and rolled out.
Outstanding and well-done piece, and one that I'll be integrating into some of the college courses I teach on improving our critical thinking. Critical thinking is hardly a new idea, but with some of these and similar developments in digital information, it could not be more vital. But much of this comes down to a fairly simple but key difference between people: there are those who start with evidence (critically analyzed/evaluated) and form/revise beliefs based on that, and those who begin with beliefs (from tradition, authority, 'gut feel,' their tribes, etc.) and, if evidence matters at all, it is sifted/selected/cherry-picked to confirm the pre-existing belief. For these latter folks, it doesn't matter what the nature of the "evidence" is; all that matters is its relation to their chosen belief. Unfortunately we have a significant portion of Americans for whom beliefs matter far more than facts/evidence, and who are in fact skeptical/dismissive of those with expertise based on their mastery of facts/evidence ("What do experts know... I have a right to my opinion!!!"). Godspeed indeed.
I appreciate the distinction you are making. However, one thing that also worried me during my years teaching at a community college was a tendency of teachers (in a well-meaning attempt to promote "critical thinking") to downplay "mere" memorization or "mere" knowledge. But more knowledge might help students resist outright nonsense. Also, it's often crucial for understanding what one is reading. For instance, I had students who mistook an Orwell essay set in Burma and mentioning Indians and Hindus for a story about Native Americans.
Great point. Perhaps the most important lesson that can be taught about critical thinking is the importance of verifying the FACTS involved in an issue before doing ANY thinking about it. The Orwell essay is a great example of how incorrect assumptions can be made even when NO deception, misleading or "spin" was intended by the author--nowadays, all too often that's not the case.
Excellent point, Barbara.
I always use the example of knowing when the French Revolution began (1789) and when we achieved independence from Britain (1776) as an example of why it's important to commit, for example, dates to memory. In this example, doing so allows us to see how certain social "moods" traveled.
No less important than memorizing multiplication tables, grammatical rules, or finger patterns on a musical instrument.
We must first know the facts about something before we can examine it critically.
Again, good call.