What Happens When Employers Can Read Your Facial Expressions?

Oct 17, 2019 · 160 comments
Steva (New York)
Here is the kind of harassment that can happen with this technology. A woman I know was stalked by a man at her place of employment. She worked at a company that used an outside contractor for security, and the stalker held a position connected to the security contractor. He had access to video, employee phones, email, and the building's surveillance system. She had been recognized for helping to win a departmental award. But the stalker used the surveillance system to follow her everywhere - at lunch, on break, in the parking lot, in the bathroom. He had employee friends harass her by standing near her desk and holding extremely loud "meetings" when she tried to work. Sometimes this included insulting her directly. Whenever she used the bathroom, the same guy would be standing there waiting for her. He would then follow her. More than once this man or his friend threatened to have her fired. After this, he sent a sexually harassing video, which may have been taken in the bathroom, to her colleagues' company phones. This is what happens when you give surveillance technology to immature, angry people. I would think very hard about facial recognition technology. People who are not psychologically mature can access this technology.
Torben Kold (London)
Great article. Big Brother Watch has done good work in this area. Surveillance promotes fear; it does not create a sense of community and responsibility. You can get more information here: https://axisbits.com/blog/Facial-Recognition-App-Software-in-2019 https://medium.com/@lancengym/how-to-beat-facial-recognition-ab118a0c37fd https://www.quora.com/What-are-the-top-AI-trends-of-2019
Samuel Owen (Athens, GA)
WOW! I had no real appreciation of this subject or the arguments presented. In many ways, treating statistical modeling, i.e., math theory and computation, as a reliable predictor of human behavioral potentialities is an AI absolutism of false precision, quite apart from the clear civil rights and privacy violations mentioned. Glad there are private and public individuals fighting this commercial product and big-business moneymaker. Maybe someone can invent an ATM that dispenses free money or debit cards to those in need, no qualifiers required. That should 'statistically' reduce potential misbehavior and criminality while also raising consumerism substantially. And make paid employment obsolete to boot, by turning it into a hobby instead. Technology is displacing normal human functioning at its every interface worldwide. What's the economic endgame for arriving at social equity: a return to bartering?
Eben (Spinoza)
Eric Schmidt, former CEO of Google, famously said, "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." But the surveillance environment we live in will be supercharged by widespread integration of facial recognition and aggregation of data. Here's just one example that's right around the corner; it could be implemented today. Let's call it Tinderillow: an app that lets you walk into a bar, point your phone at a "dating" prospect, and see an estimate of their net worth (based on what Zillow says about their residences: rented or owned, and how much). What a great way to filter out the losers! Without regulation of facial recognition and data aggregation, things like Tinderillow (and perhaps TinderPolitico) will make life miserable as soon as you walk out of your residence. Such things don't need your consent.
miriam (Astoria, Queens)
From the Washington Post, on the new AI approach to hiring: "A face-scanning algorithm increasingly decides whether you deserve the job" https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/?wpisrc=nl_most&wpmm=1 Not just face-scanning, but voice-scanning and mannerism-scanning.
miriam (Astoria, Queens)
"You bet I have something to hide, Mr. Employer - my personal life. From you." -- William Safire
Peter (South Carolina)
There is only one term for this, and it was coined 70 years ago in George Orwell's "1984": facecrime. Shudder, people.
miriam (Astoria, Queens)
@Peter Facecrime - as I mentioned in one of my other posts. But the Nazis also had a term for it: physiognomic insubordination, and it too was a crime.
Rich Murphy (Palm City)
And the use of fingerprints is an invasion of my privacy and should also be banned.
richard wiesner (oregon)
As my now departed mother told me in her final years, "There are days I just don't want people to see my face."
Michael N. Alexander (Lexington, Mass.)
Because the toothpaste is already out of the tube (to coin a phrase), a quick interim solution should be pursued. If defense and job rights lawyers get smart about facial recognition technologies and begin to challenge actions, it will become less attractive to employ them. This isn’t so complicated. People don’t need to have all that much detailed knowledge about the systems: Demand to know the accuracy of the *specific* manufacturer’s facial recognition technology used by the authorities. What company supplied the gear? The software? What is the accuracy – both the detection probability *and* the p of false positives (probability that the wrong person is being identified)? For men, women, members of different racial groups? Under what experimental conditions (controlled or not?) were the data acquired? What professional organization, if any, has certified the system? Chances are that the authorities can’t provide adequate answers to these questions because they simply bought-and-deployed, blindly believing the manufacturer’s, or supplier’s, claims (which, in my somewhat distant experience, were often hyped). Repeatedly faced with such probing questions, chances are that they’ll start to back off. If not, in the best case, only genuine malefactors will be fingered, which would be a genuine victory of sorts.
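The questions above about detection probability and false-positive rates matter because of base rates: even a seemingly accurate system, scanning a large population that contains very few genuine targets, will flag mostly innocent people. A minimal arithmetic sketch (all numbers are hypothetical, chosen only to illustrate the point, not drawn from any real system):

```python
def match_breakdown(population, targets, tpr, fpr):
    """Return (true_matches, false_matches, precision) for one scan.

    tpr: true-positive rate (detection probability)
    fpr: false-positive rate (chance an innocent face is wrongly matched)
    """
    true_matches = targets * tpr                  # wanted people correctly flagged
    false_matches = (population - targets) * fpr  # innocent people wrongly flagged
    precision = true_matches / (true_matches + false_matches)
    return true_matches, false_matches, precision

# Suppose a system scans 1,000,000 faces looking for 100 wanted people,
# with a 99% detection rate and a 1% false-positive rate.
tp, fp, prec = match_breakdown(1_000_000, 100, tpr=0.99, fpr=0.01)
print(f"true matches:  {tp:.0f}")
print(f"false matches: {fp:.0f}")
print(f"precision:     {prec:.1%}")  # only ~1% of flagged people are real targets
```

With these illustrative numbers, the system catches 99 of the 100 wanted people but also flags nearly 10,000 innocents, which is exactly why the demand to know the specific false-positive probability, broken down by demographic group and test conditions, has teeth.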
ondelette (San Jose)
A philosopher and a lawyer deciding how a technology shouldn't exist. For the record, lawyers gave us our current solution to mental illness -- homelessness and law enforcement. Perhaps we should trust them with proving in court that things need to change, but not with being allowed to propose solutions. Face recognition is a core research technology for understanding both the brain and how to keep AI from being malicious. These authors themselves say that faces are special; how would you generate human-friendly AI without computational knowledge of them? It's a popular hot topic for law people at universities to get a publication, and for watchdog groups to distract from the lousy job they've done watchdogging privacy otherwise. Every single argument made here can be made about GPS. But there is no call to ban GPS; it provides the primary excuse for using your cellphone while driving, and we wouldn't want to make that illegal, now would we? Calm down and do some real research. Face recognition is attractive to researchers because of how hard it is, not because it's their key to being Big Brother. Ban what you can truly point to as its dangers -- its use for nefarious purposes by law enforcement and corporations. You never banned corporations from demanding people's Facebook account access and circumventing references to intrude on the privacy of prospective employees before. You only want to ban it now because of the evil tech demon in vogue.
JBC (Indianapolis)
Human beings regularly incorrectly infer the meaning of others' facial expressions. Machines will be the same. Nonverbal expression is too nuanced, often culturally dependent, and generally context specific.
miriam (Astoria, Queens)
@JBC And because of that there will be false positives and false negatives, as digital technology is mistakenly seen as the arbiter of truth. I fear facial-recognition technology, whether or not it is accurate.
Michael (Ann Arbor)
As the Seinfeld character George Costanza said "it's not a lie.... if you believe it."
Charles Packer (Washington, D.C.)
Facial recognition technology is a long way from being reliable enough to "police social norms." We will have plenty of time to prepare for that day if we study, right now, how the government has long been using some very familiar technology to do just that. Any driver who has received a photo speed ticket for 11-to-15 over the limit knows the feeling of having had her freedom of judgment taken away, especially if she has decades of driving experience. And if she was driving a rental car, she's had her due process taken away as well, since the rental agency will pay the fine and put the charge on her credit card without allowing her to opt for adjudication. It's bugs like this in our system that should make us hesitant to let the government use any new surveillance technology.
Larry L (Dallas, TX)
Remember that scene in Blade Runner when the detective is interviewing the Nexus-Sixes? This is far more intrusive than Orwell's vision or Asimov's psychohistory.
miriam (Astoria, Queens)
Picture this scenario: A high-ranking boss calls a meeting and delivers some important news. Then facial-recognition software captures the reactions of everyone at the meeting, to see who heard the announcement as good news or bad news. Those who silently disagree with the boss risk termination. The Nazis called it physiognomic insubordination; Orwell called it facecrime.
ondelette (San Jose)
@miriam, um, no. You can do the entire procedure you darkly describe without any recognition whatsoever, just by filming everyone in the room. It has nothing to do with computer technology; Richard Nixon did something similar with his White House taping devices and people's voices. What you fear is being watched when you have an expectation of not being watched. That isn't face recognition, that's surveillance, and if you're worried a boss would do something like that, then you need strong labor laws, not a ban on a technology that you apparently don't understand. You are currently watched when you shop, watched when you use the internet, watched when you use GPS for directions, watched when you buy anything, watched when you work. Either all of that should bother you a whole lot more, or you should stop pretending that you will lose something you've already lost if a computer can recognize your face. In 1998, a colleague and I wrote about the perils of media and computer companies collecting personal information. We tried to get lawyers to attend the next meeting to give advice. None did. We have watched the scourge of big tech ruining privacy for 20 years. What is most surprising is that suddenly a single technology is to blame for it all. These lawyers are failing to tell you that your loss of workplace privacy started because they -- the lawyers -- decided in court, not in a lab, that your emails and keystrokes, even personal ones, belonged to someone other than you.
Passion for Peaches (Left Coast)
I wonder how effective this is for those who alter their faces with injections and plastic surgery. Does face recognition software still have enough reference points? Many women change their appearance on a regular basis (hair color and style, makeup, even style of walking if they switch between high heels and flats). I look so unlike my driver's license photo that I have had trouble getting approved at the airport. Can an algorithm make the connection? And what about those who change gender, and do the hormone therapy (and maybe surgery) that drastically alters one's face? As for reading emotions, what of those on the autism spectrum? Or people, like me, who have "resting (beach) face"? I look annoyed much of the time. So this sounds like a potential minefield for discrimination. But is the public surveillance aspect an infringement on privacy? I don't think so. I assume I have no privacy in a public place. If someone wants to sell my data, I don't care. I am tracked by other means anyway -- my car's satellite connection. Maybe I am sanguine because I don't get up to much trouble. But I wish others behaved well, too. If facial recognition helps keep public order, I am for it.
Tony (New York City)
This facial recognition software allows the police and anyone else to shoot minorities just for walking down the street. The police and this white society are afraid of anyone not white, so this just gives the powers that be license to kill, arrest and get away with it. Nothing changes; these software companies have used many devious means to get pictures of minority homeless people, not even paying them for their pictures, laughing at them the entire time. White racism in plain sight. So how in the world can you trust anything these tech companies do? It is all for their profit, nothing else. Ms. Warren is right: these companies have too much power and need to be broken up. People should not be forced to be slaves to these rich, white companies, especially Facebook, which is no friend to minorities. My grandmother doesn't want her picture in some company's database; how disturbing is this? The rest of the candidates had better take a stand against this technology that benefits the companies, not the people. If they don't, then they don't deserve to be in the White House.
Blackmamba (Il)
Thanks to the great brave honorable American patriots Edward Snowden and Chelsea Manning we now know that Big Brother is a bipartisan liberal progressive Democrat and conservative statist Republican chimera.
miriam (Astoria, Queens)
@Blackmamba Why a chimera?
joyce (santa fe)
In a few years the population will increase dramatically, and all problems related to population will get worse. If you think we have problems now, you may look back on these years as the better years. A huge anonymous population where most people are strangers does not have the cohesion necessary for deterring crime. Individual rights and freedoms will be in conflict with criminal behavior. Where do you draw the line? How do you organize society so it is relatively safe to live your life and move around with some freedom? The US is a violent country, and this fact is not going away any time soon. It is a sad fact that the larger the population, the harder it is to maintain individual freedoms. Adjustments will have to be made, one way or the other. Another sad fact. We also need soul-searching, research and education to address the roots of violence in this country. A just government relatively free from corruption and conspiracy theories, one that works for the whole population, not just a few, will be even more necessary for the well-being of all instead of the few at the top. We need the best minds for this: honest, fair and capable people who understand and respect democracy. Vote with care. All our future depends upon the ability of the general public to think critically, analyze and sift through the masses of media and political disinformation. Democracy requires intelligent thought. Another sad fact.
Deedub (San Francisco, CA)
Americans should be aware that most of the protections invoked in this article protect the privacy of US citizens - maybe not permanent residents, probably not applicants for immigration status. Since it was established, the Department of Homeland Security has worked to develop "Total Information Awareness" (though the phrase has been suppressed) with regard to non-citizens. Their position is an extension of the 'in plain view' rule: if a government employee is allowed to be in a place where something is in plain view, they can use the information they learn. Since they have access to all private data of immigration applicants, they also acquire data about their families. In the end, very little is hidden. With a huge data set to play with, plus access to most other government data and to private data by contract (hello Palantir, Cambridge Analytica, et al.), DHS has used applicants for immigration (and probably their US citizen relatives) to develop and improve data mining technology. The horse is well out of the barn on this.
Bill Brown (California)
I just want the crime problem in my city addressed. Does anyone in their right mind think the status quo of vague suspect descriptions by eyewitnesses is better? ("White male, late 40s, brown hair, wearing jeans -- be on the lookout" or "black teen, dark complexion, wearing a hoodie -- be on the lookout.") The irony is that academics, criminal justice skeptics and, yes, leftist fanatics have written, debated, and screamed ad nauseam about flaws in eyewitness testimony. Yet now an A.I. solution is deemed "problematic." And let's be real: the media also talk ad nauseam about "innocents" caught up in "the system," but hardly ever about "the system" letting go of a bona fide criminal, which happens far more often. Fear-mongering articles on facial recognition without exception fail to mention that prosecutors have been getting confessions and convictions for a very long time based partly on identifications by humans, for instance in lineups or mug books, or by police officers. Worst case, facial recognition systems have succeeded in partly automating what human witnesses do. These mechanized comparison systems exhibit occasional errors. But guess what? Human testimony is well known to be unreliable, and it is not clear that it's any better than the dreaded machines. The point is that facial recognition is at least as likely to be accurate as the traditional methods of human identification that have been used for centuries to finger criminal suspects.
Austin Liberal (TX)
This is not an issue that needs addressing. It merely automates what manpower could do with enough time: a lot of time, a lot of manpower, comparing a photograph of a suspected perp to the hundreds of thousands of photos satisfying basic information readily gleaned by simple examination of the photo. It makes the process practical. And it allows authorities to bring to justice perpetrators who would otherwise never be located. Stop worrying about such matters. Among other reasons: it cannot be effectively banned. And shouldn't be.
Let's Be Honest (Fort Worth)
As the "increasing power of one" or a few humans to cause more and more harm through technology increases, perhaps a society where there is substantial surveillance of us all, available to see by all, could be a good thing, perhaps a necessary thing. It would mean that -- like in the primitive days when humans largely lived in small tribes -- everyone could know a large amount about everyone else. It could discourage negative behavior, and promote positive behavior. If we could see what human faults were common, we might become more forgiving of them.
Bill H (Champaign Il)
Jim Jordan and Alexandria Ocasio-Cortez the two nuttiest extremists. Two peas in a pod. I'll bet there is lots they can agree on.
Jack Klompus (Del Boca Vista, FL)
Another example of the human race being way too clever for its own good. We'll be the first species ever to go extinct because it was too intelligent.
MPO (San Francisco)
Get over it. We'd all be better off with less anonymity and more honesty. We live in a society and are responsible to each other. What are you trying to hide?
stan continople (brooklyn)
@MPO There's a word for someone who has nothing to hide: "dead".
MPO (San Francisco)
@Linda I don't have a problem with that. A couple of weeks ago I was in a hit-and-run accident. I'm guessing it was not the first time the perpetrator had done something terrible, and with a "social credit" score we could deal with such people accordingly. The political aspect is troubling, to be sure, but on the whole I don't care what ideology is in charge: if more info/surveillance motivates us to be better to one another, I'm all for it.
Andrew Dabrowski (Bloomington, IN)
Sorry, but you can't ban technology; someone will continue using it: the NSA, China, the Mossad, name your favorite conspiracy actor. And corporations will hire teams of lawyers to enable them to skirt the law. The only solution is to make the technology available to everyone: we should all be able to image-search for "Jeffrey Epstein and Trump or Clinton".
Steve (Manhattan)
As a tech person who has been in the field for decades, I see nothing wrong whatsoever with facial recognition software. This article appears to have been written by two paranoid persons. Facial recognition software will not endanger one's privacy or freedom, and it's not "biased". If you're a law-abiding citizen who pays their taxes, you should have no concerns here. Non-Paranoid, Jaded NYC Person
Hal Paris (Boulder, colorado)
Wanna blow your mind? Watch MSNBC's Richard Engel's recent China report. You can find it on YouTube. If it doesn't scare the bejesus out of you... well, just watch the report. With this kind of technology, Big Brother is always watching. 1984 come to life in a big way, while we blithely go about our business hoping for the best.
stan continople (brooklyn)
@Hal Paris The Chinese don't seem to mind being a land of 1.4 billion grinning robots, as long as they can buy cheap consumer goods. For every dissident, there are a million true believers. China has been a strictly hierarchical society for five thousand years, one where its most contentious members have been bred out of the gene pool. Emperor Xi knows his people well.
Still here (outside Philly)
If your boss will listen, this is unneeded.
Dejah (Williamsburg, VA)
Cautionary tales: In grade school, I had a "look-alike," a girl who looked SO MUCH like me that our teachers could not tell us apart. Our class of 80 dubbed us "twins" and "separated at birth." We were often called by each other's names, and were even yelled at by one elderly nun when we didn't answer. Finally, we simply took to answering to each other's names. It was *just easier* than all the fuss and mess. My own 12-year-old daughter looked SO MUCH like Jennifer Lawrence that I once mistook a picture of the teenage actress for my own child! Fortunately, my daughter grew up prettier (I am her mother, after all). But if you put a photo of my daughter, me, my father's sister, and my father's mother side by side, we could be quadruplets. I have a mild neuromuscular disorder which makes my facial muscles (and everything else) a little weak; it also causes chronic musculoskeletal pain and fatigue. So although I am a fairly pleasant human being, people often think I am "angry" or have unpleasant inner thoughts. I don't. If HUMANS think I'm a "bad person," in a bad mood, or have Resting B*tch Face because of my facial expression, what would a COMPUTER think? I'm not a bad person. I'm DISABLED!
John (Hastings on Hudson, NY)
Given how many problems our society has with race and gender issues, it seems crazy to pursue a technology that has accuracy issues or biases in these areas. I mean, how much agony do we want for our nation? It will only take one person falsely accused of something to rightly call all of this into question with front-page headlines. Further, if China is using it, watch out, bigly! If I had a nose job or were disfigured in a fire, would that throw off the machine?
ondelette (San Jose)
@John, the biases exist because digital cameras were developed in Japanese labs using the lab workers as material for the color algorithms, and because the face algorithms are being combined with other data in police and other uses. As for your questions about disfigurement or a nose job, the answer is yes: just as it would throw off a person who didn't know you well enough, it will throw off the machine.
john boeger (st. louis)
i grew up on a small farm which was outside a small town in the midwest. most people knew everyone in the town and farms close by. everyone usually knew who the troublemakers, cheaters, thieves, et al were and could act appropriately. we did not need facial identity machines or technology. in cities and more dense areas of the country, everybody does not know everybody. however, why would people gain some sort of protection of their face when they are out in public? where is the expectation of privacy when a person is out in the public area? why shouldn't a camera record what the public can validly see for themselves?
miriam (Astoria, Queens)
@john boeger That's just it - you can't take your clothes off in a city street, but dollars to doughnuts you'll be among strangers, who don't know anything about you and probably don't care (unless you announce your views by what you're wearing). Unless you run into someone who knows you, or you're a celebrity or public figure, you're likely anonymous. Face-recognition technology changes all that.
Ellen Freilich (New York City)
You mean eye-rolls? Raised eyebrows? Holding back a smile? Looking faintly amused? Even enthusiastic? (Once in a while...) Do they need special technology for that?
BlackJack (Vegas)
Another big risk: The FBI is getting increasingly lazy about conducting actual investigations; instead, they profile people and try to put together a case against them based purely on circumstantial evidence produced by a computer search. Steven Hatfill won a settlement of $6 million because the FBI destroyed his life when they decided he was behind the anthrax attacks of 2001, not because they had a shred of evidence that he had actually done it, but because they profiled Hatfill as a likely suspect, based on nothing more than his profession (biodefense scientist) and the fact that he had had access to anthrax spores because of that profession. Hatfill fought back hard, so they had to let him go. They moved on to profile another scientist, Bruce Edwards Ivins, using the same profiling technique. They also attacked him with a series of dirty tricks calculated to destroy Ivins's life. When Ivins finally committed suicide over his ruined life, the FBI declared him guilty and closed the case. Don't think this couldn't happen to you. It could happen to anyone.
Anette (Ex-Paris)
For anybody doubting that this technology can do more harm than good, I suggest searching the Web for an interview with Yuval Noah Harari and Tristan Harris. It gives very good insight into what is at stake with the new technologies. Facial recognition is currently implemented throughout the city of Xinjiang in China to monitor the Uygur people. Once this is installed at large scale, nobody can stop the system from being misused. Imagine face recognition software in Germany, 1939-1945! Imagine face recognition software in the hands of a xenophobic, sexist, racist president. In my opinion, the outcome might be far worse than the Facebook manipulation in the 2016 election.
miriam (Astoria, Queens)
@Anette Xinjiang is a vast province in western China. Its Wade-Giles spelling was Sinkiang.
ondelette (San Jose)
@Anette Face recognition is not the most serious worry for the Uighurs. Few people, especially young people, know this, but China is about twice as big as it ever was as an empire. The parts that were never really theirs to claim -- Xinjiang and Tibet (including the parts that are now in Qinghai and Sichuan), parts of Yunnan and Guangxi, Inner Mongolia, and much of the Northeast -- were not fundamentally Chinese. The Han, who are native to the other, smaller part, are bigoted against most of those peoples, and most of those peoples don't really want to be Chinese. Face recognition didn't create that, and banning it would never solve it. The last time China didn't like a population in Xinjiang (the Dzungars), it wiped them out in a genocide. It is the watching eyes of the international community that have spared the Uighurs the same fate; what they need isn't bans on technology but for countries like the United States to call out China and demand better. Or maybe for Americans to think twice about buying their next Chinese-made pair of sneakers or cellphone.
bonku (Madison)
It's not that uncommon (though becoming rarer) to have professionals who are highly qualified and have both the ability and the desire to evaluate such emerging technologies for the greater good of the country and/or the world. But there are mainly two issues here. 1) Our increasingly expensive degree-buying programs, aka "education", basically force them to take lucrative jobs, mostly in the private sector, once they get their degrees. That deprives public service agencies of qualified, public-service-minded professionals; the public sector gets mostly mediocre people to start with. The growing "corporatization" of higher education has also enabled corrupt and/or mediocre students from rich and/or well-connected families to buy "prestigious" degrees, after which they have less trouble inheriting influential positions in various organizations. 2) More importantly, many decent and qualified professionals from working-class backgrounds, who benefited immensely from the socially beneficial policies of a past era (e.g., the GI Bill) early in life, have behaved equally badly. They became more interested in accumulating power and money by selling themselves to anyone who could buy their expertise. This has reduced trust in higher education and science among the general public and a large section of politicians; that trust is now at its lowest in the USA and many other countries, which also helps right-wing religious fundamentalists and dictators. There is a nice book on this issue: Tailspin, by Steven Brill.
magicisnotreal (earth)
Feelings are notoriously malleable. Appearances tell us nothing about strangers, and are only moderately useful if you actually know a person. Even then you are wrong more often than not. The idea that you know anything legitimately useful about another person if you know what they are feeling is detrimental to the development of one's ability to reason and use language properly. The usefulness of AI recognition is limited to arbitrarily restricting others, a fair number of whom you will be wrong about anyway. Let's not forget that this programming is notoriously biased and wrong about nonwhite people.
miriam (Astoria, Queens)
@magicisnotreal Whether it's accurate or inaccurate, it can get you into trouble, or raise the constant fear of getting into trouble, especially if it's thought to be more trustworthy than it really is.
Csmith (Pittsburgh)
If you don't want to be recognized in public for doing something illegal, don't do it.
kladinvt (Duxbury, Vermont)
Considering that the E-ZPass tech used on our highways can't even read a license plate correctly, why would anyone buy into this latest, less-than-perfect tech? Why would anyone think it would work?
Lisa (Jones)
You should know that this is happening in job interviews also: you are asked to record, yes, RECORD, your answers to questions, and that data is gathered to see if you would be a good employee. Seriously. There is no guarantee of how and where your recorded interview is going to be used.
John Dito (Oakland Ca)
The genie has been out of the bottle for a LONG time on this. I had ex-colleagues working on facial recognition for casinos in Vegas at least 10 years ago, and I guarantee those installations aren't going anywhere. If you've flown through China, your iris prints are in their databases. As long as Americans keep feeding on big SUVs and low interest rates, they don't give a darn about "privacy".
Paul (Scottsdale)
Like all other technologies it won't be stopped.
Lawyermom (Washington DCt)
Meanwhile, millions are happily posting their own photos every day. I am heartened by the likelihood that I will be further down the list of those to be rounded up. When they come for you, you will have only yourself and all your fab vacation pics to blame.
miriam (Astoria, Queens)
@Lawyermom I don't post my own photo - my vacation pics do not include me, and neither does my avatar (which I don't use everywhere). To find out what else I do to keep what privacy I have left, see Rhonda's posts on this thread.
Jason (Wickham)
Employers (and other humans) can already do this... with their eyes. It's called body language.
Mor (California)
So when a serial killer is caught using facial recognition technology, we should just let him go because...people of color? I should put my life on the line every time I board a plane since the facial recognition technology that could have identified potential terrorists was banned because...privacy? This is a joke. Surveillance is necessary in a large and complex society. London is the most surveilled city in the world, with CCTV in every corner shop. And this is why I feel safe there - as opposed to American cities. Facial recognition technology is here to stay, and I hope no misguided “liberalism” will prevent the legal use of this tool to curb crime and maintain order.
Kerry Girl (US)
As Wendell Berry wrote: Be like the fox who makes more tracks than necessary, some in the wrong direction.
Tournachonadar (Illiana)
Another Angel Heart moment for America: you know, when Robert De Niro's character "Louis Cyphre" aka Lucifer tells Mickey Rourke that he's already devoured his immortal soul. America is all about money-making, and anything that impedes us in the pursuit of greed will be swept aside ruthlessly. If facial recognition were banned, my own Federal law enforcement agency would still use it daily, as we already do. Too late to bolt those barn doors in some atavistic civil libertarian impulse. Just like protecting one's DNA is now impossible thanks to 23andMe and other lucrative operations...
Nancy Robertson (Alabama)
A life without privacy is a life not worth living.
Rhporter (Virginia)
I say use it. Inappropriate racial profiling is already a reality. You can’t blame face recognition for that.
H Smith (Den)
It's not legal. It's search and seizure, clear as a bell. It violates the Constitution. You cannot take photos of the contents of your home - which is you. You cannot stop somebody and question them, or take a picture. Cornell Law: The Fourth Amendment of the U.S. Constitution provides that "[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." It's a flat-out right that is unlimited. It applies to any organization that might take photos. And it applies to the pictures, not just the recognized ones. Some gas station pumps have vid cams in them. You should cover them with your finger when you stop.
Azathoth (South Carolina)
If it can be used to quell illegal immigration and identify those people who are here illegally, then I'll take the disadvantages in stride.
Mathew (Lompoc CA)
Agreed to all of this. I'm pretty strongly opposed to using biometric information for anything besides getting into my top secret spy lair. But I'm particularly opposed to facial recognition software. We don't need a system of cameras tracking us wherever we go. And use of this by government authorities is particularly troubling. Side note: my next phone will probably be a Librem 5, because I don't want my phone tracking me either.
McCamy Taylor (Fort Worth, Texas)
The best facial recognition "program" is that in our right brain. People who want to hide their identities or their emotions will have an easy time fooling a computer since they will be able to read the software and know what it is looking for. It is much more difficult to get away with a lie when you are face to face with another human being. So facial recognition for "security" is a silly idea and probably a waste of money. On the other hand, it can be useful for data mining purposes since some things are quantifiable--like the amount of time a person spends looking at one color and avoiding another color. I am not really afraid of targeted advertising, so they can data mine me all they want.
Working mom (San Diego)
Privacy is dead, and most of us have voluntarily, happily supplied all the tools for that to have happened. I think the benefits of facial recognition outweigh the possible abuses, as long as we can maintain a free democracy where people who abuse it can be ousted by voters.
miriam (Astoria, Queens)
@Working mom Happily surrendered their privacy? Grudgingly is more like it. I know my heart sinks when I have to accept cookies at a new website (new to me). To try to live off the Net may be possible, if you're not exhausted by all those workarounds. Few people have the stamina for that.
michaelscody (Niagara Falls NY)
Given the inaccuracy of facial recognition software at present, and the fact that that inaccuracy will probably continue to a large extent, I would agree that making any decisions based purely on its results should be illegal. However, if a crime is photographed and a list of possible suspects is generated by facial recognition software, I see no problem with the police using this as a starting point for their investigation. It is no different from, and more accurate than, looking at suspects based on witnesses' memories of the event. An arrest based solely on the results would, of course, be going too far into 1984 territory.
Rob-Chemist (Colorado)
This is one of the most ridiculous sets of arguments against a new technology that I have read in a long time. Even though facial recognition technology has been in use for several years, the authors do not, and apparently cannot, cite a single instance of something bad happening. The known and demonstrated benefits of this technology vastly outweigh any potential and extraordinarily unlikely negative consequences. In many regards, this is no different from the scare-mongering over the use of modern biology to generate plants with improved characteristics. A small group cited all sorts of potential horrors (the modified plants will cause entire plant species to die out, we will all get cancer, etc.). And, guess what, none of these fears came to pass, and these improved plants have increased our well-being.
mari (Madison)
I am not a Neanderthal, but I do worry about where we are going with all this. We are being tracked at work in every way - how much time you spend on the computer, how you are using it, and how "efficient" you are based on your mouse clicks. Never mind that most data so gathered is a case of garbage in, garbage out. This is not heading in the right direction. I have always believed there is overall more good than bad in the majority of us, and that the universe is progressing towards kindness. Nowadays, I am not so sure. Technology is but a tool. In today's world, led by a mob of unethical, greedy, self-absorbed folks masquerading as leaders, it is likely to fall into the wrong hands. We are all being willy-nilly led down the road to perdition. We have less control over our lifestyle; we are forced to adopt technology at work and at home. We have advanced so much in science, and yet progress has led to destruction of the environment, unhealthy lifestyles and a general lack of purpose. No real human connection, relentless competition when there is plenty to go around. Progress seems to have achieved material comfort and nothing more! And we all seem to have become widgets to keep the economy going. Nobody can pause and opt out even if they want to. So please don't push more technology on to me!
Mon Ray (KS)
The only people who have anything to fear from facial recognition software are people in this country illegally and other lawbreakers.
Viv (.)
@Mon Ray The US has the largest number of people in prison in the world. Pretty sure that the systems in use without AI are working quite well at locking people up.
GBR (New England)
One easy way to avoid technology interpreting your facial expressions : Botox!
Ron (Vermont)
There are other biometrics that can also be easily scanned from a distance, such as vein patterns in skin (using cheap IR cameras), changes in skin color due to heartbeat (heartbeat is also a biometric), gait recognition (how you walk), voice, how you type on a keyboard, and others. They should all be banned, since they have the same problems face recognition has. License plate readers, WiFi trackers, Bluetooth trackers, and other RF-based and optical trackers should be banned as well. No one should be allowed to track everyone in the world. If these technologies are not successfully banned, then everyone will start using them, including you and me (it's easy to download and run a face recognition app right now yourself to track your housemates and neighbors), and then everyone will be tracked by everyone else: criminals tracking cops, political opponents tracking politicians, kids tracking teachers, and teenagers tracking parents. There will be no stopping it. Google "OpenCV face" on the web. Google has introduced the ability to do real-time voice-to-text transcription on phones without an internet connection, so it is possible to identify people by voice and create a transcript of what they say at the same time, which makes words and phrases easily searchable per person. Specific words could be made forbidden to speak. China is already forbidding specific words on its Internet; the same thing could be implemented in public spaces right now.
miriam (Astoria, Queens)
@Ron "criminals tracking cops, political opponents tracking politicians, kids tracking teachers, and teenagers tracking parents." And parents tracking children, and criminals tracking potential victims.
Middleman MD (New York, NY)
This piece does not acknowledge the potential good that can come from facial recognition software that can read facial expressions. For example, employers and others could potentially use it to identify white nationalists who show evidence of scorn or distaste or fear when they see people of color. Software could also help to identify people who are not sufficiently disapproving of the racist policies of the Trump administration. These white nationalists blend in and, for example, may lie to pollsters about which candidates they support. AI could help to identify them so that they are kept out of positions of power where they will perpetuate the systemic racism on which the United States is founded.
Thad (Austin, TX)
We should ask ourselves what the potential benefits of these technologies are, and if we're comfortable ceding those benefits to less scrupulous nations like China. With its vast population and high resource investment, China will be at the forefront of any technology that relies on large data sets to develop it, like facial recognition and AI. When weighing the balance of security and freedom, we can't ignore the threat of totalitarian regimes being the first to open Pandora's Box.
Malone Cooper (New York, NY)
I don't see how this will be easy to control now that it is out of the box and has been proven to have some benefits. We are fooling ourselves if we believe that future technology will not soon be able to read our thoughts. With political correctness and cancel culture already gone awry, we will be doomed. Those in power will eventually have full control over everything we do, say and think. This is truly scary. Fortunately, I am old enough to realize that I will probably not be around when all this happens.
magicisnotreal (earth)
Almost nothing about tech is beneficial to the common man. All of the things you think of as benefits to you give exponentially more benefit to the tech companies than you could ever possibly get, to your detriment - usually erasing the benefit you think you are getting.
Edward (Philadelphia)
The real issue seems to be that we have way too many laws and way too many people totally comfortable using the state apparatus to violently imprison people in cages for what amount to social "crimes". Almost no one questions the proportion of sticking an autonomous human being in a cage for "crimes" that are three degrees of separation and an abstract ethical argument away from actually causing real harm that would require an action as violent as imprisonment to make another citizen "whole".
Professor Science (Portland)
I totally agree. Facial recognition technology is used by repressive totalitarian governments like China to control their populations and eliminate all opposition. Too late for the unfortunate Chinese people. American citizens need to act on this before it is too late for us!
John Joseph Laffiteau MS in Econ (APS08)
The application of AI in other fields, such as language translation, shows that huge inventories of words, expressions, and sentences must be collected to have a functional program. Similarly, an accurate read of facial expressions would require many, many images of the subject. The simple negative correlation is this: the error rate declines as the inventory of facial-expression images grows. Thus, the subject would have to be observed almost continuously for AI programs to capture the exact nuance of each facial expression and attain a very small error rate. In statistics, a good way to reduce the error rate of sampling inferences about the population is simply to increase the sample size. Increasing the sample size of facial expressions for a subject under surveillance would entail egregious violations of his or her privacy to collect a sufficient inventory of calibrating pictures. [10/18/2019 F 11:14 am Greenville NC]
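The sample-size argument the commenter makes can be illustrated with a toy simulation, unrelated to any real facial-recognition system: when estimating a simple proportion from n observations, the spread of the estimate shrinks roughly as 1/√n, so small image inventories mean large error.

```python
import random
import statistics

def estimation_error(sample_size: int, trials: int = 400) -> float:
    """Spread (std. dev.) of a proportion estimate across repeated samples.

    We estimate a known true rate of 0.5 from `sample_size` draws; by the
    1/sqrt(n) rule, the spread should shrink as sample_size grows.
    """
    rng = random.Random(42)  # fixed seed for reproducibility
    estimates = []
    for _ in range(trials):
        hits = sum(rng.random() < 0.5 for _ in range(sample_size))
        estimates.append(hits / sample_size)
    return statistics.pstdev(estimates)

small_n_error = estimation_error(25)     # roughly 0.10
large_n_error = estimation_error(2500)   # roughly 0.01
```

Going from 25 to 2,500 samples (100x more data) cuts the error by only about a factor of 10, which is why driving the error rate very low demands the near-continuous observation the commenter describes.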
Madeline Conant (Midwest)
Maybe one way to help equalize it would be to make sure employees can also read the facial expressions of their employer.
miriam (Astoria, Queens)
@Madeline Conant That won't equalize it much. The employer still has the power to hire and fire you, usually at will.
Marat1784 (CT)
Gosh. I’ve had employers who didn’t recognize my face, and even a few family doctors. Let alone making anything from facial expressions. Just think how many sad marriages could be improved if we had devices that had that app. Meanwhile, a few thousand ambitious medical students are considering plastic surgery as a career, one with distinct growth possibilities. And the big cosmetics companies are worried about their manifold products getting banned. So much to ponder.
Edward Allen (Spokane Valley)
There is a reason they wear masks in the protests in Hong Kong. Ban facial recognition. Make its use by corporations, private security firms, and especially the police illegal.
Stephen Merritt (Gainesville)
This technology isn't intended to benefit ordinary people. It's purely a means of control.
bill zorn (beijing)
it's a 'prisoner's dilemma', and, like privacy of data, it will likely be exploited by authoritarian states to our disadvantage. it will be developed here; the value of knowing a foreign leader's or a negotiating adversary's inner state of mind is too great to ignore, and too dangerous not to employ. our privacy rights are also likely to hamstring us. data is a national resource. the collection of big data will give advantage to the holder in multiple arenas, and be weaponized against those who fail to develop it, or limit themselves in their use of it.
bonku (Madison)
It's not that uncommon to have professionals who are highly qualified and have both the ability and the desire to evaluate such emerging technologies for the greater good of the country and/or the world. There are mainly two issues here. 1) Our increasingly expensive degree-buying programs, which some call "education," basically force graduates into lucrative jobs, mostly at private-sector companies, once they get their degrees. That leaves public service agencies with mostly very mediocre people to start with. This growing "corporatization" of our higher education sector into another for-profit industry also enables very corrupt and/or mediocre students from rich and/or well-connected families to buy "prestigious" degrees. They then inherit influential positions in private companies and in government. 2) More importantly, many decent and well-educated intellectuals from working-class backgrounds, who benefited immensely from the socially beneficial policies of a past era (e.g. the GI Bill) early in their lives, behaved equally badly. They became more interested in accumulating power and money by selling themselves to anyone who could buy their expertise. That also reduced public trust in formal education and science, which is now at its lowest in countries like the USA, arguably the science and technology superpower of the world. Tailspin by Steven Brill is a nice book on this issue.
F. E. Mazur (PA, KY, NY)
China is making evil use of facial recognition technology, as shown in recent reports on the Hong Kong protesters. This fact alone is all that should be needed to place serious controls on its use.
justin efrie (washington, d.c.)
It may be waaaaay too late to ban facial recognition software. Walmart began its facial recognition software program a few years ago (see articles on Walmart in North Carolina and New York). Annually, Walmart has 265 million customers walk through its doors. Now, that may be 265 million transactions, but if it's a "people count" then we're talking about two-thirds of the American population. It's done. And they'll sell and share your information just because you walked through their doors. These companies are not ignorant of the amount of money they can make on your identity information. NOTE: I have personally stopped making purchases at stores that have cameras. How can I accomplish this when it seems that ALL stores now have cameras? Order online and don't physically shop at the stores.
Viv (.)
@justin efrie Ordering online doesn't solve your problem. It just means that they can literally associate every single thing you buy with your name and address. Online ordering does not give you the option to pay with cash as you can in a real store. Facial recognition at Walmart and malls is done to ascertain broad characteristics of people: gender, age, ethnicity. For tracking individual (repeat) customers, tracking each person's cell phone is far easier and less costly/complicated. That's why many malls and stores offer free WiFi. They're not just doing it to be nice. It's very easy to catalog the device MAC addresses seen on the WiFi network and use them to see how frequently people return to the store.
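The repeat-visit cataloging described here can be sketched in a few lines; the log format and device addresses below are hypothetical, standing in for the hardware MAC addresses a store's access point would actually observe.

```python
from collections import Counter

# Hypothetical (device MAC address, visit date) sightings, as a store's
# WiFi access point might log them; all addresses here are made up.
sightings = [
    ("aa:bb:cc:00:11:22", "2019-10-01"),
    ("aa:bb:cc:00:11:22", "2019-10-01"),  # seen twice on the same day
    ("de:ad:be:ef:00:01", "2019-10-03"),
    ("aa:bb:cc:00:11:22", "2019-10-08"),
    ("aa:bb:cc:00:11:22", "2019-10-15"),
]

# Count distinct visit days per device; more than one day = repeat customer.
unique_visits = sorted(set(sightings))
visit_days = Counter(mac for mac, _day in unique_visits)
repeat_customers = sorted(mac for mac, n in visit_days.items() if n > 1)
# repeat_customers == ["aa:bb:cc:00:11:22"]
```

No camera or name is needed for this kind of profile; the phone's radio identifier alone is enough to build a visit history, which is the commenter's point.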
Allen Yeager (Portland,Oregon)
The classic example of using fear and intimidation to justify an action. Facial recognition will happen because people will demand it. The public will demand that private and government agencies make sure that they are doing -everything- they can to provide safety and security. We live in a changing world. Technology will change it for the better... even for those who fear it.
Viv (.)
@Allen Yeager Is that why there are mass shootings every month in the US? Because all of this technology is providing "safety and security"?
dnaden33 (Washington DC)
Since when has the banning of any technology been successful?
Sam (Newton, MA)
I'm not very convinced by this article. A lot of it is just innuendo: a claim of a bad effect that is then linked to something. I am not denying that it might be dangerous, but if you wish to ban something and criminalize it, you need to make good, coherent arguments backed by data. A ban inevitably means fines, confiscations and imprisonment for violators. Is that something we wish to increase? In the past we have too often jumped at banning something without understanding why we are banning it. GMO bans in the EU have no scientific basis, and scientists have been excluded from high-level discussions even when they calmly make reasonable points. There may be teething issues related to ethnicity, race or gender, but that is just because data sets are not big enough. In a short period of time, computers will be able to accurately identify all groups of people. I am not denying that this technology may have issues, but until I see a coherent argument, I feel like this could be yet another moral panic. Have we properly considered that facial recognition may exonerate innocent people? All too often, lineups have led to faulty identifications, because humans have strong biases and witnesses in trials are prone to error. And what about the potential economic benefits of this technology? Will banning it just give our overseas competitors a huge strategic advantage? I am not saying that this technology is safe. But let's not jump the gun without careful deliberation.
Viv (.)
@Sam If you cannot say that the technology is safe (because there is not enough data), then why should it be allowed? The onus has ALWAYS been to prove that something is safe, not to just assume that it is safe and wait for contradictory evidence to arise. These issues aren't just "teething" issues. Your flippant dismissal betrays not only your ignorance and/or lack of understanding, but your elitism as well. I'm willing to bet that you've never had to deal with the credit reporting "agencies" to correct your credit report, never faced negative consequences for housing or your job applications as a result of an incorrect credit report. It's very easy to say this from the comfort of Newton, MA, where the median household income is $134K, well over twice the national average.
Eddie K. (New York)
This would seem to be another futile attempt to get the toothpaste back into the tube. I agree there are numerous potential dangers to this type of surveillance, but it is a waste of time and energy to simply wish it didn't exist. We must instead address these potential dangers head on, and come up with reasonable means to keep it under control.
bd (Maryland)
Not to mention (why not NYT?) that facial recognition software can be used to identify about 1/3 of all known genetic medical disorders, only a few of which (like Down's syndrome) have such clear abnormality that they can be recognized on sight without software. I shudder to think how this may be misused in employment, insurance etc and worsen discrimination.
Mhevey (20852)
Making unenforceable laws is a waste of time. You may as well try to hold back the ocean with a broom. Banning software on the Internet? I'd rather ban mosquitoes if we are going to bother at all.
Mark1021 (Arlington, VA)
I, for one, welcome facial recognition as a means to board my airplane, clear immigration quickly and end the use of annoying passwords on my computer and at entertainment venues. Yes, I have nothing to hide, and I am not worried about a China-like state in the U.S.A. We are driven by commerce, not communist rule, and facial recognition will make life so much easier.
Mon Ray (KS)
According to a recent survey, 97.6% of criminals were against surveillance cameras and the use of AI to identify perpetrators of crimes. Similarly, in the late 1800s and early 1900s surveys of criminals were undertaken and 96.8% of them were against the use of fingerprints to identify perpetrators of crimes. I am pretty sure a huge majority of law-abiding Americans support any techniques, including surveillance cameras, that will make their lives safer, reduce crimes and apprehend criminals. The only people who have to fear surveillance cameras and AI identification are those involved in illicit behavior.
EB (Earth)
Those who oppose government regulation because they believe it undermines freedom are actually the least free of us all.
sdw (Cleveland)
This opinion column by Evan Selinger and Woodrow Hartzog is extremely important. Not only should we oppose expanding the use of facial recognition technology, we should roll back its use in the United States whenever and wherever possible. The technology is the antithesis of the American values of due process, privacy and freedom of association. The tech industry is trying to boost profits by foisting facial identification onto the American public, and law enforcement officials apparently are getting lazy. First, this trend will turn the United States into another Singapore – safe for walking around after dinner, but suffocating in its authoritarian arrogance. The next step will be a Beijing, Moscow or Pyongyang. Warrantless tapping of our home telephone landlines will be routine, just as it effectively is today for our cellphones. What do we do when Americans in single-family homes must apply for permits to lock our doors or to hang drapes in our windows?
PJF (Seattle.)
A lot of these comments, and commentary in the article and in the news generally, make the argument that facial recognition is not accurate enough, especially for minorities. I disagree. Actually, the more accurate it is, the worse it is. And bans have been focused on government use, but the bigger problem is corporate use of it to track and manipulate consumers. It should be banned completely. We can live without it (as we have for eons) and preserve a bit of privacy. No one but me needs to know whether I attended a concert, went to a particular movie, or attended an anti-Trump rally.
V.B. Zarr (Erewhon)
Look up the 19th Century pseudo-science of "phrenology" if you want to see where these facial recognition programs could be headed. We need to start saying no more often, and with more vigilant awareness, to all these biometric and social media snooping programs. Convenience and/or fear-mongering are no basis on which to live, as individuals or as a society. Whatever fears or conveniences are motivating people to buy into these programs will be greatly outweighed by the obvious abuses that can result from being policed by machines. Let's not become sheeple and slaves to the machine.
RamS (New York)
(1) It is already too late and (2) it is likely unconstitutional in the US. The question is whether its use will ever be revealed in a court case but I don't see how scanning the entire DMV DB of a state or Facebook would be acceptable in finding targets of interest for unrelated crimes (which will happen). I don't think fishing expeditions are allowed in the real world courtrooms and I don't think they'll be accepted virtually. (3) There are ways around it. This is what the pro- and anti- technology people don't realise - there exist so many ways to evade, mask, or disguise (not all the same thing) your digital presence. Some are complex but as a function of need, they can be deployed.
honestDem (NJ)
No question that facial recognition suffers from lower accuracy and greater bias compared to other biometric modalities including fingerprint and iris recognition. But the real problem with facial recognition is that it can be used from a large distance under ambient lighting. This enables facial recognition users to covertly identify unwitting individuals. And despite the various regulations that prohibit use without consent, the potential for abuse makes facial recognition too creepy and too dangerous for use today without serious ethical guidelines in place. Fingerprints have been used for better or worse for almost 100 years. Iris recognition might sound creepy but it is accurate and can only be used practically from 1 to 10 feet. But face -- that's the one that we need to seriously consider before allowing into the public domain.
Archibald McDougall (Canada)
Not going to happen - the barn door has been open too long, and the horses disappeared years ago. Corporations and governments have invested too much; there's too much money still to be made. Neither will willingly walk away.
Sage (Santa Cruz)
The "precautionary principle" applies (as it so often should, but in practice it is too commonly tossed aside for the sake of fashionable techno-addiction): we can always adopt some circumscribed version of this new technology later. The only real drawback to waiting is that suffered by move-fast-and-break-things corporations who cannot rake in billions quite as quickly or easily. If, on the other hand, we rush in pell-mell, that could well further stress and disrupt an already overstressed and disrupted culture, long on whizz-bang novelty and short on common sense.
sam (ngai)
The benefits do not come close to outweighing the risks. What benefits? Lots of risks, though: look at China and Hong Kong, and India is very interested too.
honestDem (NJ)
@sam You mean the folks in India who perished after being denied food at the local distribution center when their biometric IDs didn't match? Or the citizens of China who can't leave their houses without being tracked by the latest networked face-biometric systems? With new technology comes the need for policy, regulation and ethics. This is as true for biometrics as it is for pesticides, drugs and weapons.
Honey (Texas)
If you think that the facial recognition databases are just now beginning to be developed, you're 25 years too late. I am aware of a company that provided the intelligence community with very accurate technology in the 1990s. It's not a question of it becoming "more accurate." This is nothing new and the data has been collected for decades now.
Kriss (Australia)
I'm somewhat nervous about the tech, but too old to be in the frame when it's fully developed. It could be useful in America, though, for tracking gun owners by their purchases of weapons or ammunition if implemented at points of sale. You would know how many they have and what type.
Casual Observer (Los Angeles)
Facial recognition software is not like fingerprints or DNA; it's a guess from making comparisons, and it can be expected to produce abundant false positives. However, it could be useful in eliminating people who simply are not at all like the comparison photo. The idea that people can be controlled just by viewing their faces is kind of beyond the purpose and capacity of just comparing faces in real life. People's moods can be identified from the contours of faces, but that is not always accurate, is it? What is at work here is attempting to prevent a remotely possible bad outcome by eliminating anything that might contribute in any way to that outcome.
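The "abundant false positives" point can be made concrete with a base-rate calculation. The numbers below are purely illustrative, not measurements of any real system: even a matcher that sounds accurate produces mostly false alarms when the people being searched for are rare.

```python
# Hypothetical scenario: scan 1,000,000 faces for 100 persons of interest
# with a matcher that has a 0.1% false-positive rate and a 99% hit rate.
population = 1_000_000
persons_of_interest = 100
false_positive_rate = 0.001
true_positive_rate = 0.99

# Expected counts of wrong and right alerts.
false_alarms = (population - persons_of_interest) * false_positive_rate
true_hits = persons_of_interest * true_positive_rate

# Fraction of flagged faces who are actually persons of interest.
precision = true_hits / (true_hits + false_alarms)
# Roughly 99 real hits vs. about 1,000 false alarms: precision under 10%,
# so most people flagged by the system are not the people being sought.
```

This is the standard base-rate effect: the rarer the target, the more a fixed false-positive rate dominates, which is exactly the eliminating-versus-identifying distinction the comment draws.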
cark (Dallas, TX)
@Casual Observer A "remotely possible outcome"? Please read Rhonda's spot-on comment. Facial recognition such as is already being implemented in China is a "dream" for any dictatorial type of government since it allows great control of its citizens and foreign visitors. Aside from possible criminal prosecution aspects, it can be used as a demerit system to determine who are "good citizens" (the "standards" of course being determined by the government) who can be allowed to travel abroad, own homes, cars, etc.
honestDem (NJ)
@Mon Ray Perhaps Brandon Mayfield, the guy incorrectly nabbed for the infamous Madrid train bombing by the FBI using faulty fingerprint technique, would disagree with your closing line. Biometrics and other modern law-enforcement techniques are handy, but they are as error-prone as their human users. Regulation, policy and ethics need to catch up with these methods so that inevitable errors are reduced. In the meantime, all people need to be concerned about AI identification by face carried out by law enforcement, especially those women and darker-skinned citizens whose faces produce higher error rates with such technologies.
Rhonda (Pennsylvania)
I agree that facial recognition software needs to be banned for all of the reasons mentioned, but the idea that "They will want to identify and track you. They will want to categorize your emotions and identity. They will..." is not a thing of the future. It is happening everywhere the tech companies can get away with it. If we want to see a present-day example of how bad things can get, we need only look to China, where facial recognition technology is used to keep the Uyghur Muslim population contained effectively as prisoners, and where each person is given a "ranking" based on whatever data the government can gather to determine how often to harass that person or family. China wants to roll out a social ranking for every single one of its residents. Imagine the implications of that. If the government decides you are unworthy, then you will never feel safe, let alone get a decent job. People talk about safety and how easily this technology can be used to identify criminals, but what if the crime is simply having your lights on past midnight? Technologies that track and identify can easily prevent people from gathering together to express discontent, let alone fight for positive change, which is in part why the Hong Kong executive, catering to China, recently banned face masks at protests.
expat (Japan)
@Rhonda Do you have a cell phone? Is it on? You're carrying a tracking device. Does it have your contacts and browser history on it? Your life is an open book.
Rhonda (Pennsylvania)
@expat I do, and I'm apprehensive about it. I keep location tracking off (knowing full well my location can still be determined), but I rarely use apps. I also don't carry it everywhere I go. I never installed Facebook on my phone, and I deactivated my account close to two years ago; I was disgusted by the amount of data it retained on me (I downloaded it). I know I give up privacy and security every time I use the internet (just as I know every purchase I make with credit and debit cards is compiled and analyzed), but I shouldn't have to live as a hermit underground. And who are we kidding, there is no "safe" hiding space should things take a turn for the worse. Much data was being collected before we even thought about such things, and it's too late to undo that. The way it goes these days, employers also judge you for NOT having open, publicly accessible social media accounts. What you do or don't do is judged.
Bill Brown (California)
@Rhonda I just want the crime problem in my city addressed. Does anyone in their right mind think the status quo of vague suspect descriptions from eyewitnesses is better? ("White male, late 40s, brown hair, wearing jeans -- be on the lookout" or "black teen, dark complexion, wearing a hoodie -- be on the lookout.") The irony is that academics, criminal justice skeptics and, yes, leftist fanatics have written, debated, and screamed ad nauseam about the flaws in eyewitness testimony. Yet now an A.I. solution is deemed "problematic." And let's be real: the media also talk ad nauseam about "innocents" caught up in "the system," but hardly ever about "the system" letting go of a bona fide criminal, which happens far more often. Fear-mongering articles on facial recognition without exception fail to mention that prosecutors have been getting confessions and convictions for a very long time based partly on identifications by humans, for instance in lineups or mug books, or by police officers. At worst, facial recognition systems have partly automated what human witnesses do. These mechanized comparison systems make occasional errors. But guess what? So do people, whose testimony is well known to be unreliable, and it is not clear that it is any better than the dreaded machines. The point is that facial recognition is at least as likely to be accurate as the traditional methods of human identification that have been used for centuries to finger criminal suspects.
Bill (CT)
This technology is used extensively overseas, especially in England. Just walk down any street in any British city and look up; you will see CCTV cameras at every street crossing. Ask yourself: wouldn't you rather have criminals and terrorists spotted before they do something bad? As for job interviews, your resume has been scanned by a computer, you've been cross-referenced on Google and LinkedIn and Facebook, and an algorithm has rated you before you even get to the interview.
V.B. Zarr (Erewhon)
@Bill You're absolutely right about job applicant A.I. pre-screening and the British surveillance state, but that stuff needs to be challenged (and rolled back) too.
Eli (NC)
@Bill I have several clients in England and I have never dealt with such paranoid, suspicious people. I have to laugh because several were so paranoid that they delusionally refused to accept large sums of money from an intestate estate because it "must be a hoax." This is despite a 254-page sworn affidavit of genealogy. Indeed, they think even the reputable probate attorney is "part of the plot" and that the American probate proceedings are a sham (they can be easily verified online through the Clerk of Court). I have found this every time I have to deal with Brits; if they are all this delusional, perhaps they need complete control and oversight by their government at all times.
Larry L (Dallas, TX)
@Bill, then I should have the same right to turn on the video recorder on my phone, scan the interviewer and analyze the interviewer's facial expressions. Is that right being enforced? I didn't think so.
J. Cornelio (Washington, Conn.)
Barack Obama himself declared that if it would save "one child's life," then putting tens if not hundreds of thousands of human beings on a 'no-buy' gun list would be justified. In other words, safety concerns will almost always trump (pardon the reference) privacy concerns. Just wait for the news story, and the next news story, and then the next, reporting that if facial recognition technology had been utilized, then this, that or the other horror would not have occurred. Nope, we are on our way to 1984 and, as so many sages have noted, it will be completely voluntary. All I can say is that I'm glad I'm in my mid-sixties and am on my way out of what appears to be an inevitably dystopian future, one which we fear-filled creatures will have created for ourselves.
Alan (British Columbia)
@J. Cornelio, for it to be completely voluntary there should be informed consent. How many people reading this page understand that there are dozens of trackers following their browsing on this site and most others? The Privacy Project is hugely ironic.
Apathycrat (NC-USA)
@J. Cornelio I'm sad to say that I largely agree w/ your prediction. As someone having spent decades in tech, I think each citizen/consumer needs to assess each new technology and decide if it's worth it or not... not only in liberty/privacy terms, but even general quality of life... or geez even personal financial soundness (a la gold-plated iPhones lol).
RamS (New York)
@J. Cornelio If that's true, then why not go after guns?
Jack Frost (New York)
The moment universal digital recognition software is used, a person loses their privacy, dignity and right to exist without scrutiny and invasion of their personal space. Digital recognition technology allows people to be scanned like cattle being led to slaughter. The dangers of this technology may not be readily apparent, but they do exist. We have the right to be anonymous. We have the right not to be recognized and recorded, or to have electronic images of ourselves distributed without our explicit permission. Digital facial recognition in its present state is unsupervised. No one is responsible for wrongful identification at any level. There is no way to clean databases or confirm that the images scanned and stored are correctly identified. There are no laws in place that govern the administration of digital recognition databases. Law enforcement and security personnel answer to no one. What corporate entities have any policy regarding digital recognition software and its uses? Who has the right to access any databases that may contain digital facial recognition data? And who determines how the images are used and distributed? How are wrongs identified and corrected? There are thousands of unanswered questions, and we've just skimmed the surface of the issues. How do we protect the rights of medical patients, prisoners, students, children, and citizens across the nation? We must end the use and spread of this technology now.
Mon Ray (KS)
@Jack Frost According to a recent survey, 97.6% of criminals were against surveillance cameras and the use of AI to identify perpetrators of crimes. Similarly, in the late 1800s and early 1900s surveys of criminals were undertaken and 96.8% of them were against the use of fingerprints to identify perpetrators of crimes. I am pretty sure a huge majority of law-abiding Americans support any techniques, including surveillance cameras, that will make their lives safer, reduce crimes and apprehend criminals. The only people who have to fear surveillance cameras and AI identification are those involved in illicit behavior.
Dom (Lunatopia)
We actually need nothing short of a constitutional amendment codifying a right to privacy that extends to any data related to a person, whether it's their face, DNA, how they move, what their voice sounds like, or what they think, and that requires explicit consent for its use. Given all the terrible things that happened to humans in the 1800s and 1900s, and the oppression of individual expression and freedom this technology can unleash... one need only look to the great country in the east for a preview of what's to come.
Elizabeth Moore (Pennsylvania)
@Dom And when what you imply happens, people who refuse to give their consent can be denied any access to the internet at all. In fact, businesses would be most prudent to exclude such people from the use of their services. Why? Because it creates a set of circumstances that would allow a person to sue companies of all sorts for the "abuse" of their data (voice, for example) whenever the person actually chooses to use their voice, fingerprint, or other biometrics to activate software or equipment of any type. For example, there are now television remote controls that are voice-activated. There are also virtual assistants like Alexa, Siri, and Cortana, which are also voice-activated. To USE THEM a person must choose to call out to them using their actual voice. The controls you mention would allow actual users of those types of technology to sue the companies that produce them under all sorts of circumstances, which makes no sense at all. It would completely stifle innovation and put an end to the internet as we know it. A much more viable solution is for all data collection to end absolutely, for search engines to charge businesses for priority in searches instead of relying on advertising, and for all search engines and social media websites to be subscription-only, where users' data is not harvested but they must pay a monthly or annual fee to use the service. Pay-as-you-go internet is the answer.
Mon Ray (KS)
@Dom According to a recent survey, 97.6% of criminals were against surveillance cameras and the use of AI to identify perpetrators of crimes. Similarly, in the late 1800s and early 1900s surveys of criminals were undertaken and 96.8% of them were against the use of fingerprints to identify perpetrators of crimes. I am pretty sure a huge majority of law-abiding Americans support any techniques, including surveillance cameras, that will make their lives safer, reduce crimes and apprehend criminals. The only people who have to fear surveillance cameras and AI identification are those involved in illicit behavior.
Edward (Philadelphia)
@Dom I shouldn't need your consent to use a camera with facial recognition on my property or aimed at a public space where you have no right of expectation of privacy to begin with.
Robert David South (Watertown NY)
Useful technology cannot be banned. Once discovered, the genie cannot go back in the bottle. All you can do is create unequal access to it and/or add compensating changes around it that make up for its effects. Sure, this argument is similar to "when guns are outlawed, only outlaws will have guns." That's because it's technically true. It's just that actually guns are never fully outlawed: the police and military (at least) have them, and/or licensed regulated trustworthy citizens. When facial recognition technology is outlawed, that will only prevent citizens, businesses, and authorities (good, bad, and indifferent) from using it. Criminals will still have it because if you think it's hard to keep guns, drugs, or nukes from proliferating, try keeping a lid on software.
j03y_ (Washington Dc)
@Robert David South I don't expect or fear isolated criminals establishing national databases and infiltrating the government and police forces to use this technology. Nuclear weapons are "useful technology," and they can be banned. Also, we may not be able to ban the algorithms at the root of facial recognition themselves; these are clearly the subject of university and corporate research. We can, however, ban governments and businesses from deploying them en masse.
Jorge (Pittsburgh)
@j03y_ — With regards to your not expecting criminals infiltrating the government, I suggest that you read the news.
Samuel Owen (Athens, GA)
@Robert David South Technology is merely a tool so it certainly can be removed from use. And tools need not be inanimate things only. Soldiers, slaves, employees & gang members come to mind.
W in the Middle (NY State)
Really... And what of: > Fingerprint recognition > Voice recognition > Iris/retina recognition > DNA recognition Beyond recognizing STEM as a four-letter word, you don't even begin to recognize what the world really works like...
Mr. Chocolate (New York)
The moment I'd get noticed for skipping church I'd move back to Europe 😂
miriam (Astoria, Queens)
@Mr. Chocolate Wouldn't the people in the church know you weren't there?
August West (Midwest)
"Like other media companies, The Times collects data on its visitors when they read stories like this one. For more detail please see our privacy policy and our publisher's description of The Times's practices and continued steps to increase transparency and protections." Pretty much says it all. Wake me up when NYT is ready to walk its talk. Until then, this seems much ado about not much.
expat (Japan)
I have students who work in this area, and while I largely agree with the authors on facial recognition software, if we curtail or restrict the development of technologies because we fear they will be misused by governments and institutions that would violate human and civil rights, we will soon find ourselves with limited technological resources. It is frequently the case that what scientists learn in developing one technology makes other innovations possible. Finally, who is empowered to determine the potential threat posed by these technologies?
Mannyar (Miami)
Unfortunately, the face recognition genie can't be placed back into the proverbial bottle. It's too late. Military, intelligence, federal, state and international adoption of this technology is widespread, and now it is quickly seeping into the private sector. China has perfected its use integrated into its "social credit" monitoring system, and no one can completely regulate this technology again. In the future, there will be no privacy, as data linkages will connect social media, industry, intelligence and commerce together to access a trove of information. That ship sailed long ago.
Jack Frost (New York)
Imagine if the Nazis of World War II Germany had had advanced digital facial technology. Now picture criminal-minded politicians, law enforcement officials, lawyers, bankers, corporate moguls and others with wealth and power in the United States who couldn't care less about the privacy of others. This is a technology straight out of 1984, one we are not ready for and never will be. Big Brother has just made a digital image of you and placed it into a database to be used later at your trial. Or at your execution. We need privacy because it is an integral part of personal security and well-being. Anything else is an invasion of our personal space and our right to live in peace. It's time to protect our privacy. Silicon Valley, Facebook, Amazon and other mega-tech corporations and organizations must be banned from using this deeply invasive and troubling technology. Our freedom and our democracy are at risk.
MEW (California)
Wow, this is a powerful issue the tech companies don't want us to talk about. I have no idea when we decided that because social media and the internet make it so easy to share everything, we no longer have a right to privacy. Wrong! The potential for using facial recognition to track you in your everyday life would be an extreme invasion of privacy. Sure, dictators would love it, but a democracy has no room for facial recognition software's potential abuse. Regulate it, and quickly!
Christopher Diggs (USA)
Nothing good will ever come from this fifth element nonsense. How did we ever let real life mirror the movies?
Andrew (Louisville)
In April NYT reported that they had, legally and for less than $100, identified a man strolling at Bryant Park from a publicly available webcam stream and the man's Linked In picture. With his cooperation, NYT showed the (pretty poor) picture from the webcam and the portrait from LI. This was to me an astounding revelation about the power and potential ubiquity of FRS. Six months later I am sure that the software is better. https://www.nytimes.com/interactive/2019/04/16/opinion/facial-recognition-new-york-city.html?searchResultPosition=1
nowadays (New England)
"If you’re online — and, well, you are — chances are someone is using your information. We’ll tell you what you can do about it. Sign up for our limited-run newsletter." Isn't it ironic that the Times wants our email in order to receive the Privacy Project newsletter?
Elizabeth Moore (Pennsylvania)
My only concern is with criminals getting away with their crimes. I was very unhappy when Apple refused to allow the FBI to see the contents of the phone used by the San Bernardino terrorist who killed 14 people and wounded 22 others. Then I learned that Apple has turned down more than 175 other requests to unlock phones owned by criminals, including a request regarding a phone that may have been used for the creation of child porn. I am of the firm belief that no criminal has any right to privacy, especially when they have actually committed a crime or there is a high possibility that the phone was used in the commission of a crime. I feel the same way about facial recognition software. It is in the public interest to know who the criminals are. I believe that law enforcement authorities have the right to scan the faces of convicted criminals in order to identify them in later crimes, or to identify escaped prisoners and the like. The software could also be used to catch sexual predators who fail to present themselves to the authorities when they move, or who move into communities where they are not supposed to be living. Public safety outweighs the right to privacy for criminals. Ordinary citizens, however, should not be subjected to any type of surveillance at all.
sam finn (california)
Technical accuracy and reliability are the only relevant questions. And the standard against which to measure them is the average accuracy and reliability of the observation skills and memory of human observers. If the testimony of human observers can be used -- and that is exactly what "eyewitness testimony" is, and legally it can be used, and always has been -- then there is no reason whatsoever why observation devices, and analysis of the data those devices gather, cannot be used legally. Yes, challenge the reliability of the "eyewitness" by cross-examination. Likewise, challenge the reliability of the observation devices and facial recognition programs by cross-examining those who design and operate them and those who maintain the databases. And yes, allow contrary testimony by expert witnesses (subject, of course, to cross-examination as to the extent of their alleged "expertise"). But do not simply ban the devices and the programs altogether. Why should "privacy" suddenly be introduced as a relevant factor?
miriam (Astoria, Queens)
@sam finn Why put the word in scare quotes?
Alan (Columbus OH)
When Jim Jordan, the king of complicity, and AOC, NY bartender-turned-Green New Deal and high-speed rail proponent, agree on something, it is likely either a very good or a very bad idea. When that something is about limiting the power of law enforcement and is not agreed upon by everyone else, odds are it is a very bad idea. Jim Jordan does not hesitate to rant against the FBI, and the only law enforcement he seems to like are CBP and ICE, perhaps because the trans-national criminals he favors have the background and means to circumvent them. The massive boondoggles AOC supports under the guise of environmentalism would be a giant windfall for shady contractors and would likely come with no accountability. If Congress were a high school graduating class, these two would likely get voted "most likely to favor organized crime." Whatever they agree on regarding law enforcement, odds are we should do the opposite. The authors might want to watch the Columbo episode "Identity Crisis" (iirc, season 5). Imitating someone else to further crimes is a time-tested tactic, and not something normal people do for any serious purpose. Facial recognition would do a lot to reduce the effectiveness of such tactics. One can be certain criminals are using it to protect their schemes from infiltration and will continue to do so whether or not it is banned in the USA. Trans-national crime is trans-national for a reason.
Thomas (Lawrence)
Facial recognition software has tremendous potential for fighting crime and could benefit us in other ways as well. It certainly needs to be regulated and rolled out cautiously, but the idea of banning it forever is extreme.