San Francisco Bans Facial Recognition Technology

It is the first ban by a major city on the use of facial recognition technology by the police and all other municipal agencies.

Comments: 178

  1. Only authoritarian States monitor their people. Every Progressive candidate and every blue state must lead the way in rolling back the police surveillance state.

  2. @Fourteen14 This is not being used to "monitor" people. It's just an automated version of photo ID by an eyewitness, probably more accurate. Nobody will ever be convicted because facial recognition erred. The courts will require much more than this. But people have been convicted because an eyewitness got it wrong.

  3. @Fourteen14 - With all due respect there is no such thing as "the police surveillance state". I am a Libertarian and I am all for rights and freedoms, but most facial recognition does not do anything other than help law enforcement capture dangerous criminals who threaten us all. I understand where you are coming from, and your concerns are very real. But there are bigger things to be worried about these days than who is watching you; we should be more worried about what the people who watch us do. We need to all face that living in the 21st Century this will be the new norm forever more. -- Also, authoritarian states are not the only ones who monitor their people, they are just the ones that lock people up for saying what they don't want to hear. Surveillance is only bad when it is used with actions that infringe on people's rights.

  4. As I learn more about the capabilities and especially the pervasiveness of this 'facial recognition' surveillance, the "V for Vendetta" Guy Fawkes face mask keeps popping into my thoughts. How long before something like that face mask becomes a 'socially accepted' way of preserving the illusion of personal anonymity when in public in a crowd...? I'm sure there are 'walk recognition' and 'posture recognition' programs in the works to make sure that every illusion of public anonymity is just that, pure illusion. No doubt the NSA and police agencies will reassure us that "it's all for your safety" and "if you have nothing to hide, why do you need privacy anyway?" -- although they might not actually say the second phrase. That seems to be what all the apologists for the surveillance police state feel... nobody will miss privacy and anonymity too much, until it is already far too late. It may be already.

  5. @Jim Brokaw Two years ago I purchased the best new Surface Book, fully loaded. During the process of setting it up, a flash went off, without any permission request or forewarning. I was shaken and immediately shut down the Surface. I then blocked the camera with black tape which is still there. I have recently reset the Surface to factory settings, tape still on. But who knows where the stolen photo has gone, to which database it now belongs. Do I have anything to hide - No! But I belong to a time when we actively resisted governments on moral grounds: the Vietnam war, women's rights and political freedom. I stand on this ground still today and to the little munchkins who say "It doesn't matter", I say Yes! It matters!

  6. Gee, I guess the Millennials haven't seen A Clockwork Orange or read George Orwell's book 1984. Oh, that's right, they make money for making it and aren't responsible for its dissemination against humanity. Like the Nazis of World War Two, they were just following orders.

  7. I'm pretty sure most of the people who initially made this technology weren't millennials by any means. Also, lumping a group of people into the same age category to describe the way a generation behaves is simply dumb and insulting. Generalizations anywhere just make the entire intelligence of humanity rot just a little bit every day.

  8. @haleys51 Let the state fear the People, that's the way it should always be.

  9. Some version of this tech is clearly a major turning point in our ability to both deter and investigate criminal activity. Should it be heavily regulated? Of course. But banning it outright? Seems like a knee-jerk response that ignores the major benefits to law enforcement agencies.

  10. @Dan I’d say the ban is right because recent events have shown that we cannot trust that the police will be policed.

  11. @Dan Technology should be heavily regulated - in general - yes. But have you noticed anything out of Washington other than de-regulation, for years now, in every sector?

  12. Thrilled to hear this. Our major tech companies are fallible. We’ve repeatedly been told our personal data is protected, only for the opposite to be revealed. The LAST scenario I want to find myself in is one with a corporate apparatus following me around offline and online. A literal nightmare. I can’t even be sure TODAY that one of my iPhone applications isn’t recording my conversations or storing location data it shouldn’t be. I might be living my nightmare.

  13. @ Wyatt. Apple knows where you are every minute thanks to your phone and your phone company. Unless you leave it at home, of course. It’s a bit late for privacy — pretty much everyone has GPS.

  14. @Wyatt Is it all or nothing? Britain has used CCTV in public places for many years to great effect. Hyperbole about worst-case scenarios leads to nothing being done, which is what San Francisco politicians excel at - grand gestures but no substance.

  15. Disgusting. We have all lost our freedom. There is no turning back. Countless people and society will suffer. We are all just cost centers to the greater AI force.

  16. “It is hard to deny that there is a public safety value to this technology.” I deny the public safety value, the human cost, and the necessity of this oppressive and intrusive technology designed to monitor our movements and control the freedom of our action. Make surveillance a crime.

  17. The news here said today that a recent test of the facial recognition tech claimed that 23 black members of Congress are criminals. So much for its accuracy.

  18. How do you know it’s not accurate?

  19. @nuicam That definitely is fake news!

  20. @niucame Congressmen criminals? Sounds like facial recognition is flawless.

  21. It is a brave new world. There is facial recognition *technology*, and there is *policy* in using it. I don't think that an argument for using the technology is that it seems okay to use. If it is theoretically possible to abuse it, then, should we decide to use it, we'll have to design safeguards in the system to protect individual privacy. This is exceedingly difficult to do to the level that guarantees high, let alone perfect, assurance of individual privacy. But proponents see facial recognition technology's usefulness. This too is theoretical to some extent and needs to be validated; the argument for using it has to be more than a feeling, and not be based on fear, uncertainty and doubt. Facial recognition is the dual of privacy in cell phones. Installing a back door in cell phones that lets government get at the private data on the phone weakens the security of the phone and endangers cell phone users. Those who believe that the back door can be used to find that particular terrorist are operating on a theoretical, imagined argument. Similarly, for facial recognition, proponents imagine finding that one terrorist with the technology, but it can weaken the security of the population by being used to abuse individual privacy rights. If we deploy facial recognition throughout our society, we'll have indeed entered the Huxleyan brave new world. If that happens, I think I'll spend the rest of my time on soma.

  22. Hope they don't regret this in a big way.

  23. If San Francisco banned a tool that would help apprehend criminals, they deserve what they get. A taxpaying citizen who becomes the victim of a crime committed by a criminal who could have been taken off the street beforehand, had facial recognition software been used to identify him, has a huge claim against the City.

  24. @Samara And when a tyrant rises up and takes power. When our rights are revoked what then? When the government knows where everyone is 24-hours-a-day. Knows what we read. How we spend our money. Who we text or speak to on the telephone. Who we vote for. What we think and believe. WHAT THEN? The idea of citizens sacrificing all rights to privacy in order to make law enforcement more efficient is folly of the first order. Willingly subjecting ourselves to this invasive terror is motivated by fear and paranoia. It is cowardice. We are Americans. And Americans do not willingly submit to the transition of their country into a police state!

  25. @tom Your post sounds paranoid due to creative worrying. Public is not Private.

  26. @Samara I assume then you'd have no issue wearing a police monitoring bracelet at all times? Skip on down there Johnny (or Janny) American, I'm sure they'd be happy to outfit you with one. I suspect you wouldn't, despite this level of surveillance being effectively the same thing.

  27. There is good reason to worry about the increasing loss of our personal privacy online and otherwise from corporate and government actors, but there is little reason why facial recognition technology, and for that matter widespread use of CCTV, in public places should not already be common and tolerated. Remember, this is technology that will look at persons in public places, like sidewalks, in cars, on transit, in stadiums, theaters and shops--no one has any reasonable expectation of privacy in any of those or other public places. This is not about being secure in your homes or your personal affairs, this is being out in public where you have voluntarily placed yourself for anyone to see. False positives and false IDs will occur, but the advantages far outweigh them and the technology will improve. As a San Franciscan, this is simply another example of the SF Board of Supervisors making poorly thought-out decisions on a broad policy basis.

  28. This is about tolerating being stalked by someone afar. Creepy! First step to tolerating totalitarianism.

  29. @Cazanoma Recently I was in a casino and approached by security. I was told that my face matched someone that was banished from the casino. I simply showed my driver's license to prove I was not their person.

  30. @Mozart And that ended the encounter. The system worked. No problem.

  31. We have reached the tipping point and are about to embrace Big Brother. Good for San Francisco - not that it will do much good in the long run.

  32. While the ban affects the police force, it doesn't affect stores or shopping centers. One of the most memorable movie scenes, ever, was in "Minority Report," way back in 2002, when Tom Cruise ("John Anderton") walks past a number of ad displays (e.g., Lexus, Guinness) that call out to him specifically, with knowledge of his interests and past purchases. Later, after he has had some eye surgery, the displays think he's someone else, but ignoring that (it's a murder mystery story), will we want shopping malls and multi-story department stores to direct us toward things we're likely to buy, just as logging into a shopping site does now? Time-saver, snoop, or both? (The video clip is at https://www.youtube.com/watch?v=uiDMlFycNrw)

  33. Luddites still exist. They mostly live in California. We have a few here in Texas. The state just outlawed red light cameras as being unacceptable; one cannot cross-examine a camera. I mean, really! When you are in public, you don't have privacy. Public is not private. I doubt anybody will ever be brought to trial, let alone convicted, because the facial recognition software erred. Suspected, questioned? Sure. But that's it. It's just a method of reducing the suspect pool. I expect it errs less than the positive IDs made by eyewitnesses.

  34. @Austin Liberal Actually, the place you are *most* likely to meet red light cameras is in California. Where you must have front and back license plates unobscured by anything so that they can be clearly read from the front or the rear, and so that the cameras can accurately read them if you go speeding through that clearly marked intersection with the sign that says "Red Light Warning: Minimum $400 Fine", Flash, Shot of Back License Plate, DMV Sends You the Bill. Pay within 20 days or the fine doubles. We don't have them in NM. Luddites live here. You have to be stopped to get fined and we allow people to drive around with dark license plate covers on the (back only) plate without consequence.

  35. @Austin Liberal have you seen how China uses it to surveil its citizens? Let’s not emulate the Chinese in this regard.

  36. @Austin Liberal. Do you know about the red light camera fiasco in Chicago?

  37. This is the right decision by San Francisco. As a lot of other states head towards "The Purge" mentality it's good to see some sanity.

  38. A sad loss for the ski mask industry.

  39. And what about all those NEST door cameras? Should Google be the only one with the knowledge of who is where, and when? Can the police then subpoena them?

  40. It is time for every city and state in this country to take a long, very serious look at the issue of the individual citizen’s right to privacy. The federal government has failed all of us by neglecting to address this crucial question. Privacy-related issues brought on by advances in technology have been ignored, in some cases, for decades. The current situation is “1984”-nightmarish and only seems to grow ever more threatening. If Congress and the Supreme Court won’t take action to protect us, let us encourage our state and local governmental officials to address this ever-widening area of concern. Good for you, San Francisco!

  41. Ridiculous. What’s next? A ban on using fingerprints?

  42. @JQGALT ... ever heard of anyone mistakenly arrested on the basis of fingerprint evidence? How often do fingerprints produce false positives?

  43. Ever hear of someone falsely arrested based on facial rec?

  44. @JQGALT Just thought you might like to be aware that WE know who you are, where you live, what you watch/eat/buy, and even who some of your friends/acquaintances are (and what they w/e/b..). Enjoy your day.

  45. "1984" all comes true. Big brother is watching. So wear your hat low or wear a mask. Privacy? A thing of the distant past.

  46. @Pataman Never voluntarily say that privacy does not exist. That is why we are losing it.

  47. It's not going to be helpful with criminals who pull their hoodies tight around their faces, but I'd like to see the police use it - if they have a warrant. Technology that could help fight car break-ins, package thefts, maybe even tagging! Of course, our DA and courts would have to actually prosecute and punish criminals

  48. @Ty I noticed on KRON4 news over the weekend that a senior citizen wellness check in San Rafael resulted in the arrest of the senior's roommate who had been driving to San Francisco to steal packages. The roommate failed to remove the address labels. Had the senior lived in SF, there likely would not have been an arrest. Crime is taken seriously in neighboring Marin and San Mateo counties. SFPD regularly asks SF crime victims for access to the data on their home security camera systems. Much is not helpful due to hoodies.

  49. @Ty what kind of people commit the crimes you describe? Poor people, drug addicts... instead of filling prisons with petty criminals, why not fix the societal ills that create criminality? Like poverty, addiction, and lack of healthcare?

  50. Please inform us all on how to fix those problems.

  51. Burglar. Two nights ago. Took two bikes and three tennis rackets, which were all at the landing on the first floor of my three story house by the staircase. My dog and I were asleep on the third floor. Thank God the burglar(s) was content with the stuff by the landing. Last night I had a nightmare. I thought the burglar might be back. My home has the same architecture as many late 19th/early 20th century homes in San Francisco. Richardsonian Romanesque, c.1899. I'm an honest person. I think this technology will protect people like me. The challenge is separating thieves and sexual predators from political dissidents. But we're not China. C'mon my fellow Americans! I don't think we expect all that much. Just don't steal things. I'd love it if my camera could alert me or raise an alarm every time a rapist, child molester, murderer, or thief is nearby. I'm sick of these people.

  52. @Mike you are willing to sell out your fellow citizens’ right to privacy for two bikes and three tennis rackets. What a cheap date!

  53. @Mike two words - "motion sensor"

  54. Facial recognition by police is the stuff of the Chinese Communist Party. I don't want it anywhere in this country for any reason.

  55. So if a child is kidnapped and there is video of the kidnapper, you do not want the police to use facial recognition technology to try to identify the criminal by searching previously arrested persons’ photos?

  56. @Casey Penk Are you worried about the Chinese Communist Party and not worried about the American Communist Party?

  57. There is a gap between the politics and policies of the wealthy elite, and the realities on the street in SF. My daughter lives in SF (the Castro) and I hear about the homeless, the drug needles in the playgrounds, package thieves, robberies, the mentally ill, and last week the attempted kidnapping of a 3 year old while walking with his mother in the middle of the day. I wonder about the tradeoff between liberal values versus security and quality of life for the average citizen. I'm all for security technology, if it gets me something in return.

  58. @Ben We lived in the Castro for 22 years and left. Guess why. Exactly for the reasons your daughter describes. SF is too concerned with the rights of deadbeats and drug addicts and not with overall quality of life of its (majority) law abiding and decent citizens - who pay the taxes and deserve a police force that cracks down on social outliers. The city needs to become a lot more serious about creating a liveable environment - by discouraging migration of deadbeats, finding ways to get the ones they have already off the streets and into treatment and jobs, and cracking down on disorderly behavior.

  59. @Ben the gap between the wealthy elite and the “average citizen” is widening and many of those who once thought they were “average” are now living homeless and drug-addicted as you describe. Surveilling and jailing isn’t the answer; it’s narrowing the gap between rich and poor. Hardship and poverty beget the kind of criminality you fear.

  60. The ban is premature, simple-minded and grand-standing, especially given that the police force does not use this technology. Unfortunately, those who back these measures, in particular the misguided ACLU, steadfastly refuse to acknowledge the profound civil rights costs of criminality as well as the substantial civil rights advantages to accurate police enforcement.

  61. @Shiroto The NSA, our National Security Agency, is already using this technology on whom? Regular American citizens and families who were protesting the family separation policy at our Southern border. The federal government hired a private corporation to do facial recognition surveillance by video, which then sold it to the NSA. There is no Freedom of Information Act for private corporations. Our taxpayer dollars are being spent on this. It is the beginning of techno-fascism.

  62. @Shiroto what a ridiculous claim. It sounds more like you have stock in this technology.

  63. @Shiroto How does protecting the civil rights of others infringe on your civil rights?

  64. In modern society, there are a lot of security cameras in our city. Of course, they are installed for monitoring us, and protecting us from crime. However, every day we are being watched by them. When I walk the streets of the city, sometimes I catch sight of them, and honestly, I'm not favorably impressed by that. I feel, "Where is my private space in the city?" and "I'm not a criminal." I understand these cameras' role, but at the same time, I feel fear and displeasure.

  65. @Takayuki - My apartment building has cameras along with FOB keys. My landlady knows every time I leave the building, which door I use, what I was wearing, what box I picked up in the mail area, who comes to visit, etc.

  66. This wouldn't be news if it didn't make me (and many others) jealous - jealous because we're afraid and realize we have zero protection, and our government isn't trying to help.

  67. Facial recognition and autonomous police drones. The near-future is scarier than most fiction.

  68. @Chris McClure so true. long term, the mature, fully baked AI will be running everything smoothly with or without us. but short term, this immature, scattershot, human-driven AI is just another tool in the oppressor's toolkit. very insidious.

  69. Perhaps we should look no further than China and its camera harassment of Muslims to understand why this technology has the potential for misuse. Add misidentification to that, and in today's America, you could get shot and killed by accident. I stand with San Francisco on this one.

  70. @Manuela Why not look to Britain, where CCTV has been used for crime prevention for many years? A much more relevant comparison.

  71. @Paulo CCTV is not facial recognition. And the British constitution is not as clear on prohibiting these kinds of abuses. And last I checked, Britain is not without crime or terrorism, so what are they getting in return for their freedom?

  72. An idiotic act of Luddism. Are they going to ban the use of computers next? Facial recognition technology is a tool, and, as any technology, it is not static but constantly changing and developing. There are legitimate privacy concerns about this technology, but the thing to do is to regulate, monitor, evaluate and adjust the use of the technology, not to ban it altogether.

  73. Facial recognition is useful for law enforcement in two ways. You can have a list of people you are actively seeking (e.g. people you have warrants out for) and scan crowds to locate those people. Or you can record every face so you have a record of people's activities over time which you can scan in the future when a crime is reported to see the history of possible suspects. The former case creates the danger of false positives. You might pull over or arrest or harass an innocent person who happens to resemble the suspect. Ironically, the more accurate the system, the more severe the likely mistreatment of the innocent person, because the police will believe the system. The latter case is more problematic, though. This is where the privacy issues arise, because you now have a record of the activities of innocent people under no suspicion. What are the safeguards against this data being used improperly? How long is the data retained? Who is it shared with? You can mitigate this to some extent by setting strict limits for how long it is retained, by requiring a warrant to search it, by subjecting all usage to audits. I doubt in the long term we will be able to ban all use of facial recognition but we should minimally be able to mitigate the privacy harm, and it's possible the best solution would be to ban all data retention so that only the first use case is permitted.
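
To make the first mode this comment describes concrete, here is a minimal, hypothetical sketch in Python of watchlist matching over face embeddings. The cosine-similarity scoring, the 0.6 threshold, and every name in it are assumptions made for illustration, not any agency's or vendor's actual system; it only shows why the threshold choice drives the false-positive rate the comment warns about.

```python
# Hypothetical sketch of the first use case above: matching a face seen in a
# video frame against a watchlist of people being actively sought. The
# embedding model, the threshold, and all names are illustrative assumptions.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity in [-1, 1] between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict,
                            threshold: float = 0.6) -> list:
    """Return (name, score) pairs at or above the threshold, best match first.

    Lowering the threshold catches more genuine matches but also produces
    more false positives -- the tradeoff the comment points to.
    """
    hits = [(name, cosine_similarity(face_embedding, known))
            for name, known in watchlist.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda pair: pair[1], reverse=True)
```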

  74. Face recognition is about as ready for primetime as flying cars. Try Stansted airport (London), where I and about 200 other passengers recently missed a connection because the face recognition system could not match passport pictures to the facial scans. It took the only available immigration officer about 2 hours to clear up the mess. Even if the technology were flawless, do we really want to live in a society where face recognition prevents us from boarding an aircraft because of an unpaid parking ticket... will never happen, you say... It happened to me in Amsterdam Airport (one of the most liberal places in the world) 2 months ago...

  75. This is the only logical and ethical step to take until laws regulating the use of an invasive surveillance technology with major first and fourth amendment implications can be debated and passed. Other cities should follow suit.

  76. The tech kings and queens are afraid their faces will be facially recognized. The same folks that restrict or prohibit their children from using social media or gaming. Possibly just fearful of lawsuits or bad press. I don't trust these folks as far as I could throw them, as my late mother always told me.

  77. The same way you cannot make a color scan of currency, their faces will never appear in any database.

  78. I work in Shenzhen, China and go back and forth to Hong Kong frequently. That means two immigration crossings in each direction. Normally, that would entail long delays waiting to talk to an immigration official, but I have signed up for "E-Channel" in both directions. That means I put my passport in the machine, it scans my face and records my fingerprint (thumb for China, forefinger for Hong Kong), and presto, I'm done and on my way. It's like E-Z Pass on the highway.

  79. The "lineup" and "6-pack photo lineup" are not perfect, nor are they very accurate. Remember, the recognition tech also observes the entire body in movement, not just the face, when you scan the footage and view it that way. Isn't this better than the old "lineup" techniques? All improvements that track down and apprehend the bad guy are necessary. SF is setting a bad example for the entire country, but this is nothing new.

  80. Your gait is also unique and vulnerable to video identification through software analysis. Everything we do is monitored. Everything we do is recorded. Companies like Palantir make the tools to tie all this together for an instant inspection of who you are and what you are doing. There’s no way to opt out and still participate in society. Next up we have an onslaught of fabricated media by way of ‘deep fakes’. There will be no way to trust anything you see or hear. This leaves you with the choice of disconnecting completely, or staying tuned in and living in a fog without a compass. Either way you are completely vulnerable to those who are in control of this power. I’m not saying we should all be hoarding guns and ammunition, but culturally, we should be prepared for mass revolt, because it’s clear this is a true crisis coming together in less than 10 years.

  81. I read comments below characterizing facial recognition technology as a tool for monitoring behavior, but one could also argue that the technology is foremost a tool for quickly finding the whereabouts of a dangerous criminal or a terrorist on the loose, a tool to keep society safe. I have spent my youth in a police state where gathering, protesting or even publishing a cartoon mocking a leader landed you in a prison cell; I would hardly call the USA a police state or an authoritarian state. Unfortunately Americans are truly confused (and a bit ungrateful) about notions of freedom and what it is to live in an open and safe society like the USA, where privacy is not paramount but it is not abused (who has been falsely accused of a crime by facial recognition tech and has rotted in a jail in the USA? Nobody! This is a country of law and order compared to police states). Please do go and live in a real police state (like Russia, China, Turkey, All Arab countries) in repressed, closed, authoritarian societies, where even ideas, expressions are considered dangerous, where privacy doesn't even exist.

  82. Why would the suppression of freedom in other places convince us to make this loss of freedom easier to swallow? We don't want constant personal monitoring of our citizens.

  83. @Chatelet "Please do go and live in a real police state (like Russia, China, Turkey, All Arab countries) in repressed, closed, authoritarian societies, where even ideas, expressions are considered dangerous, where privacy doesn't even exist." This is what we do not want in the USA. Do you not believe Trump would use this technology to monitor his enemies if he had it at his disposal?

  84. The Bay Area is the capital city of the internet. We should follow their lead.

  85. @Frank M on this, perhaps. Just maybe. On other items, not so much. Will you please name another city of less than one million disposing of a billion - yes, with a B - dollars PER MONTH that has this little to show for it!?

  86. Happy day for criminals and creeps...

  87. Somebody is watching everybody. If you have been doing the wrong things you should be watched. If you are not doing anything wrong you should not care if they watch. Get a grip. You really are not all that important. You just think you are.

  88. @Pogo This is such a short-sighted remark, it is hard to even know where to start. People who are unaware of how this technology is being used in other countries should take the time to educate themselves. Autocrats are using the technology to suppress freedom of speech and to identify people who question the government and their power.

  89. A tool to catch criminals is banned? Now that’s paranoia! It’s like NRA members afraid the government is going to take away all their guns. Or folks afraid of EZPass, thinking they’re going to get a speeding ticket based on the time between two toll booths, or that the government is spying on their travels. I wish no harm to anyone voting against a powerful tool to catch an unlawful person, but if they were a victim, perhaps their mind would change. How about anti-vaxxers? Paranoia is also a disease. No vaccine.

  90. @David J A powerful tool can sometimes be used for good and for bad. It's not the good uses that people are worried about. Look at how Autocrats in other countries are using this tool, and you may better understand SF's logic.

  91. @S B, when this is no longer America I’ll worry.

  92. @David J When it's no longer America, it will be too late. That is the issue.

  93. Good for San Francisco. Facial recognition technology is only reliable for white people. There was an article in NYT a month or so ago about how this technology is far from accurate with respect to darker-skinned people. It is very troubling that this technology is in broad use in law enforcement. An innocent black person could find himself or herself caught up in a case of mistaken identity and unable to get the police to listen or at least verify whether the correct person has been apprehended.

  94. San Francisco's intelligent ban to preserve our democracy and civil rights is necessary to prevent the techno-fascism toward which the world is currently trending. In addition, cash is absolutely necessary for democracy.

  95. Kudos to SanFran, but what about a facial recognition server in San Jose? Can you send a SanFran video feed there and recognize all you want?

  96. Those people in California sure are smart. They just choose not to catch criminals. Boy, I feel terrible I don’t live in such a smart, educated state.

  97. @Shamrock They really are smart. They have undoubtedly researched how other countries have used this technology to suppress freedom of speech and criticism of government and politicians. The fool takes a superficial look at the issue, makes a snap judgement, and believes those in power when they say the technology would only be used for the good of the people.

  98. @Shamrock the tech people KNOW the downside - that is why many don't let their kids use screens and phones etc. until high school!!

  99. Except that the crime rate had been dropping for years before this news.

  100. San Francisco bans face recognition, but the Chinese don’t ban it.

  101. Welcome news from San Francisco. The ability to be anonymous in public spaces is an important value that needs to be preserved. This is a democracy. I read about a company that uses facial recognition software to identify protesters and then sells the information. That use certainly should be outlawed. It is also important to have our homes as sanctuaries from the outside world. We should think about what it means to be human and try to preserve human qualities as attempts by companies to increase surveillance to obtain data continues at a rapid pace. Banning facial recognition software is a good place to start in the fight against Silicon Valley to preserve what makes us human.

  102. In ancient societies everyone knew—could recognize—everyone else. No anonymity, yet the people were human. Being anonymous doesn’t make a person human.

  103. Everyone seems focused on the use of facial recognition technology for surveillance, with no understanding of how state and local police actually use such technology for police work. By far the most common use is for identifying unknown suspects in a crime that has already happened. For example, someone robs a convenience store and the suspect’s face is captured by the security camera. Investigators will use that image to search databases of previously arrested or convicted people to see if there is a possible match. They will use multiple factors, not just the photo, to identify possible suspects. Once potential suspects are identified, they will follow through on those leads the same way they would with any investigative lead, with the same burden of proof. It is not even remotely any form of big brother surveillance. San Francisco is hurting its own citizens, especially poor people who are more likely to be victims of crime, by passing this law.
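
As a rough illustration of the investigative workflow this comment describes, here is a minimal sketch in Python. It assumes precomputed face embeddings and a cosine-similarity score; all names and parameters are hypothetical and do not reflect any real police system. The point is that such a search returns ranked candidate leads, not an identification.

```python
# Hypothetical sketch: a probe embedding from a security-camera image is
# compared against a database of booking-photo embeddings and yields ranked
# *candidates*, not an identification. All names and scoring are assumptions
# made for illustration only.
from dataclasses import dataclass
import numpy as np


@dataclass
class Lead:
    person_id: str
    score: float  # similarity in [-1, 1]; higher means a closer match


def _similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def rank_leads(probe: np.ndarray, booking_db: dict, top_k: int = 10) -> list:
    """Return the top_k most similar booking photos as investigative leads.

    A lead is only a starting point; corroborating evidence, gathered under
    the usual burden of proof, is still required before any arrest.
    """
    leads = [Lead(pid, _similarity(probe, emb)) for pid, emb in booking_db.items()]
    leads.sort(key=lambda lead: lead.score, reverse=True)
    return leads[:top_k]
```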

  104. @Nick Coult Thank you for this. I was about to write a comment asking why this is so “dangerous”. One has no expectation of privacy in public spaces. We already have photo ID in our drivers' licenses and passports. It seems to me that facial recognition technology is merely making the process of identifying people more efficient. And as you point out, it is only a part of the investigative and evidence-gathering process. If I am out and about, shopping or driving or relaxing in a park or even at work, anyone can see me and potentially identify me whether I want them to or not.

  105. Thank you San Francisco for taking a stand for privacy.

  106. @NYT: I think this article needs a few words about the reasoning behind the Microsoft CEO’s and the Microsoft researcher’s conclusions that this technology is not compatible with democracy. You have given the example of the Chinese government’s actions against Muslim Uighurs - is that the kind of use these critics cited? More explanation on this point would be helpful.

  107. Thankfully the people in the city at the heart of the Democratic People's Republic of California know and recognize the nature of the modern Prometheus aka Dr. Frankenstein aka Dr. Jekyll aka Big Brother aka the new gilded age robber baron malefactors of great wealth aka Silicon Valley.

  108. These systems "improve" as they are trained with more data. Data in this case being more images of faces; hence there is an incentive to record and categorize as many faces as can be acquired, and training sets are frequently shared between entities. Social media companies will "helpfully organize" your personal photos stored in their cloud so you can find images you want easily. This is done using facial recognition software; hence the "unintended consequence" of assembling a massive database of billions of users is underway.

  109. A suggestion for those disappointed by SF’s new legislation. Just write your name and SSN on your forehead when you visit the city.

  110. Strange backlash over surveillance when, on the flip side, they're taking 400 selfies a day and posting them to social media. Just saying.

  111. The ruling, combative, paradigms of Democracy, Republics, Communism, Socialism and all the other "isms" have a new member to add to their ranks. It's the latest, and most powerful, in a long line of tools used by the ruling classes in their never ending attempts at maintaining control and hold on their power. The ascension of the Surveillance State in association with the Corporate managerial mind. Welcome the new Boss; same as the old Boss, only this one is infinitely more intrusive than anything that has come before. It will be interesting to see how the masses push back against it. You know they will. And push seems to be coming to shove if this SF "shot across the bow" is indicative of where things are going. The struggle for power and control in its latest incarnation; it never ends does it? So it goes. John~ American Net'Zen

  112. This technology is not so much a bad thing in itself; rather, the use of the devices at random to direct people's attention is what is at issue. I agree with Marc Rotenberg that it is an invasive technique which should be sidelined, like random calls to a mobile phone.

  113. As a former San Franciscan from the 1970s, I applaud the Board of Supervisors for its forward-looking thinking. In the unlikely event that facial recognition is necessary, modifying the existing security cameras will be easily accomplished. Now that The Times has published the new legislation, what percentage of Americans will line up on each side of another divisive issue? A convincing argument can be made for dividing America into at least two countries to give the disagreeing citizens a choice of where to live!

  114. We really need federal legislation that comprehensively addresses privacy. Our web activity is tracked and sold, cell phone locations are tracked and sold, license plate recognition tracks the location of our cars, and now facial recognition can be used to publicize where we go. Shouldn’t we be able to take our phone and go for a walk or drive and not have where we went or what we did be used in a nefarious way? If all the politicians that are having affairs understood this, I am sure they would act immediately.

  115. I am not afraid of the police using facial recognition to find bad guys who may be planning to hurt my family or me because I’m not a bad guy. Should the technology at some time in the future overstep our rights THEN it could be legally restricted. But until then I’m in favor of making it more and more difficult to be a criminal.

  116. So after the horse is out of the barn?

  117. Nice try, but the Luddites lost that battle two hundred years ago. There’s no going back; not to the garden, or to an era where privacy was a thing. They paved paradise and put up a smart parking lot.

  118. SFPD enforcing the laws? What laws? You kidding?

  119. The US never learns from history and hence it is destined to repeat it. The pot haze hanging over the City by the Bay makes the residents conveniently forget 9/11, San Bernardino, the Boston Marathon, Fort Hood, Orlando, Columbus, and countless others. Face recognition is by no means a mature technology, but it is a hopeful tool to thwart terrorism, especially in a city that coddles the many misfit and misaligned masses. Let us hope that the few sane San Franciscans move out of the city soon and will not suffer the consequences of foolish decisions such as this.

  120. @Appu Nair Amen, brother. Of course, FR may be too sophisticated a tool for those in San Francisco government. After all, they can't figure out how to put a stop to the widespread defecation on The City's streets.

  121. @Appu Nair A better response would be sensible gun control—age limits, holding period and one gun per adult old enough to buy alcohol and no automatics or clips by citizens OR police. Also, better language skills in US ‘intelligence agencies.’

  122. This will eventually be overturned at some level by a tech-corporate-funded bill introduced by a political shill, on the same basis as “Citizens United”. Now that they have invested so much money into FR, to see it banned in SF, with other cities pending, would put a large hole in their wallet. Not to worry, big money always overcomes democracy.

  123. Biometrics provide transaction efficiency whether helping locate a lost child or elderly person, as well as catch a rapist. Such use contributes to a safer and more hospitable society. It is a bit hypocritical to proclaim "sanctuary" city status while banning technology that helps provide safety to those who need it most, and contributes to catching those who commit violent crimes against citizens who rely on City Services to protect them.

  124. @ Mike S. and Trump is president. What could go wrong?

  125. @Mike_S Not against the tech, but I'm not sure you quite grasp the meaning of a sanctuary city.

  126. @Indy My grasp of a "sanctuary" is a refuge, a place of peace, where families feel safe. My grasp of a City is a place where people gather to interact and live in close proximity to each other. Tools like automated biometrics are wholly consistent with making communities safer, more efficient, and more hospitable. If you don't trust the Leaders you elect to make positive use of these tools in the Public's Interest, address the actual problem by electing some Leaders you can trust. It's not like these silicon and tin contraptions are going to jump off a pole and attack somebody.

  127. They are kidding. This is like banning abortions. It doesn't mean it isn't going to happen. The users will just keep it hush-hush, claiming some other factor resulted in the arrest (fingerprints, corneal scan, DNA). Waste of legislative energy.

  128. Interesting that it bans government use, but not private use. My guess is that Facebook will still be collecting facial recognition data on all of us.

  129. "But critics said that rather than focusing on bans, the city should find ways to craft regulations that acknowledge the usefulness of face recognition." The critics have a point. But until such regulations are crafted, we have the choice of either banning facial recognition or letting the state exploit this technology however it sees fit. I'll go with Door B, Monty.

  130. It's a competing-harms dilemma. Current power holders in China might argue that targeting small groups in order to create a more perfected state serves a greater good. There's both real and perceived harm to debate. Using facial recognition to locate individuals who are known to have done harm, or who are known to be about to do harm, may serve a positive purpose. It's when we begin to indulge in predicting likely harmful behavior based on AI interpretations of, say, facial expression, body language, etc., that we may get into trouble.

  131. @me AI gone bad represents public safety in danger, actually.

  132. The smarter move would have been a moratorium while regulations are developed. Photographs were a huge technological leap and, even though they are an obvious threat to privacy, today they are widely accepted and necessary, to put it mildly. Fundamentally, facial recognition technology is an advancement of photography. Like most tech advances, it cannot be stopped, only controlled.

  133. The Norwegian crime fiction author Jo Nesbø has a character, Beate Lønn, who has an eidetic memory which allows her to remember faces. She can essentially do face recognition. Would we object to the police employing a team of people with eidetic memories? How about training officers to develop such capacities? I don't think there would be any objection. How often has a generic "young black male is a person of interest" led to tragic results because too many police can't or won't distinguish?

  134. The recent NYT piece in which a passer-by was identified based on readily available FR software and web photos was an eye-opener. Most tools which nowadays make our lives what they are, such as the telephone or Alexa, can easily be used by 'the government' for nefarious purposes but we have decided that these things' usefulness (I'm not sure about Alexa) outweighs their potential for misuse. We have accepted Hawkeye in tennis to call the lines because it is allegedly superior to human judges; and whatever defects there are currently in FR technology, such as its apparent shortcomings with identifying African-American faces, will surely be overcome as it develops. This genie left the bottle long ago.

  135. I believe they said the same things about atomic energy. My grandma said there were three sides to every story: two sides and then the real side. Only time will tell.

  136. Ironic the city that has benefited so much from tech shuns it. As an aid to solving crimes no less.

  137. I think a police department with eidetic memory for faces would be a better police department. Let's make memory training mandatory for police officers.

  138. Yes, because the SFPD has lots of down time.

  139. Many citizens will cheer this move until the time comes when a perpetrator gets away with a crime where, with facial recognition technology, they would have been identified and caught. This is a more likely scenario than the use of that technology for societal control and harassment in SF or anywhere else in our country.

  140. The USA has a violent crime rate per capita 6 times higher and an incarceration rate 6 times higher than Europe's. The worst police and justice system. The last industrialized nation not to have yet abolished the death penalty, and it is far from doing so.

  141. I don’t get it. Technology can’t be used to prevent crime or detain criminals? Are they serious? Do these civil liberty groups have nothing better to do? We need more Republicans in power in 2020.

  142. @Wendel Your first observation was correct. You do not understand this issue. The quality of a society rests upon the relationship between the individual and the collective or state. Due to the creep of technology, individual rights are being degraded by technology like this, which can easily be abused.

  143. @Ronn Criminals and Terrorists are counting on utopian opinions such as yours.

  144. @Ronn The quality of a society also rests on freedom from crime and the fear of crime.

  145. Not really surprising in a city that won’t investigate car theft.

  146. Swimming upstream as the river is rising, San Francisco is wasting its time and energy. If the technology exists, it will be used - this has been an inevitable reality of human behaviour since we learned to control fire and create stone tools. Banning facial recognition technology is a futile “feel good” gesture that will seem foolish as its use becomes standard practice in every other private and public sphere. There is no reasonable expectation of privacy in any public setting - the power of digital technology doesn’t and shouldn’t change that simple fact. Facial recognition is little different from reviewing pages of mug shots or hours of surveillance camera footage - just faster and more efficient.

  147. I agree that the tech will be used widely one day. But I don’t think that day should be today. As several tech leaders have warned, it's too premature. There are problems with it recognizing the faces of people of color; a report by NPR said it correctly identified people of color as little as 30% of the time. Imagine not being able to board your flight, or being arrested for a crime you didn’t commit, because the software couldn’t ID you or misidentified you as someone else. I think tech is evolving rapidly but we need to pause and ask serious questions about ethics.

  148. I'm sure the states of Alabama, Georgia, and others' law enforcement would get great use out of this technology to track doctors and women to enforce abortion laws. Certainly no bias would be involved. Welcome to 1984.

  149. Liberals run amuck. Let's limit the effectiveness of law enforcement and then in the same breath question the level of crime in the streets. Our European neighbors use CCTV for crime prevention by monitoring neighborhoods, yet we find it too invasive an approach. If you want to clean up the problems and permanently fix a neighborhood, it's an easy solution. It's a high-tech version of putting a cop on the corner. If you own a cell phone, drive a car or purchase a metro card, your movement is already known. This is no different.

  150. @George Facial recognition could help with the fare dodging said to cost the MTA a significant amount in lost revenue.

  151. @George "Run amok" is the way to write that.

  152. @George I fear street crime far less than a president who is constantly promoting political violence against minorities, protesters, political opponents, and the press. If Trump gets his way, he will be using facial recognition to track anyone that disagrees with him so they can be arrested and re-educated, like they do in China. Trump asked why American citizens don't "sit in attention for him," with "fervor," like the North Koreans do for Kim. The reason is that Kim uses mass surveillance and political violence to force these behaviors from his subjects. Trump "fell in love" with Kim and the head of Russian intelligence, who also uses his surveillance state to oppress his subjects. And yes, ours is obviously jealous when he talks about Xi being "president for life" in China. Read the history of Operation Condor if you think government surveillance of citizens will keep you safe.

  153. Why should the CIA or some intelligence authority care what laws are passed in San Francisco?

  154. San Francisco bans technology used to fight crime.

  155. Very alarmist article. The issue is surveillance cameras, not searching what's produced by them. AI, or more accurately machine learning, does in minutes what officers and detectives would otherwise take hours of mind-numbing video viewing to do. If you don't want to find the moments where a suspect changes the story behind his alibi, if you don't want to easily transcribe the audio and produce subtitles to make an interview easy for a jury to comprehend, if you don't want to quickly see who dropped the backpack with a bomb in it, don't use AI. Let officers suck up hours of their time trying to do what AI can do in minutes: find the needle in a haystack.

  156. Considering that SF has the lowest successful prosecution rate for murders of any major US city you might expect better but there is no money in catching killers. On the bright side the traffic cameras are back in force.

  157. "Warning that African-Americans, women and others could easily be incorrectly identified as suspects" Strange description: aren't African-Americans, women and others most of society?

  158. No big brother. Maybe it could be used per court order on a case-by-case basis, say, in searching for fugitives or kidnap victims.

  159. San Francisco: playground of the criminal kind and no one will recognize them.

  160. @Charlotte Thank you for posting this. What's happening in China is terrifying and anyone who thinks this technology isn't going to be exploited for political purposes here in this country is hopelessly naive.

  161. These technologies of identity and control produce malleable data and are used against the people. You might feel all cozy and comfortable while they put others in cages, until you or your kids are caught in a data harvest. Then you realize, too late, that you are a bug about to be crushed by the machine you praised.

  162. This was an important crime-fighting tool. There is no reasonable expectation of "privacy" when you are out in public. This is another outbreak of political correctness. I used to think San Francisco was a cool place to live. It is now in a neck-and-neck race with Berkeley to become the laughing stock of America.

  163. @styleman I used to think San Francisco was a cool place to live until Silicon Valley imposed their suburban values on us natives and turned SF into their bedroom community. Why don’t you encourage San Jose to use facial recognition technology and then just stay there please?

  164. Good. Um, has anyone noticed the whole techno-control state dystopia the Chinese are putting together? Mass surveillance coupled with facial recognition is a key part. If you think it can't happen here, you are kidding yourself. I'm all for law enforcement but those willing to sacrifice liberty for security deserve neither.

  165. @Sci guy Right, because China was a total civil rights utopia before! Without all this nasty technology, we had DJ Mao and the Tiananmen Square day party. Everything was so fun and free! But now because of evil technology, China has suddenly become a totalitarian state...

  166. @Sci guy This fact about the Chinese system is in direct contrast to my comment on the generally innocuous British surveillance system. The Chinese system in the hands of an authoritarian regime is a nightmare come true. I’ve also been to China. I saw cameras everywhere, like I did in Britain, but the feeling was 180 degrees the opposite of what I felt seeing them in Britain. To dispel the very real chill, I simply smiled directly at them, even giving a tiny wave to the one seemingly monitoring the pores on my face as I bought a subway ticket from a machine in Shanghai. And I clutched my American passport a little closer to my heart.

  167. @Sci guy good job pointing out the state-of-the-art in population control via technology. Freedom of thought and democracy cannot co-exist with paranoia that will grow as abuses and technological intrusion and spying improve. We are already self-censoring online like crazy and you are probably crazy if you do not. Now add to that self censoring your movements and associations for fear of being identified. Then add to the problem targeted mis-identification and falsification of image records made possible by DeepFake AI tech and where false accusations are backed up by video and AI "evidence" that can be created by any enemy. We can see the contours of a nightmare dystopia forming as we speak which is why SF needs to take the lead and set the example for FREEDOM.

  168. Here in SF my local grocery store keeps all the liquor on camera and behind lock and key, and it can only be handled by an employee who takes it directly to a cashier. A uniformed policeman with a patrol car stands at the entrance to the store most of the time. These face recognition cameras would be a tremendous help in preventing the loss of liquor to thieves and making better use of our limited police presence. It really is up to our business community to bring some common-sense best practices and pressure the crazy board of supervisors and our lame mayor.

  169. I lived in SF in the early 90's and thoroughly enjoyed the balance of left and right -- now it's just left. Sanctuary policies, high personal crime, robberies and car thefts, along with an exploding homeless issue, turned SF into a city I avoid. It's difficult to shop, dine, or simply walk downtown in peace. Now they turn their backs on the hand that has fed them for the last 20 years: tech. If any city needed new technology to identify criminals and reduce its crime rate, it's San Francisco.

  170. @fourteenwest you have a point, but the threat of state-controlled surveillance on this scale is too high and incompatible with freedom. however: personal and corporate tech is fair game. let your imagination run with that for a bit.

  171. This seems like a plausible tool to help w/ authorities on so many levels to identify criminals. I would have thought a nay vote would have come from Southern California where plastic surgery runs rampant.

  172. Facial recognition is used by the NYPD. The use of it does not give probable cause to make an arrest. When someone has been identified through its use, a positive identification can only be made by the victim and/or witnesses, and the procedure for that identification has enough safeties built into it that it almost never results in a false ID. Nationwide, in big liberally-run cities, there has been a push for decriminalizing certain misdemeanors, and giving murderers parole, particularly those who murdered police officers. This is just another example of defining deviancy down, as Sen. Moynihan called it.

  173. "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." The above is the 4th Amendment. It applies to facial recognition software used in public spaces by the State.

  174. @The Falcon No, it doesn't apply at all.

  175. @Jackson : I have to disagree. I find it unreasonable to be subject to my face being recorded and checked against a database if I simply want to take a walk in public. There is no valid reason for that search.

  176. “Police, and other Agencies”. How about private companies? Seems to me any private company could simply bypass that law and just provide the results to the agency next door. If a city wanted to ban Facebook in their community, wanna bet a lawsuit wouldn’t follow? If cities and the ACLU are concerned about abuses of privacy, what about Facebook, Google, and Amazon, who sell private user information as a business model? They are listening to and tracking our every step. Bet San Francisco wants to stop that?