Nov 09, 2019 · 83 comments
Common Sense (Brooklyn, NY)
That tech companies have the ability to identify and track images and videos of child abuse - sexual and otherwise - yet do little or nothing about it is absolutely enraging. We need to totally re-think how we treat these tech companies in line with the brave new world we live in, if we can really consider it that. Social media platforms such as Facebook and cloud storage firms such as Amazon, Google and Microsoft are providing their services as corporate entities under various public laws. As such, they are akin to agents of the state and thus should have an obligation to track and turn in all IP addresses that have searched, used, downloaded or passed on such vile material. Those users should then be prosecuted to the fullest extent of the law, including having all internet access denied to them. Further, countries that don't adhere to US standards regarding policing child abuse (think Russia and China at least) should be shut out from having access to the US's internet system other than through secure servers managed by the government. Would this be considered an invasion of privacy and an undermining of individual rights? Some would probably say yes. But when your actions are so antithetical to community standards, you have forfeited those rights. If there were harsh laws requiring these multi-billion dollar tech companies to follow such standards or face huge penalties, even bankruptcy, just see how quickly they would clean up their act.
Cleota (New York, NY)
Let me get this straight: Google, Facebook, Amazon, Yahoo, AOL, Microsoft and the other tech companies spend millions of dollars spying on us, collecting data on us to sell to advertisers, and then they cite "privacy concerns" when it comes to taking down sexual abuse of children? The tech moguls are complicit in children's abuse, and little better than the poor excuses for humans who actually perform and disseminate it.
Software Programmer (New England)
Nothing in this story surprises me. The internet is broken. And the wunderkinds in Silicon Valley are in deep denial regarding the monster they have created. All people have the capacity for evil. The Christians call this "original sin". Within their paradigm, the internet is a tireless devil. Using a more secular accounting, the internet unleashes the power of the id while eclipsing the superego. I have withdrawn from the internet for reasons that (thankfully) have nothing to do with the horrors this article describes. Today's internet doesn't provide enough friction: time for thoughtful reflection. Its addictive qualities are designed to foster our innate tendencies toward narcissism and impulsivity. And, yes, the horrible sexual urges that too many people have. The internet has allowed us to believe too often that the norms of social restraint no longer apply. While I worry greatly about the risks to personal privacy posed by today's internet, this article points up the exact opposite: the privacy facilitated by today's tech giants enables evil people to do unspeakable things. I have come (sadly) to believe that there is no way to balance our right to personal privacy in our connected lives with the need to protect people from the horrific abuse that the internet increasingly facilitates. The only real solution I have found is to disengage. This is what the culture of "move fast and break things" has created.
Tom Paine (Los Angeles)
A head-in-the-sand approach never fixed anything. Paraphrasing Edmund Burke, "The only thing necessary for the triumph of evil is for good people to do nothing...." in the face of a clear and present threat to our nation, our children and our world. What is required is the heroism of public citizens. As the FCC once appropriately regulated our media with rules such as the fairness doctrine and limits on the concentration of "news" and broadcast stations and newspapers, so must these behemoths be regulated and broken up, and anti-trust brought to bear upon a US that is now a plutocratic fiction of oligarchy, mirroring many of the worst characteristics of fascism. At times our nation has actually been a beacon of hope and moral leadership, as we were (in part) when we defeated the fascists in Europe alongside Britain, Canada, France and the Polish people among others. We showed a moment of great possibility when we defeated the practice of slavery during the Civil War. If we've come to this Earth to go gently into the good night of the barbaric side of human nature, then why have we come here at all? Since people like Jesus, Mataji, Krishna and Mohamed have shown the way of peace, and moral authority born of universal Love, there is hope for this species, and it begins with each one of us. What greatness is possible that we, through our courage and persistence, might create through actions and a willingness to fight the good fight for a better world?
Earth Citizen (Earth)
Along with these monstrous abuses, the White Pages and numerous "people search" websites publicly publish names, residential addresses, relatives and friends without their permission! I wonder how many women have been murdered by the abusers they fled as a result. Unbelievable! Pre-internet, a person could pay a bit extra for an unlisted number and be safe. Now, no matter where you live, your personal information is splattered all over these search websites. Where is the regulation?????
RT (WA)
One has to wonder at the resistance by the major tech companies to doing their own policing. What darkness is within these organizations that convinces them that obvious sexual exploitation "did not meet their threshold for removal"? Since this pattern shows up at three major tech businesses, one must question whether there is some kind of money involved in resisting and pretending innocence in the face of blatant abuse. It's worth some deep journalism and/or FBI investigations.
Phyliss Dalmatian (Wichita, Kansas)
Yet another reason this Boomer is a proud semi-Luddite. These “Tech Companies” could shut this completely down tomorrow. Why don’t they? MONEY. Shame on them, except they are truly shameless. Masters of the Internet, minions of Satan. And I’m an atheist.
Anon (New York NY)
This is all I can think about every time that Zuckerberg opens his mouth. There are tech products mentioned in this article that I cannot do my job without -- Dropbox, mainly -- but I'm glad that I've quit all the others. Every time you go onto Facebook, you need to remember that you're funding this behavior. You're paying -- by letting them track you around the internet and sell you ads -- for this abuse. In another article, the Times reported that when Facebook gets a subpoena about these videos, they respond criminally slowly. They don't have the will to stop participating even when the government DOES provide them with evidence.
Barbara Tetenbaum (Portland, OR)
I just started reading this important article by Michael Keller and Gabriel Dance but was stopped by the second paragraph's claim that "10 years ago their father did the unthinkable: post explicit photos and videos....". Wouldn't the "unthinkable act" be the drugging and rape of his own daughters?
Jonathan Katz (St. Louis)
The method described only identifies pictures that reproduce or resemble pictures previously catalogued as pornographic. If AI were as good as its promoters claim, it could distinguish a new pornographic image from an innocent image. Can it? Is there any prospect of that? There apparently is a big training set.
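The distinction this commenter draws — matching previously catalogued pictures versus recognizing new ones — comes down to how image hashing works. The Python sketch below is an illustrative stand-in (an "average hash" on a tiny 8x8 grayscale grid), not the proprietary algorithm of a real system like PhotoDNA; it shows why a hash list can flag near-duplicates of known images but says nothing about a genuinely new image.

```python
# Minimal perceptual-hashing sketch. Real systems first downscale a full
# image to a small grayscale grid; here we take the grid directly.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grid of brightness values (0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each bit records whether a pixel is brighter than the mean.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means a near-duplicate."""
    return bin(h1 ^ h2).count("1")

def is_known(h, known_hashes, threshold=5):
    """Flag only images close to one already catalogued in the database."""
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

A slightly altered copy of a catalogued image lands within a few bits of the original hash and is flagged; an unrelated image lands far away and passes, which is exactly the limitation the comment points out — detecting *new* abusive images would require a trained classifier, not a hash list.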
Orphia (Australia)
It's disturbing how phone predictive text gives you words you would never use. E.g., typing "I saw a nice"... and predictive text suggests "blonde" or "brunette". I'd rather not see what it suggests if adult content is on the device.
MM (UK)
'One image captured the midsections of two children, believed to be under 12, forced into explicit acts with each other. It is part of a known series of photos showing the children being sexually exploited. The Canadian center asked Google to take down the image in August last year, but Google said it did not meet its threshold for removal, the documents show. The analysts pressed for nine months until Google relented.' Unbelievable and sickening. Great reporting from the NYT.
RJamison (New York)
I recently watched the PBS Frontline episode on AI. Major advancements in facial recognition with AI were very scary. However, it strikes me now that AI would be a powerful tool in combating the scourge of child exploitation.
Celeste (CT)
These are the types of things our Government is supposed to be doing something about...standing up to huge Tech companies to protect the population... And yet, they do nothing. I truly believe it is the end of the U.S. as we know it. Greed and partisanship seem to be the only things going on in Washington.
Z (Massachusetts)
This is why the NYT - and great journalism - is so desperately needed. Thank you for a thought-provoking and infuriating piece. The Dropboxes, Microsofts, Googles, Zooms of the world need to be held accountable.
Therese Stellato (Crest Hill IL)
Hackers should step up to destroy images as soon as they are produced. There are people who know how to combat this. Step up and help. To the victims: my heart breaks for you. We all have to rise up to protect the innocent. One thing I learned is most abusers don't get caught in the act. Report what you suspect and don't wait and wait until you see it happen. Believe the kids, ask the kids, give them time to talk. They don't have the words to describe it.
Mark Thomason (Clawson, MI)
I had a client, who as a young girl had been abused sexually. She made me see a different aspect of this problem. Some of how we treat the problem and its victims actually makes things worse for the victims. Our relentless focus on the wrong done to them can make it difficult for them to escape that, just to be normal again. Her family had come to me to sue, to pursue the guy even in prison. He certainly deserved it, if anyone ever has. Yet the girl begged me to let it go. She was in tears, that all the "help" was well meant, but was only making it worse. So I honor the effort depicted here to remove pictures from their forever existence on the internet. Yet at the same time, I caution, our real focus must include the victims. They need more than an avenger. They need love and acceptance and understanding. They need to be something more than a victim. Don't let our righteous efforts obstruct the love they also need. Be aware.
Alec Dacyczyn (Maine)
If the bad guys can find it, then so can the police. If tech companies try to detect and block the open exchange of this stuff, then the bad guys will know that they need to change their methods (employ encryption). Then the detection mechanisms won't work anymore. And the police won't be able to detect it either. So the solution isn't to stop the exchange of this material at a technical level. It's to find the bad guys and put a stop to them. Don't encourage them to get smarter.
R. Spencer (New York)
I’ve re-read this series and I noticed that there’s no real mention of Section 230 of the Communications Decency Act of 1996. This legislation basically absolves ISP’s from the responsibility of policing their products for things like these types of images. What it says, in a nutshell, is that unless a specific image is reported to the ISP they don’t have to do anything about it. Sure, they may know that they have this problem on their system but unless someone points out a specific instance of it they don’t have to take action. And if they are notified about an image they only have to remove that image. They can ignore the rest until someone reports them - if they ever do. This is probably the reason why Google, and the other ISP’s mentioned, are slow to act. They know they can safely hide behind Section 230. There ARE hefty fines in place for not acting to take down the reported images but I never heard of the fines being enforced. I think that until Section 230 is re-examined, and amended in some way, this problem will only continue to get worse.
Pat (NYC)
Seriously, Elizabeth Warren is right that these companies need much more oversight. Let's start with big fines (million-dollar-type fines) for every occurrence. Pretty soon Facebook would be a decent platform where people share pics of their cats and vacation.
Cristina (USA)
Thank YOU NYT! Please continue to talk about it, over and over again. It makes our everyday work more powerful; your advocacy together with ours will help take these criminals down. It's not only about the pedophiles; the tech companies are bystanders... they are letting this happen over and over again.
Northpamet (Sarasota, FL)
This is a heartbreaking story -- but so many people today are putting THEMSELVES in this situation by sending nude and explicit pictures of themselves around. Who knows where those images will end up? These women were victimized. So many people who do sexting, etc., are victimizing themselves. You might find it fun today. But can you guarantee how you will feel in 20 years? Who knows when you might be up for a judgeship? Would you want your grandchildren looking at nude pictures of you in 40 years' time? Your business associates -- next week? On the Internet, you have no control over anything. Period. Assume everybody will see everything. The people in this article have no choice. You do. Never have a picture taken you would not want in the newspaper. Just don't. Don't.
Alisa A (Queens, NY)
Why can't the sites install detection upstream from any encryption and automatically delete any flagged material -- all without any human being ever reviewing it? This should allay privacy concerns. If some material is deleted unnecessarily, that's a small price to pay for cleaning up this disaster.
blub (somewhere unreal)
Just another transparent try to delegitimize strong encryption in the hands of the public. Don't let your right to privacy be taken away from you.
Greg (47348)
No brainer. Government owns the children if you claim them as a deduction on your taxes. If you do not claim your children as a dependent on the 1040(A), the Government has no legal rights to take ownership of your kids.
et.al.nyc (great neck new york)
If the Equal Rights Amendment were passed, and women were finally given full constitutional rights, would someone then be able to take legal action against all forms of public media for criminal behavior? (Internet platforms, movies, cell phone carriers, movie studios, and book publishers?) What of the rights of children after birth, too? Should there also be an amendment for equal rights for all living individuals? The concept of a child as human has changed vastly over the past two hundred years. Still, in New York State, a child may be referred to as "infant" in court documents until 18 years of age. What should a Child's Right to Life Movement be like? Who will lead? The Clergy? The Women's March? Is it enough for adults to "protect" children under puny existing laws, or should living children also have equal rights to life and liberty under the law? No one can perform a criminal act against a multinational corporation, so why against a child? Would an "equal rights for children" law then allow us, as a civil society, to hold all forms of media (and yes, even movies) accountable for horrific criminal behavior and its consequences? Could these children then sue our cell phone carriers for allowing illegal access to those images? If those tragic and horribly abused kids had some financial recourse, can we imagine how fast ISPs and cell phone carriers might become proactive? Would we then need to wait for our dysfunctional Congress or ineffective POTUS to act?
Me (wherever)
Such aspects of the internet in which abuse has taken place have allowed for convenience, commerce and profit for the tech companies when used as intended, but the harm that has been and will be caused far outweighs the benefits. These aspects of the internet should be shut down until, if ever, the companies are able to properly police them. Businesses and individuals got along fine in the past without them and can again. It is a small price to pay.
JSBNoWI (Up The North)
This is one of the most disturbing—and frustrating—stories I’ve ever read.
AR (San Francisco)
These are horrific crimes, and they reflect the perverse nature of this social system, which generates myriad forms of sociopathic and violent behaviors arising from the cruelty inherent to capitalism. The idea that a system that massively profits from the child labor of tens of millions, child hyper-sexualization, misogyny, violence, and cruel indifference is actually interested in ending child abuse is dubious. More likely the feigned concern for victims serves as a stalking horse to encroach on democratic rights. This is a call for all our private photos and videos to be scanned by companies or the government. It may sound reasonable. However, would we agree for the government or private delivery services to open all our mail, just in case we might be sending horrific images? Or would we agree to have all our homes searched? We should recall the many cases of 'child pornography' charges brought against parents for taking photos of their small children, including when they were bathing, and then going to printing services. This is a dangerous power to cede. And when the government demands to censor images or messages it considers subversive, such as protests against police brutality, against another cruel war, for immigrant rights, or for abortion rights, is that really so far off? What of images of naked Vietnamese children, victims of US napalming, that have been banned under the pretext of child pornography? No, blanket spying is extremely dangerous, including to the victims of crimes.
Michael Bain (Glorieta, New Mexico)
Like anything else in human relations, one human will thumb-down the other in inventive ways no matter the "Technological Innovation". Social Media, for any real benefit it brings to anything, gives oppressors, predators, and enterprise owners a more amplified way to benefit from their pursuits. Social Media is just a more powerful platform for human abuse. At the end of the day, this is just about abusing humans for money, nothing new or innovative about that. We just, now, allow these human abuse innovators to become grotesquely wealthy off their "innovation" and "disruption". You tech people are truly sick. You innovate off the backs of others, just like our forefathers. You are only sickly clever, that is about it. MB
Tom Paine (Los Angeles)
This is why Zuckerberg and others who are willing to allow political campaigns to post blatant lies packaged to look like truth make me sick. They want to continue to steal your information and sell it for megabucks, all the while abdicating any responsibility as the publishers they are. Hiding behind the fallacy of "we are just a platform." I'm looking for a means to boycott these companies' advertisers and big users, because these social media monopolies are worse than Standard Oil was before the nation began taking anti-trust and anti-monopolization laws seriously and enforcing them. If these massive billionaire-making organizations cannot be ethical stewards of their good fortune, it is time for society to break them up and regulate them as the publishers they are. I back Liz and Bernie on this. The others are just posers in my view.
Jill (Michigan)
That a parent could so betray their children is enraging. That is what the death penalty should be for. It's time for computer warriors to take on the pedophiles and companies that shield them.
Seth (Salt Lake City)
Stop expecting tech companies to have morals. Our elected officials in Washington could do something about this and choose not to. Why do we continue to act surprised when profits trump human value?
Adele (Montreal)
Beyond disgusting. Technology companies could eradicate this if they wanted to. It should be a condition of their being allowed to operate.
Eric (Texas)
It's strange that everybody is blaming the big tech companies while ignoring all the free porn sites, which put out videos and images in volumes that rival the tech giants'. That's probably where the focus should be. Most of the source material is probably appearing on porn sites first and then getting posted to social media.
Rpasea (Hong Kong)
If NYT can find these files, there is absolutely no reason tech giants can't if they really wanted to. Shameful and time for a class action lawsuit to get their attention.
Adam (NY)
It's not just that certain websites don't filter content, some such as 4chan and even reddit know that content such as this exists but seem to actively turn a blind eye because they want the extra visitors... Don't believe me? Just browse these sites long enough and you're bound to find content like this and even have difficulty getting them to remove it.
BadMexHombre (Merida)
I became physically ill reading this report. It's unfathomable to me that there are people who are this evil and demented, who would prey upon children.
Cary Mom (Raleigh)
How about we do something useful and institute the death penalty for pedophiles? Without exception. That might be a good start.
Marsha Boone (Washoe Valley, NV)
As a long time professional photographer, child advocate and a personal survivor of child sexual abuse, I am stunned and speechless. The scope of this problem includes me professionally.
Chris (CT)
We have to fight this, maybe start by: 1) Imposing harsh financial penalties for platforms that host, share, transfer these images. If they are profiting from people using their spaces for this, they need to pay big fines. 2) Tech companies need to commit to a major initiative that scrubs all child porn from every virtual corner within their reach. If they can keep track of billing millions of people, they can keep track of this type of content. No excuses. 3) People who have been exploited need representation to sue these companies. Community standards no longer matter in our fast-paced income-driven ever-changing world, and sadly, in a world with many sick people. The only thing companies answer to is financial pain. Make these companies pay.
Joseph A Losi (Seattle, Wa)
I commend the Times for pushing this issue. I have 2 questions and a follow-up observation. First, is Congress considering any legislation aimed at holding tech companies to a higher standard, not a voluntary self-policing standard, but a standard with legal and financial teeth? Second, is the NYT willing to push further into the foundations of what psychological forces, both individual and relational, conjoin to produce this type of abhorrent and ill behavior? If the NYT has addressed this crucial question, I'd appreciate a link(s). My thesis is that pedophiles are looking for some type of relief from their own embedded pain, perhaps from their own traumatic attachment wounds. Now, with the spread of social media, they are looking to be witnessed and condoned, in a fashion, by others they feel a kinship with and support from. Have psychological profiles of men like William Augusta Byers been done? If so, I want to read them. Certainly, there are steps that tech companies must be forced to take to limit the spread of this vile content. But to limit our action only to triage misses what we can learn about how this behavior becomes manifest, in the hope of intervening at the source. This is not only a problem of stopping the spread or locking away the offenders, but a societal problem of how we as a society play a role in incubating this evil. In my humble judgment, that is a very tough question we must face ourselves, with courage, honesty, and vulnerability.
Anna (Ohio)
This is horrifying. If a store owner were found to be peddling images and videos such as this, they'd be charged as complicit in the distribution of child pornography. Why is it that the tech companies get a pass when they have the ability to detect many of these images and videos? If they can detect it and do nothing about it, they're complicit. They should be ashamed of themselves and we should pass laws that would allow them to be prosecuted along with the individuals who upload or download this content. Side note: Please, parents, for the love of God, stop posting "cute" bath-time or bare butt photos of your babies and toddlers on the internet. All the privacy settings in the world won't stop someone else in your circle from screen-shotting your FB or Instagram and sharing those same images as masturbation fodder for pedophiles.
Maggie (U.S.A.)
And parents, for once, need to start raising better sons.
LM (NYC)
The examples of child abuse described in this article are beyond horrific. They are mortifying. There is an underlying sickness in the brains of these perpetrators. The fact that they want to post photos and videos makes it even sicker. I think the problem lies not only with the technology companies, but with the human race. Who is raising these monsters? How do they become monsters? Address that, then address the total ineptness of these technology companies.
Henry Ott (Las Vegas, NV)
If law enforcement agencies could spend yearly 1% of what tech companies or the Department of Defense devote monthly to research and development, untold misery would be alleviated.
Daphne (East Coast)
The Times is all over the map on digital privacy. How is this any different from a government or private agency scanning all photographs or other content to identify whatever characteristics they deem a threat or valuable? One day you encourage encryption and the use of VPNs etc. to protect yourself. You commend Apple and other companies that take steps to protect users' content. The next day everything should be scanned because one image in a billion or less may be illegal. Child abuse of all kinds, and the other terrible things that are documented and shared on the internet, is a scourge, yet the rights of the billion still outweigh the rights of the few.
rprasad (boston)
Kudos nytimes. This article will definitely fast-forward big tech's efforts. They have the $ and talent to do much more damage control--to spend a fraction of a fraction of a side budget to be good citizens. (This summer, I remember talking to people who had worked on a small govt.-funded project in this area, e.g. employing machine learning to recognize these types of pictures--and their sources. Hours of working with "training sets" of images and constantly re-checking the photos to adjust the recognition code left long-time programmers and theorists in tears and despair--but it worked, and these were small teams on shoestring budgets--nothing even close to big-tech resources.)
rprasad (boston)
Maybe not a "fraction of a fraction" of a budget but if the will is there, they can do it.
Maggie (U.S.A.)
Nothing changes until most, if not all, of society finally admits misogyny is the root evil everywhere, enabling the majority of males to excuse sexual predation, rape and even murder of females. It's unfathomable to most females that this is so, even as conservative stats indicate 1 in 5 females has been sexually assaulted, at least 18 million right now living with that trauma - much of it memorialized by males on the internet. More than 82% of juvenile victims are female, and more than 90% of adult rape victims are female. Again, more and more of it is for profit on the internet, as well as for the bragging rights of those boys and men. And those numbers are low, reflecting only the reported assaults and rapes. https://www.rainn.org/statistics/victims-sexual-violence
Charlie Messing (Burlington, VT)
This is horrible. With all that algorithm research I am SURE they could put an end to this kind of thing. If they can search for faces, can they not search for children without clothes? I have no idea, but I bet they could - and should.
Io Lightning (CA)
Basically all porn is harmful. Images of child sexual abuse is especially terrible, obviously, but it's also part of a continuum where we say it's ok to exploit some vulnerable people but not others. And, unfortunately, many people (let's face it: men) who get addicted to milder adult porn escalate into more gonzo and younger tropes. There is no reliable way to guarantee that even "amateur" porn is without victims, e.g. the posting of revenge and non-consensual porn. As a society, we need to stop saying porn is acceptable.
Chuck (CA)
Add this to the laundry list of irresponsible and derelict behavior by the social media tech firms. The only positive here is that these disgusting people are using social media and as such could be caught and dealt with... if tech companies actually cared. The downside is that as soon as the internet materially clamps down... these disgusting people will simply pursue other channels. That said... they have been present on the internet since it first went live... so they are as persistent an issue as cockroaches.
Rob-Chemist (Colorado)
This article highlights one of the massive contradictions of the internet. On the one hand, we demand enhanced privacy protection (encrypted messages/data, do not follow our links, etc.). On the other, we want, and rightfully so, this sort of garbage removed from the internet. These two goals directly oppose each other since enhanced individual "privacy" makes it virtually impossible to remove the latter. It is unclear to me, and probably to the various internet companies, how you do both simultaneously.
SYJ (USA)
The depravity of these people, our fellow human beings, is soul-crushing. I weep for the lost innocence of these poor children.
JES (Des Moines)
This is a horror show. It is beyond comprehension that something could be done about this and it's not being done. It is beyond comprehension that there are men as sick as the ones described. I pray for the victims. My heart goes out to them. I hope deeply for healing. It's not your fault!
Andy (Salt Lake City, Utah)
Is technology enabling childhood abuse or is technology simply making us more aware of childhood abuse that already existed? You'd never know that father abused his children if he hadn't posted the video online. I'm inclined to say this isn't a strictly technological problem.
Maria (ny-ny)
And your inclination would be wrong: the torture/abuse described in this article has a market- made exponentially larger by the internet.
Nick (California)
As a westernized society built on notions of inalienable rights and free speech, we need to have a hard reckoning with broad notions of free speech and liberties. Unless we want to live in a surveillance society similar to China, we need to redefine what it means to be a self-governing people. That means people are going to have to ask themselves if every little fetish or sexual fantasy they have is permissible. The internet is a great big collective id. We are going to have to learn to control ourselves. All this talk of searching for adult porn on the internet leads me to think that it is just a slippery slope to these horrors inflicted on children. Don't tell me it's not. It is. People's consumption of porn and our overly sexualized society feed these degenerate behaviors. Until we accept this, we are either going to live under the "tyranny" of a twisted notion of freedom or a real life autocracy. These crimes are so deplorable. My heart breaks for these children and I can't believe the thirst I have for the punishments these real life monsters deserve. It scares me what I want to see done to them. We need to grow up as a society. And quick.
R. Spencer (New York)
It’s good to see that the Times is still focusing on this issue. It needs to remain in the forefront in order to get the attention it deserves. It appears that tech companies are still behind in their efforts to police these images. I can only speak from my experience, but I know for a fact that AOL did scan for these types of images. I ran the scanning process for eight years before being laid off in 2015. We used PhotoDNA, but not as our primary scanning tool. It was unreliable at the time and we got false hits from it. Our process, called IDFP, differed because it scanned for exact matches in our hash database. It also had the ability to scan for videos and did so successfully on numerous occasions. The images, along with information on the accounts that sent them, were reported to the National Center for Missing & Exploited Children, who forwarded them to law enforcement. We then permanently removed the image files from AOL’s servers. Our efforts led to numerous arrests and convictions. This was the practice that was in place while I was there. After I left, my understanding was that this practice remained in place. At least that’s my understanding from speaking to people who still worked there the last time I spoke to them. Since the company has been taken over by Verizon I have no idea what the official policy is. During my time there we also made our hash database available to other ISPs, including Google. My understanding was that Google didn’t use it.
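The exact-match scanning this commenter describes can be sketched in a few lines of Python. This is a hypothetical illustration of the general approach, not AOL's actual IDFP internals: the names `KNOWN_HASHES` and `scan_upload` are invented for the example, and a real pipeline would also file a report with NCMEC on a match.

```python
import hashlib

# Database of hex digests of previously catalogued files (illustrative).
KNOWN_HASHES = set()

def file_fingerprint(data: bytes) -> str:
    """Cryptographic fingerprint of the raw file bytes.

    Any single changed byte yields a completely different digest, which is
    why exact matching catches identical copies but misses re-encoded,
    resized, or cropped versions (the gap PhotoDNA-style perceptual
    hashing tries to close).
    """
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Return True when an upload exactly matches a catalogued file."""
    return file_fingerprint(data) in KNOWN_HASHES
```

The trade-off the comment hints at (false hits from PhotoDNA versus an exact-match database) follows directly from this design: exact hashes essentially never false-positive, but only similarity-based hashes can catch altered copies.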
RMD (East Bay)
Thank you for continuing to shed light on how Silicon Valley functions (or doesn't) and the results of its amoral indifference on our politics, our public discourse, and on individuals. Why are we allowing this to happen? Do most people not realize that "Silicon Valley" is extraordinarily immature as a culture? That so many of these companies have a culture of willful amorality, cloaked in libertarian beliefs and notions of personal power? That narcissism pervades the place? That the people at the top are at times flat-out antisocial? I lived there for a long time. The decent people, who are of course the vast majority, and their prosocial impulses are overwhelmed by those who are really in charge. Yes, I know there's a show that makes it all seem funny and charming. Perhaps that's why the show exists. But individuals don't really matter - it's the gestalt of the companies, and what they actually do, that matters. How are we all feeling about that? Again, why are we allowing this to happen?
Carl Stephens (Washington State)
When I upload a picture I have taken, Facebook knows whether the picture contains people, a person, a bird, a bridge, a car, etc. The technology exists to identify what a picture contains. The tech giants have no excuse for letting this happen.
Cathleen (Virginia)
I wonder at a group of tech industry policy makers meeting to discuss, basically, which issue is more important: customer privacy or the safety of children. There should be no debate... but there is.
Grebulocities (Illinois)
One of the many horrifying implications to me is that millions of us have probably unwittingly seen some of these images when searching for adult pornography. Most men with an internet connection, and a large fraction of women as well, search for porn on a regular basis. Of all the pictures and videos of amateur young women, some fraction is going to be of girls under the age of 18 being passed off as 18+. I had thought that this fraction was quite low - low enough that people not actually searching for underage images would be unlikely to run across them - because image-recognition technology should do a fairly good job of detecting dubious images once they are reported. It appears that not only is this not the case, but search engines at times could even direct us toward that content, with Bing likely being the worst in this regard. Most porn consumers - which, again, is most of the men and a large fraction of the women who use the internet - have probably viewed child pornography of girls in the 14-17 age range, possibly on a regular basis. No search engine is making much more than a token effort to remove even the most flagrant child pornography, let alone the sort of thing that users would mistake for legal-age porn. We - myself included - need to be a lot more careful about porn image and video searches.
Robert (Seattle)
Grebulocities writes: "... with Bing likely being the worst in this regard ..." We actually don't know whether Bing is the worst. Our best guess is that sexual predators are making use of all of these online services, the more secure the better. In this case, one particular program that the NY Times created, with a small set of known images, did indeed find problems with Bing. On the other hand, the article notes: "But separate documentation provided by the Canadian center showed that images of child sexual abuse had also been found on Google and that the company had sometimes resisted removing them." Because Google and Facebook, along with all of the companies they have purchased, are essentially monopolies with monopoly market share, this problem on those sites is almost certainly several to many times larger in magnitude than on Bing.
Laura (NYC)
To confirm your point, recently a 15-year-old who had been reported missing was seen in 58 PornHub videos. Here is the story: https://meaww.com/missing-teen-adult-video-pornhub-modelhub-snapchat-periscope
A (Bangkok)
What is the basis for your assertion that "most men with an internet connection, and a large fraction of women... search for porn on a regular basis"? I would imagine it is more like the finding in addiction studies: only 10% in any society succumb to a drug addiction. Are you one of the 10%?
Steve Smith (Easton, PA)
There are not enough millstones for all of these necks.
Kelsey Rodriguez (Saint Paul, MN)
What can we do, as consumers and citizens, to help hold these companies accountable and bring these criminals to justice?
n (a)
Tough read. Good reporting and analysis.
Scott (San Mateo, CA)
This heaps plenty of blame on tech firms for not deleting images and video, but not a single grain of blame is placed on law enforcement. Even if the tech firms develop near-faultless methods, these people ultimately need to be arrested. That clearly isn't happening.
Sarah (London)
This article says these women's photos were found in investigations just this year, so obviously there are ongoing investigations. But if one pervert sends photos to another via an encrypted Apple message, how would the police possibly know?
Scott (San Mateo, CA)
When these technologies didn't exist, and people were mailing photographs and VHS tapes around (something that still happens with DVDs), no one seemed to feel the police were doing a great job catching people. Yes, lack of evidence can be an issue, but it's also just not always a focus.
Melanie (Columbus, Ohio)
The funding for law enforcement investigations on this scale is simply not there. There's also a very high rate of burnout, as you might imagine. Another solution - besides the obvious one of using the tech companies' ability to ID photos - is that they contribute a minuscule percentage of their billions to fund law enforcement's prosecution efforts.
L'historien (Northern california)
great reporting. thanks.
M (Dallas)
I tried to blow the whistle on this approx 12 years ago working for a major SV startup run by a person frequently reported on in the NYT and other outlets. My concerns were brushed off. I was the one responsible for reporting images to the Center for Missing and Exploited Children, and also responsible for the deletion of the content on the site. Every time I said something more needed to be done, I basically got a non-response from leadership. It got to the point where I was recognizing the same children again and again in the photos being reported to me. Believe me when I say it is rampant and has been for many years in the Valley and startups all over the world.
Alex (West Palm Beach)
Wow, what an article. I’m stunned. Perhaps a non-appealable death sentence for these perpetrators would slow it down. I can’t think that anyone caught committing the described horrific acts contributes anything worth preserving to our society. Sometimes one's actions are bad enough to forfeit all rights to live among others.
RealTRUTH (AR)
This is a fantastic article, explaining the process of digital image detection. I wish it did not have to be used for such disgusting content though. Since it is obvious that we possess the ability not only to identify the image but to trace its origin, EVERY source from which these originate should be prosecuted to the fullest extent of the law, in EVERY state and in EVERY country. The deviants who produce and disseminate this stuff are the lowest of the low - Jeffrey Epstein-class. To some this is an "illness" that requires treatment, to many others it is simple perversion and total lack of conscience or morals. Either way, these people must be identified and taken out of society to a place where they can never do this again. ALL ISPs and web sites found to be purveyors should be heavily sanctioned and, if appropriate, shut down. There are limits to even our First Amendment (and, by the way, to the Second Amendment also).
Craig H. (California)
If the rate of prosecution does not keep up with the rate of taking down content, the problem will not resolve. All criminals will be using an alias to set up accounts. Shutting down an account, with no consequence, will only result in a new alias for a new account. Blocking a single image or video, with no criminal consequence, doesn't remove the original stored on someone's personal disk. If I remember correctly from a previous article, the resources devoted to criminal investigation are incredibly small, and the criminal investigators have far more data (images and their IP addresses) than they can process - so they sift through it looking for the worst material, and then a single investigation takes months. If Congress devoted X billion dollars a year to a specialized police force with many thousands of officers, the probability of getting caught would go up and sharing would go down. Hopefully there would be a critical point at which sharing virtually disappears, although that wouldn't delete data on privately owned disks or in unshared accounts; even then, the spores would remain. And this is, after all, a borderless phenomenon.
David (Flushing)
A number of porn sites allow persons to post videos. There have been cases where underage children have done this showing themselves. Often there is a button where one can report an image and it is removed. I suspect many porn distributors do not want to be involved in child pornography. There are, however, websites that deal exclusively with this.
Angry Woman (Bethesda, MD)
We are so done with Google. They are becoming the Evil Empire. Of course, they are not alone.
Earth Citizen (Earth)
They just took over Fitbit too. And you know what else? When the Fitbit community first started a few years ago, it was really made up of avid exercisers. This past year I noticed that unathletic men are using Fitbit to contact women. And now Fitbit being taken over by Google is the final straw. Where are the regulations of these predators (corporations and individuals)?
Alan Burnham (Newport, ME)
This non-action by tech companies is criminal. Time for Washington to do something: make inaction a crime, because it is!