YouTube, the Great Radicalizer

Mar 10, 2018 · 100 comments
Studioroom (Washington DC Area)
I think the important question is, what is the motivation for creating all this phony content and posting it, a great deal of it, to YouTube? Perhaps the answer is to steal elections.
Tim Pat (Nova Scotia)
YouTube, "social media" and the internet in general have brought us squarely into the global village envisioned by Marshall McLuhan long before the advent of any of these electronic phenomena. It demonstrates the downside of "freedom" as promoted particularly by the right. We may not be able to shout "fire!" in a crowded theater, but we're free to set fires at a slight distance and allow the toxic smoke to waft into our homes and workplaces.
Alan (Toronto)
As someone who has worked with the computer classification tools that form a component of systems like YouTube's recommender algorithm, I suspect there might be a simpler, more naive, though not necessarily less insidious, reason behind this drive to extremes. Imagine trying to group videos into a set of bins: you would probably put vegetarian videos in the same bin as vegan videos, and far-right videos in the same or similar bins as other less extreme, but still right-wing, videos. What I strongly suspect you would then find is that each bin ends up being defined by its most extreme content, because that content is furthest from the other bins. If you start watching videos that belong to one bin, the algorithm will naturally tend to lead you towards other videos that best fit its idea of the archetype of that bin, i.e. the most extreme ones. There need not have been any decision on the part of YouTube/Google to favour extreme content because it leads to longer viewing times, though if extreme content really does lead to longer viewing times, then any algorithm that also factors in viewing time will clearly tend to favour extreme content as well. The problem may well be in part that although these algorithms are complicated, they are still, compared to a human, quite simplistic in how they look at things.
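Alan's binning intuition can be made concrete with a toy sketch. Everything below is invented for illustration (one-dimensional "ideology scores", two hand-made bins); it is emphatically not YouTube's actual code, just a minimal demonstration of how picking the item that best separates a bin from its neighbours selects the most extreme item:

```python
# Toy sketch (illustrative numbers only, not YouTube's system): if a
# bin's "archetype" is the item that best separates it from other
# bins, the most extreme item wins.

def centroid(xs):
    return sum(xs) / len(xs)

# One-dimensional "ideology scores" for two bins of videos.
bins = {
    "right": [0.2, 0.3, 0.5, 0.9],   # 0.9 = the far-right video
    "left":  [-0.2, -0.4, -0.6],
}

def archetype(name):
    """Pick the item in bin `name` farthest from every OTHER bin's centroid.

    That item 'defines' the bin best, because it is least confusable
    with neighbouring bins, and it is usually the most extreme one.
    """
    others = [centroid(v) for k, v in bins.items() if k != name]
    return max(bins[name], key=lambda x: min(abs(x - c) for c in others))

print(archetype("right"))  # -> 0.9: the extreme video defines the bin
```

Any recommender that then favours videos close to that archetype will tend to pull viewers toward the extreme end of the bin, with no one ever having decided that it should.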
Josh Rubiin (New York)
Thank you for writing this article. I am not immune to the effect you describe. Machine intelligence has deduced ways to drive my attention, not just attract and hold it. Who is designing controlled experiments to measure this effect? We need strong evidence. Where do I go to discuss measurement and counter-measures?
Antoine (France)
I would like to develop a few ideas about this article. First of all, I think the author fails to consider a crucial point: since YouTube is an open platform with no editorial constraints, we should take into account its main actors, users and video creators. Seen from this angle, the radicalisation phenomenon reflects the shifting ideas of society (the people who make the videos) rather than a biased algorithm. The machine is just leading users to trending or related videos, all made by humans; the radicalisation of content thus emphasises the fact that people are more and more attracted to extreme ideas. Nevertheless, this does not whitewash YouTube's responsibility as the main vector of these radical ideas. The platform, and more generally the internet, is a revolutionary technology: it allows people to interact on a global stage, without any control or restriction, and YouTube is doing no more than exploiting this gigantic traffic to make money. However, since it holds a near-monopoly, it has gained unprecedented influence and power over our society and, as this article shows, is now a major actor in political life. As the author rightly argues, it is very concerning to leave such power unregulated.
Jeff (Evanston, IL)
It all comes down to critical thinking. Our schools are failing us in this respect. When we see a YouTube video or any other video or any news article that makes suspicious claims, we should ask ourselves: Can this be right? Instead, a good number of people gobble it up and move on to an even more extreme version. Yes, the internet can feed this behavior, but it can also serve as an excellent tool to find out the truth. Critical thinking and research for facts should be a required subject in schools.
Joe Ryan (Bloomington, Indiana)
There's some fascinating research to be done here on what society conceives of as extreme! Is there a transitive ranking for each topic? Or does every topic have a black hole that YouTube moves you to quickly, skipping over intermediate steps? When you watch videos of Haydn performances, do you get recommendations of Mozart, but not the other way around? And then if you watch Mozart and follow the chain of recommendations, where do you eventually end up? (Repeat for other domains.)
DWS (Dallas, TX)
The very feature you object to, digging deeper into subjects, has huge value in many, one might argue all, subjects. For instance, you have a question about repairing your broken dishwasher; without too much effort you discover several videos discussing common failures and their resolution. You quickly identify the failure and realize you can repair the appliance yourself for a tenth the price of engaging a repairman. The algorithms don't slavishly drive the viewer to increasingly radical content. The viewer is not entirely passive; it's their selection of the next iteration of content that steers the investigation. This is just as true for computer issues, ordinary differential equations, hydroponics, sports and Monty Python as it is for conspiracy theories. You don't appreciate the video content? There's always the like or dislike button, as well as the option to dismiss a recommended video before even selecting it. And for extreme content that should be banned? Report it.
Jon B (Long Island)
And this doesn't even touch on the massive copyright infringement Google/ Youtube engages in on a daily basis, taking advantage of a loophole in the obsolete 1998 DMCA. But Google has so much influence in the US Gov't, and in general, that it is virtually above the law.
Neildsmith (Kansas City)
An interesting experiment, I suppose, regarding autoplay and recommended content, but how many people watch YouTube that way? I don't. When looking at content online, we all have to make our own choices about what to read and watch. If you click on junk, well, whose fault is that? If people really do rely on these programs to feed them "news" and content, then I suspect that is the real problem. We all know how to change the channel.
ChesBay (Maryland)
I actually avoid certain extreme sounding website video producers, even if the content sounds interesting, because YouTube is somewhat intuitive. I'd like YT to just automatically offer whatever is news, today, and let users decide, and search for what they want. Not interested in censorship. No RT, and no InfoWars, for me, thanks, but I will choose for myself.
RWeiss (Princeton Junction, NJ)
Since I mainly use YouTube for music and videos of amazing animal behavior, I had no idea that its algorithms could be biased in this malign way for news and political matters. If these observations are valid, Google's famous governing maxim of "don't be evil" is a big, big lie.
T.H. Wells (Los Angeles)
There's no surprise that the YouTube algorithm cited by the professor prompted Trump voters who watched videos of him to look at white supremacy websites, even though many Trump supporters are not that extreme -- shall we say, not quite that black and white? -- in their conservative beliefs. But it is simply wrong to state that a radicalized version of Clinton's or Bernie Sanders's views would logically morph into "...the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11." That sort of bilge flows from the same fountain that produced "birtherism," talk of a Deep State, and so on. When leftists complain of CIA manipulations of foreign governments, we are talking about things that actually happened. Certainly some on the left spin bizarre interpretations of reality -- this article correctly includes the so-called "anti-vaxxers," who believe without a shred of evidence that immunizations cause autism. Some of us are prone to extremes in political correctness and become intolerant while trying to push society toward greater inclusivity. There are other examples. But don't tell me that it was extreme lefties who thought up the brilliant idea that the Twin Towers were blown up by the government in order to justify greater military interventions in the oil-rich Middle East. That is called false equivalency.
emm305 (SC)
From the very first PC, our society has put everything and everyone in tech on a pedestal they never deserved. In fact, I wonder if there are more sociopaths in tech companies than on Wall Street. Google, Facebook and Twitter all exist, like all media, to sell ads. They're no better than what we used to call "the boob tube," but without the regulation that all other media have. Europe seems to have a much better grasp of the inherent evil in how these companies operate, and we need to pay attention to what they're doing to get control of them.
NS (Southeastern, PA)
There is an amazing idiocy in depending on what is popularly known as "AI" to make money. We are very far from achieving anything like true artificial intelligence, because there is no evidence that computers, even digesting mass quantities of data ("big data"), are in any way "conscious". Just as actual intelligence depends on being conscious, the same criterion should apply to any form of machine intelligence. Alan Turing's "Turing test" holds that a machine that responds in a manner indistinguishable from a human (that determination being made by humans) could be said to be as conscious as a human. I feel the Turing test is flawed in that even if the output of such a machine were able to mimic a human's output, the machine by necessity reached its seemingly human output by completely different means than a human. To do otherwise, the machine would have to have real-world experience, be aware of a lifetime's worth of memories, and so on, which at present is impossible. So an autonomous vehicle does not have "AI"; it can operate only within a subset of conditions that pertain to driving, which would be more accurately called "computer-assisted driving". In the same manner, YouTube or Netflix decisions are just human guesses implemented in computer programs. The rules could be based on anything, because they are treated as proprietary secrets even when they are contrary to the public interest. This is only one of the ways we are losing control of our society in general.
DornDiego (San Diego)
I wish I could send out a global message to consumers, the politically minded and all who consider themselves intellectuals that said, READ THIS.
PogoWasRight (florida)
Speaking of the spreading of lies... by amateurs, no less. As a longtime liberal I have always been critical of Donald Trump, but recently I have found myself being more accepting of his words and some of his deeds, mainly because he has, simply, been doing as he promised. And then today my beliefs all came tumbling down with the interview with a nobody called Nunberg. How Trump could allow this junior know-nothing to speak for the presidency is beyond me. Was he "on" something? What an embarrassment, and he has reclaimed Trump's "nothingness". YouTube and the other social media Republican "position machines" should have caught him at the gate and shut him up. They did not. And now, with his so-called "rally" speech in Pennsylvania, Trump has once again crumbled into "word-nothingness". How sad for us, and for the U.S.
hen3ry (Westchester, NY)
Given what we've seen on Amazon and other sites when it comes to recommendations, this should not be a big surprise. Amazon tends to recommend more of the same no matter what we look at: if you are looking at books or anything concerning communism, there is a push to get you to look at more about communism. YouTube is following the pack. What I resent are the suggestions that if I found this, I should look at that, or when I look at something online and it shows up in my ads on the NY Times. I miss the card catalogs we used to have in libraries. Google and other search engines are not nearly as accurate as we can be when we search; in fact I can think of many times YouTube, Google, Amazon, etc., have brought back results that were completely irrelevant but probably popular. Given the ubiquitous nature of tracking on the internet, I find I'm hesitant to use it for searches on certain topics. It's too easy for my curiosity (and that of others) to be misread by search engines, and for an assumption to be made that I'm a fanatic, going to assemble a bomb or worse. I've found the internet to be a stifling place because we cannot freely follow our intellectual interests without worrying about who will decide we're a threat to the country. The algorithms do not lead to a better experience online. They aren't designed for that, or to help us extend our knowledge. What a shame that what we really get for our efforts is garbage.
Tom Maguire (Connecticut)
Interesting. But a more benign explanation would be this: let's assume YouTube simply tracked a random group of viewers of, e.g., flu vaccine videos. Most users watch a video or two and then revert to Cute Cats or Adorable Dogs, or whatever YouTube is best at. But a few viewers become deeply engaged, searching for more info on vaccines. And since few viewers will sit through twenty videos on the safety of vaccines, the most conspiratorially minded drive the "deeply engaged" search results. And there we are. Any simple pattern recognition, whether human or "AI", will notice the newly blazed trail to the rabbit hole and try to steer users towards it.
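Tom Maguire's scenario is easy to simulate. The sketch below uses invented session data (hypothetical "vaccine_basics" and "vaccine_conspiracy" videos, made-up cohort sizes) to show how a small, deeply engaged cohort can dominate raw view counts, which is all a naive "up next" ranking might look at:

```python
# Hedged sketch of the commenter's point: if "what do engaged viewers
# watch next?" drives ranking, a small obsessive cohort dominates the
# signal. All numbers are made up for illustration.
from collections import Counter

# Videos watched per session: 90 casual viewers watch one video each;
# 10 deeply engaged viewers watch twenty videos each.
casual_sessions = [["vaccine_basics"]] * 90
engaged_sessions = [["vaccine_basics"] + ["vaccine_conspiracy"] * 19] * 10

views = Counter()
for session in casual_sessions + engaged_sessions:
    views.update(session)

print(views.most_common())
# [('vaccine_conspiracy', 190), ('vaccine_basics', 100)]
# Ten heavy users outweigh ninety casual ones in raw view counts, so a
# view-count-driven "up next" list skews conspiratorial.
```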
Frank Shifreen (New York)
Yes, finally a voice from the wilderness. I have noticed similar issues with YouTube. I was watching Jordan Peterson and Sam Harris videos at one point, and even though I did not subscribe, I started seeing recommendations for many other, more extreme videos; my alerts were flooded with them. I subscribe to science and tai chi channels but never saw recommendations for that kind of content. I stopped watching. What does Google say about this? I thought their motto was "Don't be evil"?
Pete (West Hartford)
Great restaurant analogy. As for inevitability, I disagree, because the 'marketplace of ideas' cannot be regulated... just as it's almost impossible to regulate how much fat, sugar and salt restaurants and food manufacturers can serve. You have to self-regulate, and few people can.
Ikw (Washington DC)
YouTube’s algorithm can act like a virtual drug dealer, catering to my worst habits and instincts. For example, it predicts that I want to watch sports talk shows, even though I would try to avoid watching them if I typed in the searches myself. Once I’m in that loop, it can only get worse, not unlike drug addiction. Other times, the algorithm becomes a virtuous cycle of Beethoven to Bach to Debussy, MSNBC to PBS to CNN. Either way, it reduces my conscious decision making and inflates my echo chamber.
Geoffrey James (Toronto)
Some European countries are teaching media literacy in schools. It would probably be seen in the US as an inexcusable intrusion of the state.
Paul Shindler (NH)
In physical wars of the past, killing technology advanced more quickly than people understood it and caused massive unnecessary deaths, as we saw in the Civil War and later wars. Now, in the information wars, we are seeing the same thing. Hillary Clinton, after the recent election, correctly pointed out that Facebook, purveyor of Russian fake news, had a lot to answer for. She was right. Trump has proven himself to be the master of social media and the very effective cheap-shot personal attack. He is our first Twitter president, the one nobody saw coming. And as Michael Moore keeps pointing out, underestimate him at your own risk. As a recent Times op-ed pointed out, the Democrats need to quickly up their game in this new arena. The future of democracy is at stake.
JC (Oregon)
Well, I subscribed to YouTube Red (no commercials), but I am still unhappy with the feeds. This is especially problematic (to me at least) on a small cell phone screen. You would think YouTube would leave subscribers alone. I just hope YouTube can be "smarter"! At least YouTube should ask me what I want to see before feeding me things.
Traymn (Minnesota)
Google-owned YouTube is by far the world's dominant video platform, Google Search has more than 80% of the search market, and the Google-owned Android operating system is on 85% of the world's smartphones. Perhaps the world might be better off splitting these up.
Rene Roger Tissot (Canada)
Excellent article. Completely agree with your arguments. YouTube is a lazy addiction...
Andrew Kelm (Toronto)
Maybe YouTube recommendations should be included as part of dating profiles -- "I like Stephen Colbert, documentaries on megalithic stone walls and long walks in the moon-landing conspiracies." Great article!
Richard (USA)
Whatever YouTube recommends, it's still up to the individual user whether he or she wants to watch it. This article is just one of countless symptoms of the real disorder - people continue to be 'informed' by less-than-reputable sources. Don't get your news from YouTube, Facebook, Twitter or the like.
Mary Rose Kent (Oregon)
I don't have a television, so I use YouTube to watch the news. Unfortunately, MSNBC seems reluctant to offer up Rachel Maddow's show, except for teeny-tiny snippets. Same for the spectacular Joy Reid.
John S. (Philadelphia, PA)
Is Ms. Tufekci recommending the thought police monitor and remove videos she deems conspiratorial? No, thank you. Fake news is free speech, like it or not. "But what about the children?!" Regulate your kids' access to the internet. The government is not a parent. You are.
W Rosenthal (East Orange, NJ)
This article was much needed to get people to think about how they are being steered down the garden path toward bad information while online, but I'd also like to point out that questionable editorial decisions are being made all the time by actual human beings in the news media. For instance, all the cable networks showed Trump's Pittsburgh rally last night in which he spewed, unmediated by factcheckers, his usual endless lies. Why do news organizations choose to continue to give a proven liar virtually endless free coverage? Did Obama's public speaking events get this kind of coverage? I don't think so. I guess he didn't lie enough.
mary bardmess (camas wa)
Why I hate Pandora: because I really love Mozart, I want to hear Mozart. I don't want a random selection of Mozart-like music. I want Mozart. I'm not an idiot. I know what I want. I'm also busy. The grocery store does the same thing. Shoppers are enticed down aisles that are blocked by special displays that slow you down. The shareholders want you to enter hungry, having forgotten your list at home, wandering around picking up stuff that looks good. Those are the best customers. This marketing model is as old as civilization, but technology makes it ever more attractive and potentially addictive. Addicts are the sellers' gold standard for a customer base. Mindfulness is their enemy.
LBN (Utah)
So here is a YouTube version of journalism. Without any actual data, we learn that the average American is basically a dolt, subject to believing pretty much anything on YouTube. Does the Times truly subscribe to this dark view of humanity? Isn't this the sort of thing right-wingers say? And then, in an attempt to support this viewpoint, we are subjected to an amusing dollop of pseudo-science, as bad as anything on YouTube. Do we really believe Paleolithic peoples experienced a drive state for processed sugar? While it is true that certain deficiencies (e.g., salt) may trigger homeostatic mechanisms altering food intake, they are hardly "addictions." And we all know the changing science on fat intake and fat utility has continued to alter our modern view. While it is true that much of America receives its news from late-night comedians (whom the Times covers avidly, incidentally, and why not mention this?), shame on anyone who accepts Twitter, Facebook et al. at face value.
CP (NJ)
YouTube's default should not be autoplay; that would help a lot. Autoplay is one of the worst so-called innovations YouTube has come up with yet.
Frederick (California)
Decades ago, when I was a computer science student I took a class in what was at the time a rather futuristic concept called 'Digital Media'. The professor was a published fiction writer with a background in journalism. She would often use a rather ambiguous phrase to warn us about the power of digital media; where decisions regarding aesthetic validity and social relevance would be determined not by writers, but by programmers. I paraphrase here, but it went something like this: "If the information you use to make your decision was given to you for free, you were MEANT to get that information." I pay for the information delivered to me by the NYT. YouTube info is free. I don't always agree with editorial decisions of the NYT. But at least there ARE editorial decisions. YouTube is just a bunch of programmers, and they suck at editorial decisions. I know this for a fact, because I am a programmer.
erikah (Mass.)
YouTube is a forum. Many valuable, informative and wonderful videos can be found there, as well as a whole lot of hooey. I liked the junk food analogy. The food industry has discovered how to lace our food with sugar and chemicals that make its offerings hyper-palatable and addictive. This has sucked so many in modern society into a life of compulsive eating, obesity and ill health, while making a bundle of profits for the food and chemical companies. We all need to learn how to face the commercial food world and make some healthy choices in order to survive and thrive. The same thing can be said about our consumption of news, analysis and pop culture. Critical thinking skills matter more than ever, yet this higher-order thinking is as rare as healthy eating. Perhaps parents and teachers will be able to use the fake-news environment as a teachable moment and raise a new generation that is more aware of propaganda and junk food for the mind. We all need to be able to ask ourselves, 'Is this true, real and healthy?' before we swallow it.
Kirk Bready (Tennessee)
I tend to be picky and critical, so I go for months without seeing TV. But I often use YouTube for entertainment, mostly music and nature studies. I like the way it responds to my selection history and narrows its vast database to quickly suggest options in profiled categories, which speeds my discovery of enjoyable new material. But it would never occur to me to allow that obviously biased algorithmic process to filter "news" and commentary, much of which I find highly questionable anyway.
vacciniumovatum (Seattle)
I have been ignoring YouTube's recommendations for as long as I have used the service to watch specific videos (that is, I don't use it as a notable form of entertainment), just the way I ignore ads in services where I cannot totally ad-block. I thought everyone else did the same. Guess not...
A. T. Cleary (NY)
This is a fascinating and well-written explanation of the "recommended" feature on YouTube. It's also a bit creepy that it tends toward pushing viewers to more and more extreme content. That said, maybe it's as well to remember that you don't have to watch everything the algorithm suggests. Like TV, you can just switch it off.
Peter Schneider (Berlin, Germany)
Many extremist videos are readily recognizable, the same way a tabloid is recognizable. The skill is called media competence: the ability to analyze what we read, including formal aspects like language, presentation, separation of information and commentary, providing sources for claims, etc. It is taught in good schools for press and literature. Media competence for video and social media must also become a mandatory part of school education. We must prepare our children's immune systems against things that try to get into their heads.
Paul Adams (Stony Brook)
The problem is with capitalism, and human nature, not with Youtube. From Google's point of view the more views the better - just the same as for the food, TV etc industries. The cure is rationality.
BarryG (SiValley)
Cool. The danger of AI is not that it will become malevolent, but that it will become wholly indifferent. This AI was just doing its job, not trying to destroy the social fabric, just completely indifferent to such destruction. Just doing its duty.
ZA (Branchburg, NJ)
Seeking truth, finding truth, understanding truth has always been difficult. That's why we educate. Before television and the internet, all manner of content was produced in small newspapers and political pamphlets, and unscrupulous salesmen and politicians were ubiquitous. YouTube allows a modern version of hucksterism and is clearly not a reliable source for political content unless you are savvy enough to sort through the nonsense. I do like its recommendations for music videos, though.
Josh Dougherty (Brooklyn)
I think this article is really misguided and ultimately a call for censorship. Is there a lot of "extreme" or radical content on YT? Of course. There's all sorts of content. It's an open platform for anyone in the public to share their views. This means you're going to get a lot of perspectives outside the mainstream, partly because people with "radical" views are going to be more eager than others to tell the world about them, and because such people generally do not get a platform in more traditional media. YT is where they go because it's open to them, while other media aren't. This is part of why so many people like YT: you get to see a wider range of content and perspectives than you'd see on TV or other media. Also, the article treats "extreme" content as if it's some kind of poison that should be walled off from "non-extreme" content. YT is filled with all sorts of content, which means you're going to run into both "extreme" and otherwise if you're watching on autoplay. It also works the opposite way: I start watching an "extreme" video about a current news topic, and pretty soon I'm watching a CNN segment on the same topic. This doesn't prove that YT is pushing "non-extreme" content. The content and recommendations tend to be driven by what topics are covered (regardless of how "extremely") and what's popular with other viewers. It's largely user-driven, just like the content in the first place, as it should be.
David Gold (Palo Alto)
OMG! I have exactly the same problem. In fact, I deliberately watch some boring videos every now and then just to get the STUPID recommender on track. No, I don't think the AI program is being clever and keeping me longer on YouTube. It is just too stupid to understand nuance and to really evaluate my taste. I doubt that it will ever improve much.
iain mackenzie (UK)
Isn't this what advertisers and the media have been getting away with for decades? They have always claimed in their defence that "people have choice". But in truth, they invest billions in psychological research to discover our vulnerabilities and how best to exploit them. They also rely on the fact that most of us choose not to choose. Happily for their investors, most of us choose to remain "comfortably numb" ("Comfortably Numb", Pink Floyd, The Wall, 1979). Hence, Trump.
Riccardo (Montreal)
Following, unedited, world events and movements on YouTube is foolhardy, and could lead you into very, very dark corners. My suspicions are confirmed by this story. Following YouTube-fed "news" is foolhardy because its "choices" are based on the number of viewers, not on content. One is led to think, correctly or incorrectly, that because something is on the internet and has had millions of views or "hits," it has significance. I'm afraid that at this saturation point in our tech history we are now forced to recognize what AI can actually do TO us, its inventors. First, it has no taste, and if it has a sense of humour, which I seriously doubt, it's often very silly, or mean and mendacious. It lacks civility, manners. The working "attitude" of these AI algorithms is apparently, and primarily, selfish (do we dare say capitalistic and power-seeking?), an attitude that has been embedded in their tiny mechanical brains. YouTube cannot vet every submission either, which is immediately obvious to anyone bewildered by the sheer number of totally mindless and often provocative YT "recommendations." How could many NOT have been deleted just on the basis of common sense? In short, as they said years ago and as will always apply: buyer beware, now more than ever. We must be OUR OWN editors; we certainly can't depend on Google anymore. That's why it's advisable to stick to reliable sources like the NYTimes (bravo!), whose editors will never be replaced by robots.
older and wiser (NY, NY)
No one is forcing you to follow their recommendations. Moreover, no one is forcing you to watch YouTube at all. It has so much free content, from educational to political to musical. You have choices.
mary bardmess (camas wa)
I hate being marketed to. I know buying and selling makes the world turn, but when I walk to the library to search for and check out a book to read, life feels good.
Ken (New York)
This is spot on right. Consuming information on the internet requires people to be conscious about what they're reading and watching.
Studioroom (Washington DC Area)
It's not that simple. What about the content being created and added to YouTube? There is a great quantity of garbage being put on YouTube, from copyright infringement to crazy conspiracies. I have met people who believed in the Jade Helm story, crazy crazy stuff. Who is creating this content and why? That's what's important to know. It's hard to believe there isn't a (profit) motive for creating it.
ChrisQ (Switzerland)
The recommender algorithm tries to guess our brain's desires. It shows that our brains want extreme content, such as far-left or far-right material and conspiracy theories. It's that simple. YouTube is just as bad as humans are.
Laird Wilcox (Kansas City, MO)
This is a sophisticated and rational-sounding argument that could have been used by any oligarch, dictator, authoritarian or totalitarian regime through history that was capable of compiling it. The essence of the argument is that people are too stupid to make their own decisions, that media must be controlled in order to keep power in the hands of the elites who "know better," and that the "natural tendencies" of human beings are subversive to the established order. This argument largely comes from the writings of Cass Sunstein who has written extensively on the need for censorship, media sabotage and "cognitive infiltration" of information sources to ensure that people have politically correct values, opinions and beliefs. He might regard the sentence above as an example of a "conspiracy theory" when in fact it's a statement of easily determined fact. Unless Americans -- and people everywhere -- are able to retain a free and uncensored media they are prey for exploitation and submission to elites able to utilize the control rationale outlined in this article. While it is true that human beings generally may make mistakes, such as too much fat and sugar in their diet, it is specious to suggest that this implies that their political judgments are in need of state control. The tendency to investigate, question, examine, probe, search for motives and so on is a key characteristic of good investigative journalism. To deny this to the general public is unconscionable.
Greg Harper (Emeryville, CA)
It sounds like unaccountable political money had the same effect on YouTube as it did on Congress and elections in general.
james (portland)
Here's a two-part solution: 1) reinstate the Fairness Doctrine, and 2) Google, Facebook, Youtube, Twitter, etc, ... are news agencies or media outlets that need regulation.
Joe (Iowa)
The problem with fairness is someone has to decide what's fair. I'd rather be free to choose what to watch.
ando arike (Brooklyn, NY)
So our "natural tendencies need to be vigilantly monitored," lest our "curiosity lead us astray" down the "rabbit hole of extremism." Am I the only reader who hears totalitarian overtones in these phrases? The only reader who wonders about which government agency will be responsible for determining which videos and websites need to be censored? So who is to be the judge of which ideas are "extremist"? Is that what the NSA is gearing up for, building out its storage capacity? It's an astonishing feature of our particular political moment that such ideas are offered by a putative liberal, an information specialist opposed to Donald Trump's "fascism."
mary bardmess (camas wa)
I sincerely hope you are the only reader who hears "totalitarian overtones" because it is seriously paranoid. Between the private sector and the public sector I will continue to prefer the one that allows me to vote for representation. At least then there is some hope of breaking up the giant trusts. With any luck and a lot of effort we can elect a government that will represent the public's interests, one that is willing to regulate and even break up the power of these mega-corporations.
Dominic (Minneapolis)
The author is asking a simple question: what kind of culture do we want? In the past, the point of human culture was to knit the group together and help ensure our mutual survival. That is no longer the point of our culture, which now seeks to exploit and fool its own members for maximum profit. If that's OK with you, fine. Some of us would like to live in a truly pro-life culture.
David Potenziani (Durham, NC)
While YouTube and its parent Google and its parent Alphabet make money on our clicks, the purveyors of radical content make money as well. There are easily found lessons online, courtesy of Google and (of course) YouTube, that offer ways to monetize your video stream. Those who create and offer these radical views are thus incentivized and enabled to make more. The money they receive from YouTube only makes their knives sharper.
Pete Haggerty (Canton, CT)
Just one more example of policy driven by greed and the exploitation of fear and loathing in America. More and more I feel like a lemming being swept along.
Turgan (New York City, NY)
To my understanding, Zeynep Tufekci's article suggests we consider YouTube as being in the same category as the world's most financially successful junk-food restaurant chain, and hence pressure YouTube to change its algorithm's recipe in a healthier direction. Shouldn't we instead leave YouTube's business to YouTube and encourage, expect and support competitors with healthier algorithms to emerge? Where are they? Until then, it seems we are all at the mercy of YouTube's coders, with support from algorithm ethicists.
West Coaster (berkeley,ca)
Ugh. The algorithms are messing with our heads, man. Not cool. As someone who used to write computer algorithms for a living, I can say this is a simple and obvious technique: the YouTube feedback loop will keep sending the user into a narrower and narrower, and by definition more deviant, bandwidth of choices. It's not "Artificial Intelligence"; it's really the opposite, AI as "Artificial Ignorance". Not. Cool.
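The narrowing feedback loop West Coaster describes can be sketched in a few lines. This is a deliberately crude model with a made-up confidence rule, not a claim about YouTube's implementation: each accepted recommendation raises the model's confidence, and higher confidence shrinks the range of videos offered next:

```python
# Crude sketch of a narrowing recommendation loop (invented rule, not
# YouTube's code): confidence grows with every accepted suggestion,
# and the window of offered videos shrinks accordingly.
def recommend(profile, confidence):
    """Offer videos within a window around the profile; the more
    confident the model, the narrower the window."""
    window = 1.0 / (1.0 + confidence)
    return (profile - window, profile + window)

profile, confidence = 0.5, 0.0
for step in range(6):
    lo, hi = recommend(profile, confidence)
    print(f"step {step}: offered range {lo:.2f}..{hi:.2f}")
    confidence += 1.0  # each accepted suggestion narrows the next batch
```

Run it and the offered range shrinks every step: the "bandwidth of choices" narrows exactly as the comment describes.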
FXQ (Cincinnati)
Gee, thanks "mom' for telling us we need to eat our vegetables so we can get some good, nutritious "mind" food from MSNBC, CNN, and FOX and the rest of the wholesome establishment media? Heaven forbid people should get radicalized. No thanks, I'll stick with my progressive YouTube programming like the Jimmy Dore Show, Secular Talk with Kyle Kalinski, The Humanist Report and TYT.
WOID (New York and Vienna)
Help! All I wanted was a nice version of Beethoven's Waldstein Sonata, and before I knew it Google was feeding me Opus 109 and the Diabelli Variations, breaking the Richter scale, as it were. "Natural human desire." "Natural tendencies." That's how two-bit sociologists avoid looking at the roots of human desires and actions.
Tabula Rasa (Monterey Bay)
Accelerant, propellant and booster stage to move the shock and awe with a dollop of disgust to the stratosphere. Drama is as drama does and YouTube makes it mightier than the sword.
Tom (South California)
I usually don't go to YouTube; too much silly stuff. If a news story has a link that helps to explain an important point, then I'll look.
R. Adelman (Philadelphia)
Restaurant managers are right: they are merely serving us what we want. I suspect YouTube's algorithm is programmed to direct people toward the places others who watched a certain video went. People who watched pro-Trump videos also watched white supremacist videos, so the algorithm assumes they are birds of a feather. I don't think it's so much a matter of YouTube curating videos and deciding who gets what as it is of using "trending" and crowdsourcing as the criterion. So the customers themselves are influencing what YouTube recommends next, and YouTube users reinforce the algorithm by accepting and pursuing the suggestions. Therefore, YouTube IS merely serving us what we want... As for me, since I am a Netflix user, I know how stupid algorithms are, and how laughable the suggestions Netflix offers are after you rate a film, so I don't trust algorithms and their recommendations. I just chuckle and move on. Artificial intelligence, beyond fact-based data, is pretty useless, and political opinion, like movie reviews, resides in the abstract theoretical sphere. Google just rolls out what is trending.
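R. Adelman's "birds of a feather" guess corresponds to classic item-to-item co-occurrence counting. Here is a minimal sketch with invented watch histories (the video names and counts are hypothetical, and real systems are far more elaborate); it shows how "people who watched X also watched Y" falls out of nothing more than counting pairs:

```python
# Illustrative item-to-item co-occurrence: the "people who watched X
# also watched Y" logic the commenter suspects. Histories are invented.
from collections import defaultdict
from itertools import combinations

histories = [
    ["trump_rally", "white_supremacy_clip"],
    ["trump_rally", "white_supremacy_clip"],
    ["trump_rally", "border_speech"],
    ["cat_video", "dog_video"],
]

# Count how often each pair of videos appears in the same history.
co = defaultdict(lambda: defaultdict(int))
for h in histories:
    for a, b in combinations(set(h), 2):
        co[a][b] += 1
        co[b][a] += 1

def next_up(video):
    """Recommend whatever co-occurred most often with `video`."""
    return max(co[video], key=co[video].get)

print(next_up("trump_rally"))  # -> 'white_supremacy_clip' (2 co-views vs 1)
```

Nothing here judges content; the suggestion is purely a by-product of what earlier viewers happened to watch together, which is exactly the crowd-sourcing the comment describes.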
Paul Shindler (NH)
Important piece. Frightening.
pjc (Cleveland)
Stop the algorithm, I want to get off.
CAG (San Francisco Bay Area)
Let's be frank about all of this. We reap what we sow. If this is the best of which we are capable, enjoy the ride. We aren't going to make people smarter and honestly, there is no governmental entity that has the capacity to manage our own stupidity. We get what we deserve. Welcome to the world of Donald Trump.
J. Cornelio (Washington, Conn.)
"In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal," the author writes. Well, if human beings are idiotic enough to consume excessive amounts of "sugary, fatty foods," we deserve what we get. This is a failure of an educational system which seems incapable of teaching us, from kindergarten on, what makes-us-tick.
JR (CA)
The symptom, not the disease. At the earliest possible age we must teach critical thinking, which these days might be called cynical thinking. Example: if your source of information feels the need to tell you they are fair and balanced... they are not fair and balanced. Most politicians are trained as attorneys, and we all know the joke about lawyers lying because their lips are moving.
Scott Weil (Chicago)
Oh, it's worse than you think. Zyklon B was originally designed as a pesticide, then used to gas us pesky Jews. Alfred Nobel came up with a way to break up rock formations so engineers could build roads: dynamite. Social media was originally going to flatten the earth, connect people from Chicago to China. Along the way, we collected every transaction, every click. We now have 3,000 attributes for every social media user, so we can find you when we need to. When you hover over a product but don't buy it. When you check out but don't complete the transaction. We will find you, and try to understand why you didn't buy. We can even scan your contact list and your network and, via AI, make it appear that one of your friends is contacting you about that purchase, you know, organically.
barry napach (unknown)
Google does what all corporations do, which is maximize profits. Cigarette companies developed cigarettes to be more addictive; Google seeks more clicks on YouTube, thereby increasing ad revenue. The people at Google know how to appeal to people.
Eli (Tiny Town)
If you look at how some of these fringe-viewpoint videos are tagged, especially their metadata, it's obvious this is just the video version of how clickbait scams work. They're just putting mainstream keywords in the metadata to promote fringe content. The algorithms don't know it's a lie. The YouTube engine is literally just giving you videos with the same tags you started with, just like regular search. A few months back the NYT ran an article about disturbing videos in the kids' section; the same principle is at work. How we process searches is broken.
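Eli's tag-stuffing point is easy to demonstrate. In the sketch below, the video names and tags are invented; a naive recommender that ranks "related" videos purely by tag overlap has no way to tell that a fringe video's mainstream tags are bait:

```python
# Sketch of the tag-stuffing mechanic described above (hypothetical
# videos and tags): ranking "related" videos by shared tags lets a
# fringe video ride mainstream keywords to the top of the list.
videos = {
    "network_news_clip": {"news", "politics", "election"},
    "weather_report":    {"news", "weather"},
    "fringe_conspiracy": {"news", "politics", "election", "truth"},  # stuffed
}

def related(watched):
    """Rank the other videos by how many tags they share with `watched`."""
    tags = videos[watched]
    others = [v for v in videos if v != watched]
    return sorted(others, key=lambda v: len(videos[v] & tags), reverse=True)

print(related("network_news_clip"))
# ['fringe_conspiracy', 'weather_report'] -- the stuffed video shares
# three tags with the news clip, so it tops the "related" list.
```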
NeilG1217 (Berkeley)
The problem of media shocking us did not start with YouTube. Newspapers do it all the time, sometimes with the stories and sometimes just with headlines that make stories sound more shocking. Television does it, too. What's changed is the monopolization of our information sources. Social media have a scope of distribution that television and newspapers never even dreamed of. It's time to break up Google, the way that Standard Oil was broken up. We are probably not ready to regulate its content, but we can regulate its business entity. Some of the successor entities may use the same algorithms, but perhaps some of them will not, and we might get the "marketplace of ideas" that justifies our freedom of speech. If that's not legally possible, here's another option: Give consumers control. Require Google to warn customers about the tendency of its algorithms, so we can make informed decisions about what we watch next, instead of just taking what they give us.
John (Central Florida)
"Newspapers do it all the time, sometimes with the stories and sometimes just with headlines that make stories sound more shocking. Television does it, too." The difference is that newspapers and TV have standards tied to public taste, lowbrow though they may be, nonetheless, there are limits to what passes as acceptable content. I didn't read in the newspaper or see on TV that the Newtown mass killings or other mass murders were hoaxes staged by paid actors. The reason is that such conspiracy theories won't pass even the low standards set by any channel or almost all print media. YouTube changes the game and makes loads of money doing it. It's completely unacceptable and undermines reasoned discourse. It helps create a lot of distrust and animosity, particularly among people who don't possess the tools needed to distinguish between tripe and factual evidence. This is an important op-ed. Other than serious pressure on Google to change the algorithms, I don't know what to do due to first amendment concerns.
DornDiego (San Diego)
The article's not talking about headlines, it's talking about the story content, and fake news is invented, not reported. Journalism exists.
Marc (Utah)
I think this is a very complex issue. Yes, YouTube's AI seems to have learned something profoundly true and ugly about us: that we crave being shocked. But, as one commenter put it, "do we have to follow YouTube's recommendations?" The answer remains no, and regulation seems to me not the right answer. YouTube is a democratic forum, in the sense that anyone can create content. Of course any astute content creator looking to monetize their production will quickly realize that extreme content equals more monetization, but in the end, as sad as it may seem, YouTube's algorithm and the creator are reacting to us. That's right, you and me. Regulation isn't the (only, or necessarily right) answer; what we need instead is a mix of awareness and education about how to use YouTube for a balanced content diet. We need to mobilize school boards and educators for this, without turning into Luddites.
rwanderman (Warren, Connecticut)
Great article, but the problem is more pervasive than just recommendation algorithms; it's also in the simple act of "liking" or reposting something on Facebook or Twitter simply to gain popularity. In other words, it's not just algorithms working on us, it's us interacting with them. For example, people post "fail" videos: collections of people doing insane things (what used to be called Darwin Awards), but of course, once that genre got popular, people felt the need to up the ante and do more insane things to get eyeballs. That's not an algorithm problem, that's a people problem. When Google or Netflix makes a recommendation, we don't have to follow it, do we?
james (portland)
"When Google or Netflix makes a recommendation, we don't have to follow it, do we?" Have to? Very philosophical--Ask yourself why alcoholics remove alcohol from their homes?
Raindrop (US)
We don’t have to follow it, but it is often set up to default to automatically play the next video. You have to look up how to turn this off.
rwanderman (Warren, Connecticut)
Turning off autoplay, at least in my browser (Safari on macOS), is a matter of a simple toggle on the YouTube menu bar. That said, it seems to wiggle its way back on without me touching it, so you're certainly right about that.
lark (San Francisco)
Google supposedly has the motto, Don't Be Evil. It seems to me they have failed, and their influence has taken on an alarmingly destructive cast. It is not because the employees intend this, but they have been dreadfully slow to practice vigilance on the impact of their technologies.
Janet michael (Silver Spring Maryland)
Who knew that there was a recommender algorithm? This should alert all of us who dip into YouTube for entertainment. We need to be cautious, because we are being led to places we don't want to go and glibly accepting "alternative facts". This is another reason to take a vacation from our electronic devices.
FXQ (Cincinnati)
Meanwhile, you are subjected to thousands of ads that have been scientifically marketed to appeal to you and manipulate you. But YouTube is a scary place.
ThirdWay (Massachusetts)
There is only one solution for this. YouTube, Google, and their ilk must be regulated as public utilities. They have given up any pretense of acting in the public interest. If you argue that I am advocating the suppression of free speech, I would counter with the "Does free speech mean you have the right to yell 'Fire!' in a crowded theater?" argument. YouTube's business model is to yell "Fire!" in the theater and then collect a toll at the exits. Historically, any change in technology has required an appropriate regulatory response to ensure that the new technology is not used to advance the few at the expense of the many. This is no different.
Rocket J Squrriel (Frostbite Falls, MN)
Say Mark Zuckerberg does as he's hinted and runs for president in 2020. What about Facebook? Even if he steps down, all the people he appointed to run the place are still there. How do you know they wouldn't twist things behind the scenes to favor him?
iain mackenzie (UK)
I am just a little wary of over-regulation... doesn't that mean you are trusting some other entity with the power and handing over personal responsibility? I would more strongly advocate greater awareness by (or education of) the 'audience'. That way, when someone shouts 'fire', they can assess for themselves whether it is genuine and make an appropriate choice.
Observer (Canada)
China's leaders have already gone one step further. They banned Google, Facebook and similar radicalizers completely from the Chinese internet.
Scott Fraser (Arizona State University)
I think YouTube is the greatest thing since sliced bread. It's a game changer, and it's really the only thing I watch since I cut the cable four years ago. I disagree with the writer's perspective in that YouTube gives us choices, versus the nonsense on the Big 3 and Fox "news" channels, which control what gets pumped into our brains.
ArtMurphy (New Mexico, USA)
It seems clear that every democracy needs to rethink their policies on what limits should be placed on speech (e.g. shouting “fire” in a crowded theater) in this new era of social media, bots, hacking, and organized misinformation. We need to protect the public forum from cyber warfare and all its works and all its ways, including for-profit entities which place profit above civic good. Whether you call it “corporatocracy” or fascism, the threat posed by detailed corporate knowledge about each of their customers gives tremendous power to gigantic, unelected, profit-driven “persons” who do not necessarily put the well being of customers ahead of profits. Once democracy has been eroded it is very difficult to get it back. History consistently shows the consequences of saying, “ It can’t happen here”.
FXQ (Cincinnati)
Yes, let's leave it up to either the tech corporations or governments to filter what we are exposed to. Please, protect us from unwanted speech, books, music, movies, and thoughts. How about this?- Stop the tech companies from using algorithms to manipulate what we watch.
Ray (North Carolina )
The profit motive trumps everything else.
Rich Lampert (Philadelphia)
What is a more appalling possibility: That YouTube has designed its algorithms to lead users toward more extreme content, or that the AI underlying the recommendations has arrived at this strategy autonomously?
iain mackenzie (UK)
Weren't our parents saying the same kind of thing about TV when we were young? (the "goggle-box"; the "devils lantern "...) Maybe we learned nothing about the power (or the impotence) of the media in the past 60 years...
Chase (US)
The former is far more appalling. The latter is easy to fix: it is easy for Google to reset the objective and constraints of its AI engines. Google is responsible and accountable either way. But the former possibility suggests a turn to the Dark Side. We aren't there yet. The latter suggests an accidental misuse of their technology for a pure marketing objective, which the company can change to incorporate their broader objective of Don't Be Evil.
sam (ma)
We must get Google's Chromebooks out of the hands of our children in public schools. That would be a good start. Google owns everything your child inputs into those devices. All of it. And students are required to use them in class after class. Every parent should have to give legal consent to their children's use of Chromebooks in schools. I for one would not click "I agree". YouTube is like crack for kids.