How Facebook’s Oracular Algorithm Determines the Fates of Start-Ups

Nov 02, 2017 · 26 comments
Martin S Friedlander, Esq. (Los Angeles, Ca. 90024)
Can it read my mind? I doubt it. What do I like?
Michelle (Minneapolis)
Wow. Thanks for the fascinating article. Mind. Blown.
Steve Bellevue (Oakland CA)
What an amazing game. As a target, is it in my best interests to participate or not? This seems like a good small forum to ask the question. Though I said to myself a couple of years ago "Embrace Everything", I've never been on Facebook, quit LinkedIn and have adblock. I don't feel like I am missing anything, though as I read this it does seem more than a little Luddite-like. I read several newspapers online and search on Google. What do the forces of the internet know about me, why should I care, and what am I missing?
Jack Cerf (Chatham, NJ)
To simplify greatly, the dominant principle of the Facebook ad algorithm seems to be "find out what they like and associate your product with what they already want to hear." Good marketing for both business and politics, but hardly something that builds consensus.
Dawn (Chicago)
I use ad blockers that prevent most ads from appearing in my feed. A few sneak through, but I disregard them, much like the ads that my eyes skipped in the newspapers I worked for.
OSS Architect (Palo Alto, CA)
Facebook has 5 million advertisers and 2 billion users. That's a matching problem on a large scale. The targeted ad has to be selected while FB servers are generating the next FB content web page, and its recommendation engine has to update its ad response history in real time. That's a lot to do in one second. Think of it as having to hit a bull's-eye when you don't have enough time to aim.

Despite the recent improvements in technology and algorithms, there is a fundamental problem with building recommendation software: human beings. The Romans figured this out 2,000 years ago and left us the warning, in Latin, "De gustibus non est disputandum" ("there's no accounting for taste"). Facebook, Netflix, Spotify, et al. are trying to figure out what you like. It's technically called collaborative filtering, and it works like this: A likes B and C. D likes B, so what is the probability that D likes C? That's hard enough to do with movies and music. Now try it with products that don't exist yet, or that nobody has seen.

Recommendation engines have intellectual property value. They are also developed for companies that provide the data. It's their data, hence their legal departments insist it's their algorithm. What I thought was going to be a lucrative consulting career for me didn't materialize, because every contract I got stated Company XYZ owned both the data and the output, and I could not use any of it in my next engagement.
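The collaborative-filtering idea in the comment above ("A likes B and C. D likes B, so what is the probability that D likes C?") can be sketched in a few lines of Python. The users, items, and Jaccard similarity measure here are illustrative choices, not anything from the article:

```python
# Minimal item-based collaborative filtering sketch. Users and items invented.

# user -> set of liked items
likes = {
    "A": {"B", "C"},
    "D": {"B"},
    "E": {"B", "C"},
}

def item_similarity(x, y):
    """Jaccard similarity between two items, based on who liked them."""
    fans_x = {u for u, items in likes.items() if x in items}
    fans_y = {u for u, items in likes.items() if y in items}
    if not fans_x or not fans_y:
        return 0.0
    return len(fans_x & fans_y) / len(fans_x | fans_y)

def recommend(user):
    """Score unseen items by their similarity to items the user already likes."""
    seen = likes[user]
    all_items = set().union(*likes.values())
    scores = {c: sum(item_similarity(c, s) for s in seen)
              for c in all_items - seen}
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("D"))  # "C" ranks first: B and C are liked together by A and E
```

Real systems work on billions of sparse ratings with matrix factorization rather than set intersections, but the question being answered is the same one the comment states.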
Winston Smith (USA)
America excels as a place to sell products, entertainment, services, promises, lies, hate, hope or fear. After the clicks, marketing, algorithms and buying or believing are over it isn't so good at creating a national community grounded in qualities and responsibilities of citizens that don't appear in digital spreadsheets.
mpcNYC (NYC)
Factory workers won’t be the only ones losing their jobs to robots. Goodbye Madison Avenue, hello algorithm.
Matt watson (Vancouver, B.C)
What I find fascinating about this piece is that a company in Vancouver B.C. created this same business in 2000. They do a few hundred million in sales each year. Did these guys really not come across this when they dreamed up their idea 15 years later? Not that I care if they copied (credit to them).
Ronald Tee Johnson (Blue Ridge Mountains, NC)
In 1973 I took out a full-page ad in GOLF magazine for $10,000 and offered two free rounds of golf at 80 golf courses from Sugar Mountain, NC to Miami, FL for only $35. The magazine made it possible to target only east coast subscribers. The ad brought in 6,000 memberships in about four months, and the bigger expense was handling the memberships. Today, I would spend maybe $1,000 on Facebook with a focus on suburban males in specific cities from New York to Florida. The memberships could be handled without paper. Today, 160 rounds of golf at great golf courses for $35, appearing in Facebook news feeds? I've always wanted a Gulfstream.
Kristin Hudson (Palo Alto, CA)
Do it!
Elizabeth (NYC)
This article stayed away from the political side of Facebook advertising, but that's the scary elephant in the room. And nothing I read here gives any comfort that the company will be able to keep their product from being used to undermine the electoral process, despite their recent contriteness over the 2016 election. We may yet find out that Jared Kushner and Cambridge Analytica gave the Russians the data they'd amassed about potential swing districts. But it's entirely possible that the Russians were able to figure it out on their own. The implications of social media's ability to target and manipulate people and their actions are something we are only beginning to understand.
Eater (UWS)
Data cannot behave unethically, despite the claim made by Antonio Garcia-Martinez. Ethics are imposed by people with feelings--these days, highly sensitive feelings, in thin-skinned people, with burgeoning special-interest groups created just for their specific gripes, and the silly younger ones prefer socialism to capitalism because they don't want to work. Data and algorithms should not be filtered to accommodate such foolishness. If one wanted to target Trump haters, the same people who claim data can behave unethically by hurting their feelings would rejoice in their double standard. When's it gonna end? MAGA?
OSS Architect (Palo Alto, CA)
Much of AI is implemented based on language, and human languages have biases, ethics, and "values" based on the frequency and proximity of words in a sentence. AI, specifically machine learning, encodes "meaning" from what it parses in human-generated content. "White trash" is significant to a machine, because these two words are seen in frequent combination in human speech and writing. In convolutional neural networks, the AI mechanism builds up "sets of features". A feature is anything "recognizably distinct". What are ethnic slurs? They're recognizable, they're distinct, and they occur consistently in human communication. This is where "political correctness" comes in. If we have no means to fix human language, then machines basically cannot understand us, and will over time (less than a few hours, in some AI bot experiments) become raving homophobic, racist, Limbaugh/Beck surrogates.
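The comment's point, that statistical models pick up "meaning" from how often words appear together, can be shown with a toy co-occurrence counter. The corpus and window size below are invented for illustration; real systems do this at web scale before training embeddings:

```python
# Toy sketch: word pairs that appear close together become "features" for a
# model, whether or not we want those associations. Corpus is invented.
from collections import Counter

corpus = [
    "the model learns from text",
    "the model learns bias from text",
    "biased text teaches a biased model",
]

def cooccurrence_counts(sentences, window=2):
    """Count how often word pairs appear within `window` words of each other."""
    pairs = Counter()
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            for v in words[i + 1 : i + 1 + window]:
                pairs[tuple(sorted((w, v)))] += 1
    return pairs

pairs = cooccurrence_counts(corpus)
# "model" and "learns" co-occur twice, so a model treats them as related
print(pairs[("learns", "model")])  # → 2
```

Swap in a corpus containing slurs and the same mechanism dutifully learns those combinations too, which is exactly the failure mode the comment describes.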
tarchin (Carmel Valley, CA)
Well that's a whole bunch of assumptions you're ladling out there, Eater. You think algorithms write themselves? Ethics is a human matter, and as long as humans determine the way the data behave (algorithms) your argument has a few missing data points itself.
Tyrone (NYC)
You have a very fundamental lack of understanding of Antonio Garcia-Martinez's position.
wfcollins (raleigh nc)
thank you for this article. very informative. keep writing in greater detail. time to write a facebook marketing book a la one of the greatest books about computer hardware: "the soul of a new machine" by tracy kidder. title: "you, collected, and sold, to you, by a new algorithm". as bill burr's steve jobs said: "get on it".
Bing Ding Ow (27514)
" .. Advertising has always been an uncertain business .. That's correct. For instance, 40% of Americans are NOT on FB, which is NOT mentioned, in the article. And which makes the case for how utterly ridiculous, all the complaining about "Citizens United" is. TV advertising works? Ask Eric Cantor, who out-spent Dave Brat 20+ to 1. Ask HRC, who outspent DJT. Ask Bill Gates about his "education reform" campaign in Colorado, with his 20+ to 1 spending advantage .. which lost.
Eli (Tiny Town)
A year ago I bought glasses in a physical store. I figured with a free eye exam and a buy-one-get-one-free 15%-off deal that it wouldn't be super expensive. It was $600. Standard steel frames. Ordinary lenses, no bifocals or progressives. The only 'upgrade' I got was no-glare (which, if you plan on night driving at all, ever, is basically mandatory). Took four weeks to be delivered. My prescription changed this year and I ordered online. Unbreakable frames with a 5-year warranty. Progressive lenses with prisms. No-glare, scratch-resistant coating, and anti-blue tint. $50. Came in less than a week. If contact prices are anything at all like glasses prices, and I imagine they are, the market share for any start-up offering sane prices is 100%. They don't need Facebook ads. This company could stand on a street corner with a sign that says "save 95% on contacts today" and sign people up.
Eater (UWS)
You ordered from the Hubble people?
Mike Smith (L.A.)
No Eater, he ordered glasses, not contacts. How could you read his clearly written comment and fail to understand that? On the other hand, I have no idea what you were trying to say in your convoluted comment.
SR (Bronx, NY)
To be fair, Facebook is probably not the enemy there; that company was probably a new front of vile eyewear monopolist Luxottica, which happily raises eyewear prices and buys out those in the way. With the margin from their inflated prices, they could just buy Facebook, never mind their ad space!
Fernando Ardenghi (Argentina, Buenos Aires)
Personality-based recommender systems are the next generation of recommender systems, because they perform far better than behavioural ones (which rely on past actions and patterns of personal preferences). That is the only way to improve recommender systems: to include the personality traits of their users. They need to calculate personality similarity between users, but there are different formulas for calculating similarity. In case you did not notice, recommender systems are morphing into compatibility matching engines, the same kind used in the online dating industry for years, with low success rates until now, because they mostly use the Big Five model (OCEAN) to assess personality and the Pearson correlation coefficient to calculate similarity. Please remember: personality traits are highly stable in persons between 25 and 45 years old.
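The similarity calculation the comment describes, Pearson correlation between two users' Big Five (OCEAN) trait scores, is short enough to write out. The trait scores below are invented for illustration:

```python
# Pearson correlation between two users' Big Five (OCEAN) profiles.
# Scores are invented; real inventories produce normed scale scores.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    if sd_x == 0 or sd_y == 0:
        return 0.0  # a flat profile has no defined correlation
    return cov / (sd_x * sd_y)

# Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism (0-1)
alice = [0.8, 0.6, 0.4, 0.7, 0.3]
bob   = [0.7, 0.5, 0.5, 0.6, 0.4]

print(round(pearson(alice, bob), 3))  # close to 1.0: similar trait profiles
```

One known weakness of this choice, which may explain the "low success rates" the comment mentions: Pearson on only five numbers is very noisy, and two quite different people can still score a high correlation.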
Eater (UWS)
Considering there is no real scientific validity to personality inventories, I suggest you may not be right. Social science isn't a true science anyway, and the same goes for psychology. The results will be a simple variation on the theme of imprecision and lack of actual predictability of behavior.
OSS Architect (Palo Alto, CA)
Most companies developing algorithms from big data use k-means clustering. The Pearson correlation coefficient is one measure you can use to build your centroids, but it's not the only one, and it's certainly not the best.

You need to decide how many clusters your data probably has; you basically guess at that. Yes, k-means provides a method to mechanically identify candidate clusters, but you can "over-fit" the data. The more clusters you have, the more variance they account for; not all clusters may have variance that is normally distributed, and you could have small, significant, but undetectable sub-populations that adversely affect your model.

Then there are people like me who frequently look at corporate-jet sites and the sellers of $50,000 watches. I get great ads to look at, but my one-man disinformation campaign is costing some people serious money. If we all acted like this online, the value of these algorithms would plummet.
tarchin (Carmel Valley, CA)
You're really on a roll! I think there are a number of social scientists who might find this jejune negativity interesting. Psychologists are having a field day.