
The Temptation of St. Anthony, Joos van Craesbeeck (ca. 1650). Staatliche Kunsthalle, Karlsruhe
Apologies, dear readers, if I arrive at the Facebook whistleblower revelations after you have had your fill of the subject. I have been thinking a lot about their implications for journalism and the life of writing and ideas; I hope it might be worth peering into them a little longer with this in mind.
Whistleblower Frances Haugen is a data engineer and tech executive who worked from 2019 until last May in the Civic Integrity and Counterintelligence units of Facebook, having previously worked in algorithm design at Google, Yelp, and Pinterest. On September 13, collaborating with The Wall Street Journal, she began anonymously releasing a huge trove of documents from within Facebook testifying to a pattern of prioritizing growth (i.e., earnings) over user safety and the public good. In short order she revealed her identity on 60 Minutes, filed a Securities and Exchange Commission whistleblower complaint against Facebook for misleading investors, and testified before a Senate subcommittee.
The revelations broadly covered five areas: internal Facebook research demonstrating the harms of Facebook subsidiary Instagram for teenage girls; the exposure of populations outside the English-speaking world to content not adequately moderated by Facebook’s integrity and security systems; failure to exclude human traffickers and other criminal enterprises from the platform; exempting famous people from its user rules; and ignoring internal evidence that the algorithmic mechanism that chooses what you see on your feed promotes misinformation and incitement to violence and hate.
The example in the last of these categories that got the most attention was complaints to Facebook from political leaders that they had been obliged to skew their messaging in a negative or provocative direction in order to get visibility on the platform. But much of the Journal’s story was devoted to journalism itself. The Journal cited correspondence with Facebook from BuzzFeed chief executive Jonah Peretti saying that the incentives created by Facebook’s systems for making posts visible to users were obliging reporters to surface what the Journal called “divisive content,” “reorienting their posts toward outrage and sensationalism.” Peretti told Facebook that his staff felt “pressure to make bad content or underperform.” As Frances Haugen said in her Senate testimony, “Companies like Buzzfeed wrote in and said, ‘The content that is most successful on the platform is the content we’re most ashamed of.’” (Peretti noted that it was not just provocative rhetoric that benefitted from Facebook’s algorithmic promotion, but “‘fad/junky science,’ ‘extremely disturbing news,’ and gross images.”) The incentives of the system were starting to shape the outside world that it reflects. Even here at sleepy little Book Post we get the advice to stoke controversy to raise our visibility on social media. With the spread of engagement-based ranking it has become exponentially harder for a small operation like ours to be seen at all without catering to social media’s hunger for provocation.
We rely on subscriptions to buck the targeted-advertising system
Become a paying subscriber to receive straight-to-you book reviews
Later this week: Joy Williams on W. G. Sebald
Ranking on social media that privileges “user engagement” has been creeping up on us since Facebook launched its popularity-driven news feed in 2006. Twitter and Instagram, until 2016, showed you posts as they appeared from the accounts you followed. But engineers soon realized that they could keep you on the site longer (and expose you to more ads) by showing you things you were more likely to be interested in, based on your past behavior and what was working with others. Facebook took this trend a step further with the announcement in 2018 of its “Meaningful Social Interaction” (“MSI”) algorithm change, which privileged relationships between users over professionally prepared content. The adjustment was partly driven by complaints about Facebook’s role in the 2016 elections. CEO Mark Zuckerberg told the public that the company’s aim was to “strengthen bonds between users and to improve their well-being,” and indeed internal research was showing that people were more depressed and alienated by “passively consuming information” as opposed to interacting with real people. Although Mark Zuckerberg “framed the shift as a sacrifice,” Frances Haugen’s disclosures show that executives were also responding to waning numbers in their own metrics. The engineers saw that having those “systems of little rewards,” in Frances Haugen’s phrase, of your posts being liked and shared and commented on by other users, keeps people on the site longer and motivates them to post more, creating more free content, which means more ads, and more money for Facebook.
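For readers curious about the mechanics, the logic of engagement-based ranking can be sketched in a few lines of code. This is a minimal illustration of the general technique, not Facebook’s actual system: the weights and field names below are invented, though the reporting on MSI indicates that comments and reshares were indeed weighted far more heavily than simple likes.

```python
# Illustrative sketch of engagement-based ranking.
# The weights and post fields are invented for illustration;
# they are not Facebook's actual MSI values.

ENGAGEMENT_WEIGHTS = {
    "likes": 1,       # lightweight reaction
    "comments": 15,   # active back-and-forth between users
    "reshares": 30,   # "downstream" spread, weighted most heavily
}

def engagement_score(post):
    """Sum each interaction count times its weight."""
    return sum(post.get(k, 0) * w for k, w in ENGAGEMENT_WEIGHTS.items())

def rank_feed(posts):
    """Order a feed by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm-report", "likes": 200, "comments": 5, "reshares": 2},
    {"id": "outrage-bait", "likes": 50, "comments": 40, "reshares": 25},
]
# The provocative post outranks the sober one despite far fewer likes:
# 50*1 + 40*15 + 25*30 = 1400  vs.  200*1 + 5*15 + 2*30 = 335
print([p["id"] for p in rank_feed(posts)])  # → ['outrage-bait', 'calm-report']
```

The point of the sketch is the incentive it creates: content that provokes comments and reshares, which shocking or enraging material reliably does, will always rise above content that merely gets read and liked.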
Frances Haugen also explained that Facebook knows that most of its content and engagement come from a few highly engaged users. Facebook also knows that most misinformation is spread by a small pool of such users. Writing in The Atlantic recently, Renée DiResta, technical research manager at the Stanford Internet Observatory, coined a new phrase, “ampliganda,” to describe how marginal actors with large followings, or skill at connecting to large followings, scheme to manufacture viral posts and “make certain ideas seem more widespread than they really are.” “The crowd,” she explained, is “motivated by ideology, but also the camaraderie of participation and the potential for recognition.” Frances Haugen said in turn that Facebook knows that the people who are exposed to the most misinformation are people who have recently been widowed or divorced, or moved to a new city, or are isolated in some other way; she said that her Civic Integrity unit studied the “misinformation burden” on such users, which “erodes their ability to connect with the community at large, because they no longer adhere to facts that are the consensus reality.” Frances Haugen also told Congress that “Facebook knows that their integrity systems stop working when people are exposed to 2,000 posts a day.”
Facebook employs a system that promotes posts based on the number and intensity of their likes, comments, and reshares, knowing that content that makes you shocked or angry is more likely to generate such engagement, among ordinary people but especially among the small group of highly motivated users most active on the site. Such a system, as Haugen, the Senators, and other researchers have noted, readily moves regular users from innocuous to extreme content in a few clicks (some examples: disinformation spread from Eastern European troll farms, radicalizing content in Germany, recruitment for the QAnon conspiracy, girls drawn from searches for “healthy recipes” to anorexia advice). The Wall Street Journal in 2020 cited an internal Facebook report that found that “64 percent of all extremist group joins are due to our recommendation tools.” The weighting in favor of “downstream MSI,” posts that had been multiply reshared, in particular was found internally to “make the angry voices louder.” Frances Haugen repeatedly reminded the oversight committee that Facebook has “admitted in public that engagement-based ranking is dangerous without integrity and security systems,” yet did not extend those systems to most of the languages of the world, and in any case the systems it does have “overly rely on artificial intelligence systems that they themselves say will likely never get more than 10 to 20 percent of content.” The much smaller share of oversight conducted by humans has also been faulted for inflicting traumatic harms on the people who staff it.
So, after the big tech firms had sucked away all of journalism’s advertising dollars and Facebook had dragged newspapers, magazines, and news websites through a series of increasingly disadvantageous financial deals and contortions to conform to its unavoidable platform (60 percent of all internet-connected people on earth, per 60 Minutes), the last scraps of attention legitimate journalism has been able to muster, Frances Haugen confirmed, must be eked out of a platform overwhelmed by signals directing users away from legitimate news … [Read Part Two of this post here!]
Ann Kjellberg is the founding editor of Book Post. She worked as a book review editor at the New York Review of Books from 1988 to 2017, founded the literary magazine Little Star, and is the literary executor of the poet Joseph Brodsky.
Book Post is a by-subscription book review service, bringing snack-sized book reviews by distinguished and engaging writers direct to your in-box, as well as free posts like this one from time to time to those who follow us. Subscribe and receive straight-to-you book posts by John Banville, Mona Simpson, Brian Fagan, John Guare, Emily Bernard, more!
Seminary Co-op and 57th Street Books are Book Post’s Fall 2021 partner bookseller! We partner with independent bookstores to link to their books, support their work, and bring you news of local book life as it happens in their communities. We’ll send a free three-month subscription to any reader who spends more than $100 with our partner bookstore during our partnership. Send your receipt to info@bookpostusa.com.
Follow us: Facebook, Twitter, Instagram
If you liked this piece, please share and tell the author with a “like”
Notebook: The Facebook Files, Writing, and Journalism (Part One)
Look forward to Part 2, and thanks for thinking through all of this with us.