Hieronymus Bosch, The Garden of Earthly Delights: Hell (1490-1500) (detail). Museo del Prado, Madrid
(Read Part One of this post here!)
The Securities and Exchange Commission complaints filed on behalf of Facebook whistleblower Frances Haugen raise the question of whether Facebook’s awareness of the flaws in its systems amounts to defrauding investors; Senator Maria Cantwell asked in her Senate questioning of Haugen whether the company was also defrauding advertisers. “Aren’t we now talking about advertising fraud?” she asked. “Aren’t you selling something to advertisers that’s not really what they’re getting?” She went on to say that by exploiting the virality of emotionally compelling but unverified content Facebook secures an unfair advantage over traditional news sources, which are bound to a standard of demonstrability and serving the public interest and can be sued for publishing something untrue. Frances Haugen responded that her documents included multiple questions from advertisers after violent or disturbing incidents like the January 6th insurrection, along with talking points for responses from Facebook asserting that the platform was doing “everything in their power to make Facebook safer, we take down all hate speech when we find it.” Frances Haugen observed that Facebook knows it captures only 3 to 5 percent of hate speech. (The SEC complaints also say Facebook misled advertisers by overcounting users it knew to have multiple accounts, many of them teenagers eluding parental scrutiny.)
Some commentators think that Frances Haugen’s emphasis on the harms of algorithmic ranking is overstated. I would like to know more about this argument. Many inside and outside Facebook note that there are other powerful sources of misinformation and polarization in contemporary society. This seems to me spurious. You don’t decline to put out a dangerous fire because there are other dangerous fires; and even as the doubters say this they acknowledge Facebook’s extraordinary reach (“Facebook is the internet in Africa,” Frances Haugen). Tech observer Casey Newton, who wrote that “I found it surprising that Haugen’s pet issue is feed ranking: I just don’t believe it’s as powerful [as] others seem to” and that “polarizing and harmful content was often shared on Twitter and Instagram during the many years that those services used reverse-chronological feeds,” also writes, “Facebook is an all-time great business because its ads are so effective in getting people to buy things. And the company wants us to believe it isn’t similarly effective at getting people to change their politics?” More pointedly, Daphne Keller of the Stanford Cyber Policy Center tweeted last April that “a rational ad-revenue driven company will not want to ceaselessly amplify content that pushes our buttons in the short term but leaves us feeling icky in the long term.” Frances Haugen addressed this claim directly. She argued that the company is plagued by an addiction to short-term metric growth at the expense of its long-term business interests, and that a Facebook that committed itself to combatting the harms it is spreading “could be more profitable five or ten years down the road, because it wasn’t as toxic.”
She traced this short-term thinking to internal incentives that drive engineers and designers inevitably toward multiple small decisions favoring growth: Mark Zuckerberg “has built an organization that is very metrics-driven. It is intended to be flat, there is no unilateral responsibility, the metrics make the decision. Unfortunately that in itself is a decision.” She faulted the “siloing” of integrity researchers from product development and said, “integrity actions, projects that were hard fought by the teams trying to keep us safe, are undone by new growth strategies.” Other observers, like Will Oremus in The Washington Post, point to a corporate structure that “routes weighty decisions about content policy through some of the same executives tasked with government lobbying and public relations,” who resist unpopular disclosures and tough measures. Robyn Caplan, a researcher at the nonprofit Data & Society, told Will Oremus that she has found that online platforms’ struggles with content moderation have their roots in “organizational dynamics” built on early “start-up culture.” Frances Haugen has noted that many senior Facebook executives are early Facebook employees who have never worked anywhere else; they have no context for assessing the company’s performance and tend to fall for its “truisms.” She reminded Senators that Mark Zuckerberg, who founded Facebook in college, “holds a very unique role in the tech industry in that he holds over 55 percent of all the voting shares for Facebook. There are no similarly powerful companies that are as unilaterally controlled. And in the end the buck stops with Mark. There is no one currently holding Mark accountable but himself.”
Interestingly, Frances Haugen often attributed weaknesses in the security and integrity systems of Facebook, a $1 trillion company that makes $40 billion a year, to understaffing. She told Senators, “A pattern of behavior I saw at Facebook is that often problems were so understaffed that there was an implicit discouragement from having better detection systems. For example my last team on Facebook was the counterespionage team in the Threat Intelligence Org. At any given time our team could only handle a third of the cases that we knew about. We knew if we could build even a basic detector we would likely have many more cases.” At another point she said she considered understaffing of that team, which tracks state actors such as authoritarian countries using the platform for surveillance of their populations, a “national security issue.” She spoke of a “cycle of scandal” that depresses hiring. She told Senators that “Facebook has struggled for a long time to recruit and retain the number of employees that it needs to tackle the large scope of the projects that it has chosen to take on. Facebook is stuck in a cycle where it struggles to hire, that causes it to understaff projects, which causes scandals, which then makes it harder to hire.”
Whether intentionally or not, Frances Haugen framed her disclosures to be amenable to building bipartisan consensus on reform, which until now has been elusive. She first released her trove of documents through the Murdoch-owned Wall Street Journal. Her history in the counterintelligence unit addressing authoritarian threats appealed to many in the Senate’s Republican minority. Locating Facebook’s harms in the mechanisms of amplification itself rather than in what gets amplified sidesteps conservative concerns about censorship: Frances Haugen replied to questions from Republican Senator Cynthia Lummis that she is “a strong advocate for non-content-based solutions.” “I’m not sure that it is the heavy hands,” Senator Lummis reflected, “like breaking up companies or calling them a utility [that are the answer], which is why your approach … is intriguing to me.”
Indeed Frances Haugen has said she does not favor breaking up Facebook, for reasons that I find completely unconvincing. She says that it is too predominant and needed around the world, that the algorithmic model, because it is profitable, would persist in the legacy companies, and that she learned at her earlier job at Pinterest that advertisers are only prepared to learn one or two platforms at a time. She seems to be saying that the world needs one overwhelming company to manage its digital access and that, if it were properly regulated, the lack of competition would not be inherently problematic. It seems a strangely paternalistic and trusting view of corporate dominance. She does not really consider that the insularity and lack of accountability endemic to Facebook might in part spring from Facebook’s monopolistic position, or that, as tech commentator Charlie Warzel has argued and much of her testimony confirms, the company has become too big to run. One thinks also, for example, of Senator Amy Klobuchar’s complaint that tech monopolies prevent more representative businesses from rising up to challenge these corporations run and staffed largely by white men from the Ivy League. Perhaps she is setting her sights on achievable goals.
Like many who watched these revelations unfold, I was made hopeful by them on account of Frances Haugen’s measured focus on the empirical and her command of the systems involved. Although we are certainly far from arriving at solutions for taming these monsters that have grown among us, we are getting closer to being able to discern the scrim that they cast over our eyes. The harms Frances Haugen identified, which, as she often says, she believes were unintentional, are so instructive about how quickly machine-driven systems depart from being neutral arbiters and become carriers of unintended messaging: by assembling the voices of “the mass,” which of course is an illusion and a construct, we hand our powers of communication over to a goblin born from our most unregulated impulses.
Even when we read seriously, one wonders, how much of what we know to look for has come to us through an engagement-based ranking or a targeted-ad-driven source? How much, like the European political parties who admit to having adjusted their policies to suit Facebook’s promptings, might what we read even between covers or on the page be shaped in intangible ways by the incentives built into the mysterious multipliers that give it public attention? These systems of moderating what we see have become so pervasive that they are barely visible to us, present for example in Google’s search ranking and Amazon’s recommending.
Some readers have asked me if they should “get off social media.” I don’t think that’s quite the question. We should advocate for meaningful constraints on social media and other tech monopolies (Amazon, Google, Apple) and public policies that sustain thoughtful and informed discourse. We should support institutions that hew to fact-based inquiry. But mostly, internally, we should learn, constantly, to fine-tune our awareness of how we see what we see, and update the engines of our attention to home in on what is true, or genuinely struggling to be true.
Ann Kjellberg is the founding editor of Book Post. She worked as a book review editor at the New York Review of Books from 1988 to 2017, founded the literary magazine Little Star, and is the literary executor of the poet Joseph Brodsky.
Book Post is a by-subscription book review service, bringing snack-sized book reviews by distinguished and engaging writers direct to your in-box, as well as free posts like this one from time to time to those who follow us. Subscribe and receive straight-to-you book posts by Joy Williams, Marina Warner, Reginald Dwayne Betts, Álvaro Enrigue, more!
Seminary Co-op and 57th Street Books are Book Post’s Fall 2021 partner bookseller! We partner with independent bookstores to link to their books, support their work, and bring you news of local book life as it happens in their communities. We’ll send a free three-month subscription to any reader who spends more than $100 with our partner bookstore during our partnership. Send your receipt to info@bookpostusa.com.