Where were you during the great Facebook outage of October 2021?
I personally Googled “Is Facebook down?” before embarking on a strangely productive evening during which I tidied my flat, did some yoga, ate some micro-broccoli, worried about the debt ceiling, read “Fahrenheit 451” cover to cover and watched all nine episodes of “Squid Game.”
Mark Zuckerberg, on the other hand, was probably very stressed about his net worth draining away at a rate of about $1 billion per hour.
Zuckerberg was also probably a little stressed about Frances Haugen, who revealed herself last weekend as the whistle-blower behind the damning leak of tens of thousands of pages of internal documents.
Parmy Olson suggests that her leak could turn out to be the most important act in the platform’s corporate history: “Haugen’s document dump revealed what many suspected but couldn’t prove: that Facebook created more lenient secret rules for elite users, that Instagram made body issues worse for 1 in 3 teenage girls, and that Facebook knowingly amped up outrage on its main site through an algorithm change in 2018.”
Luckily for us, and Congress, Haugen came with not only information, but also solutions. Smart ones, too, according to Tae Kim and Parmy. To sum them up:
1. Order Facebook to stop using engagement-based ranking algorithms.
2. Order Facebook to spend more on content moderation.
3. Establish an agency to audit Facebook’s algorithms and features.
4. Mandate regular disclosure for researchers.
What Facebook does with its time, money and code is important because Facebook Inc. is absolutely huge. Billions of monthly users are exposed to and potentially influenced by content thrown up by algorithms designed to boost engagement and ad revenue.
Across its whole family of apps, Facebook says it has about 3.5 billion monthly active users. That’s nearly half the planet! And in monetary terms, Facebook’s market capitalization eclipses the gross domestic product of all but 16 countries. It’s not the only brand that looks like a nation-state, as Ben Schott points out. But it is one of the most ubiquitous.
Facebook’s sheer size not only makes it imperative for regulators to act; it also makes their job far harder.
Take Step 3, for example: establishing an agency to audit Facebook’s algorithms. Cathy O’Neil makes a living auditing algorithms. Her usual process is to consider who is affected, determine whether certain stakeholders are being treated unfairly, and suggest ways to eliminate or mitigate that harm. Unfortunately, that approach doesn’t scale to the algorithms at Facebook. “They’re just too big. The list of potential stakeholders is endless. The audit would never be complete, and would invariably miss something important,” she writes. Luckily, she also has a solution.
The fact that a single company has so much influence over our digital lives is troubling, especially for those in the developing world. At no point has that been so clear as it was last week.